The United States has a particularly blood-soaked history. By some measures, the country has been engaged in wars for 93.5 percent of all years between 1775 and 2018. The Founders explicitly regarded the country as an “infant empire,” and its early history was marked by an annihilationist conquest of the land’s native inhabitants. Beneath rhetoric about how the “country we love” is “clear-eyed,” “big-hearted,” and “optimistic that unarmed truth and unconditional love will have the final word”—in the words of an Obama State of the Union address—lies power, backed by violence.[1]
“Much that passes as idealism…is disguised love of power,” Bertrand Russell said. Indeed, U.S. history can be traced along two parallel tracks: the track of rhetoric, appearing in newspapers and presidential speeches, and the track of fact, as experienced in the lives of the victims. In every age the press is full of pious statements. Meanwhile, beyond the annihilation of the Indigenous population, the U.S. conquered the Hawaiian Kingdom and the Philippines, seized half of Mexico, intervened violently in the surrounding region, and (since World War II) extended its resort to force throughout much of the world. The number of victims is colossal.[2]
In one high-level postwar document after another, U.S. planners stated their view that the primary threats to the new U.S.-led world order were “nationalistic regimes” that are responsive to “popular demand for immediate improvement in the low living standards of the masses” and production for domestic needs. The planners’ basic goal, repeated over and over again, was to prevent such “ultranationalist” regimes from ever taking power—or if, by some fluke, they did take power, to remove them and to install governments that favor private investment, production for export, and the right to bring profits out of the country.
Opposition to democracy and social reform is never popular in the victim country. Thus the United States expects to rely on force and makes alliances with the military—“the least anti-American of any political group in Latin America,” as the Kennedy planners put it—in order to crush any indigenous popular groups that get out of hand.
Under some conditions, forms of democracy are indeed acceptable. But democratic decision-making will only be accepted if it is consistent with strategic and economic plans. The United States has consistently opposed democracy if its results can’t be controlled, tolerating social reform only when the rights of labor are suppressed and the climate for foreign investment is preserved. As Thomas Carothers, who worked in the Reagan State Department on “democracy enhancement” projects, concluded, Washington “sought only limited, top-down forms of democratic change that did not risk upsetting the traditional structures of power with which the U.S. has long been allied.” What mattered was not whether a government was democratic but whether it was aligned with “U.S. interests.” A fascist coup in Colombia, inspired by Franco’s Spain, brought little protest from the U.S. government; neither did a military coup in Venezuela, nor the restoration of an admirer of fascism in Panama. But the first democratic government in the history of Guatemala, which modeled itself on Roosevelt’s New Deal, elicited bitter U.S. antagonism. Things didn’t change much in the years to follow. When the rights of investors are threatened, democracy has to go; if these rights are safeguarded, killers and torturers will do just fine.[3]
The basic dilemma facing policymakers is sometimes candidly recognized at the dovish liberal extreme of the spectrum, for example, by Robert Pastor, President Carter’s national security staffer for Latin America. He explained why the administration had to support the murderous and corrupt Somoza regime in Nicaragua, and, when that proved impossible, to try at least to maintain the U.S.-trained National Guard even as it was massacring the population “with a brutality a nation usually reserves for its enemy.” The reason was the familiar one: “The United States did not want to control Nicaragua or the other nations of the region, but it also did not want developments to get out of control. It wanted Nicaraguans to act independently, except when doing so would affect U.S. interests adversely.”[4]
There are numerous cases of outright aggression, and of plotting the (sometimes successful, sometimes not) overthrow of governments, such as those of Guatemala, Chile, Iran, Cuba, Haiti, and British Guiana, to name just a few. Over and over again, there have been possibilities of diplomacy and negotiation, which might not have succeeded, but which looked promising. But these were abandoned and dismissed in favor of force and violence. The current arms race with China, and the possibility that the war in Ukraine could have been avoided, are particularly tragic examples of how the U.S. preference for threats over cooperation is leading us constantly into new disasters and creating a vastly more dangerous world.
Even in cases where the United States was not the aggressor, the country’s resort to extreme force has produced totally unnecessary carnage. The Pacific war in World War II was brutal on both sides, but numerous racist atrocities against the Japanese have gone largely unacknowledged. The firebombings of Japanese cities (which destroyed sixty-nine cities and killed up to half a million people) were calculated to maximize the number of civilian casualties, with U.S. military tacticians even producing “flammability maps” of cities to ensure that as many people as possible were burned alive. Philosopher A. C. Grayling, in a careful evaluation of the Allied area bombings of civilian populations, concludes that they have to be considered “moral crimes.” Curtis LeMay was right to point out that had the Allies lost the war, they would likely have been prosecuted as war criminals, but postwar criminal trials were constructed in such a way that only crimes that we did not commit ourselves were considered crimes. Nuremberg prosecutor Telford Taylor observed that “there was no basis for criminal charges against German or Japanese” leaders for aerial bombardment because “both sides had played the terrible game of urban destruction—the Allies far more successfully.” As it turns out, the operational definition of a “crime of war” is a criminal activity of which the defeated enemies, but not the victors, are guilty.[5]
Throughout all of this, the myth of American idealism has persisted. The internal records often reveal that U.S. decision-makers were motivated by nothing of the kind, that they wanted to serve “national” economic interests or protect “credibility.” And yet the unshakable belief in American goodwill and generosity continues to stultify political thinking and debase political discourse. Sometimes, foreign policy is portrayed as vacillating between “Wilsonian idealism” and “Kissingerian realism.” In practice, the distinctions are mostly rhetorical. Every great power toys with the rhetoric of benign intentions and sacrificing to help the world. Our belief in our own exceptionalism is the most unexceptional thing about us.
Also ready on the shelf is the doctrine of “change of course.” True, we made errors in the past, a result of our innocence and excessive goodwill. But that is behind us, and we can therefore keep to the grand vistas that lie ahead, ignoring all of history and what it might suggest about the functioning and behavior of institutional structures that remain unchanged. The doctrine is invoked with impressive regularity, always with sober nods of approval for the profundity of the insight.
There is a striking bipartisan consensus on the legitimacy of U.S. dominance. After Joe Biden came into office, The New York Times observed in a headline that “On U.S. Foreign Policy, the New Boss Acts a Lot Like the Old One,” citing Biden’s menacing of China, warm embrace of the murderous Saudi crown prince, and continuing support for Israel despite international condemnation of the occupation of Palestine. The Times quoted Donald Trump’s former deputy secretary of state making the accurate comment that “continuity is the norm, even between presidents as different as Trump and Biden.”[6]
No ruling powers have ever thought of themselves as evil. They believe they are good, and it is their opponents who are evil. We must make sure we are not falling into the trap of believing we are on the right side simply because we have been told so. Instead we must confront the ugly truth and pay attention to the victims of our country’s actions. A reigning doctrinal system pervades the media, journals of opinion, and much of scholarship. An honest inquiry reveals that striking and systematic features of our international behavior are suppressed, ignored, or denied. It reveals further that our role in perpetuating misery and oppression, even torture and mass slaughter, is not only significant in scale, but is also a predictable and systematic consequence of long-standing geopolitical conceptions and institutional structures.
But even if one chooses to maintain the belief in the good intentions behind U.S. violence, intent is not particularly morally significant. After all, we rarely consider intentions when evaluating enemy states. We do not measure the legitimacy of the invasion of Ukraine on the basis of whether Vladimir Putin truly believed it was full of Nazis. His sincerity is considered an irrelevant factor, because his actions were criminal. The Chinese famine of 1958–61 is not usually dismissed on the grounds that it was a “mistake” and that Mao did not “intend” to kill tens of millions of people. Nor is it mitigated by speculations about his personal reasons for the orders that led to the famine. In the case of adversaries, we often blame them for the predictable consequences of their actions, regardless of whether they felt themselves to be doing good. We recognize that even the worst monsters may have convinced themselves that they are engaged in something morally worthy.[7]
We know that those who conquer and suppress describe themselves as doing it for the victims’ own good. Instead of saying they wanted easily exploitable, cheap labor for their own benefit, enslavers said they were acting for the benefit of the enslaved. John C. Calhoun, defending slavery as a “positive good,” said, “Never before has the black race of Central Africa, from the dawn of history to the present day, attained a condition so civilized and so improved, not only physically, but morally and intellectually.” Do we care whether Calhoun was sincere in believing this? Does it mitigate anything if he was?[8]
Instead of focusing on what we meant to do, then, we should look at what we have done. We distinguish ourselves from the “terrorists” by pointing to the fact that when they shoot civilians, they do so intentionally, whereas we and our allies only ever do so inadvertently. Our victims are “collateral damage.” Of course, this explanation doesn’t make much difference to the victims. But also: Does it matter whether one who drops a bomb on a village intends to kill the villagers or just to flatten their houses?
The application of a double standard (or rather, the aforementioned single standard, namely that we can never be malevolent by definition) results in extraordinary intellectual contortions. If Fidel Castro had organized or participated in multiple assassination attempts against the United States president, or tried to destroy livestock and crops, he would be the very symbol of barbarian evil. Yet we claimed the right to do just that to Cuba. We also took it for granted that we had the right to put missiles in the Soviets’ backyard. But when they tried to exercise the same right, we nearly started World War III. The inconsistencies are barely noticed.
To ask serious questions about the nature and behavior of one’s own society is often difficult and unpleasant. Difficult because the answers are generally concealed, and unpleasant because the answers are ugly and painful. But we must engage in the exercise, because the danger of maintaining our delusions continues to grow.
In 1999, political analyst Samuel P. Huntington warned that for much of the world, the United States is “becoming the rogue superpower,” seen as “the single greatest external threat to their societies.” A few months into George W. Bush’s first term, Robert Jervis, president of the American Political Science Association, warned that “in the eyes of much of the world…the prime rogue state today is the United States.” Yet Americans find it difficult to conceive of their country as aggressive or a threat. We only ever engage in defense.[9]
Whenever you hear “defense,” it’s usually correct to interpret it as “offense.” The imperial drive is often masked in defensive terms: it is not that we are seeking to dominate an integrated world system, but rather that we must deny strategic areas to the Kremlin, or China, thus protecting ourselves and others from their “aggression.” The masters of the Soviet Union affected a similar pose, no doubt with equal sincerity and with just as much justification. The practice has respectable historical antecedents, and the term “security” is a conventional euphemism. The planners, we are assured, merely seek to guarantee the security of the nation, not the interests of dominant social classes.
The United States is already far in the lead in conventional forces and weapons of mass destruction, outspending the next ten countries combined. It continues to fuel a vast global arms race, and is trying to move to a new frontier that hasn’t yet been militarized: outer space. This would violate the Outer Space Treaty of 1967, which has so far prevented the militarization of space (the United States and Israel abstained from reaffirming it in the UN). The goal, as U.S. Space Command documents explain, is to dominate “the space dimension of military operations to protect U.S. interests and investments.” The U.S. is also leading the way in developing and deploying new kinds of autonomous weapons systems that make their own decisions about when and whom to kill. The danger here could not be more extreme, though it goes mostly undiscussed.[10]
There is an alternative path to the one we have pursued, namely to take stated ideals seriously and act on them. The United States could commit itself to following international law, respecting the UN Charter, and accepting the jurisdiction of the International Criminal Court and the World Court. It could sign and carry forward the Kyoto Protocol. The president could actually show up to international climate conferences and take the lead in brokering deals. The U.S. could stop vetoing Security Council resolutions and have a “decent respect for the opinion of mankind,” as the Declaration of Independence mandates. It could scale back military spending and increase social spending, resolving conflicts through diplomatic and economic measures rather than military ones.
For anyone who believes in democracy, all of these are mild and conservative suggestions. They are mostly supported by the overwhelming majority of the population. They just happen to be radically different from existing public policy.
Once we see the consequences of the attempt to impose U.S. hegemony through force, we have an obligation to oppose it. It is the fundamental duty of the citizen to resist and to restrain the violence of the state. It is cheap and easy to deplore the crimes of others, while dismissing or justifying our own. An honest person will choose a different course.
Those who have the capacity to act have a duty to act. Living in a free society where extraordinary wealth is available confers, at the very least, a responsibility to understand how power works and ask basic moral questions.
Even those who are not “heroes” by nature are capable of resistance. Mass popular movements have always been composed of everyday people who have the courage and intellectual integrity to face the moral challenges of their time. The world is full of suffering, distress, violence, and catastrophes. Each person must decide: Does something concern you, or doesn’t it?
Many who have access to privilege may be reluctant to forgo the ample rewards that a wealthy society offers for service to power and to accept the sacrifices that the demands of honesty may well entail. Even in the most humane and democratic society it requires considerable courage to refuse to take part in crimes against peace.
Fortunately, such courage is not lacking. The history of the world is not just a bleak compendium of atrocities, but also the story of resistance by those who refused to accept cruelty and oppression as natural, normal, or inevitable. Wherever there is injustice, there are also people trying to stop it.
In the United States, mass movements have achieved striking successes. In the nineteenth century, workers tried to create an independent labor movement based on the principle that “those who work in the mills should own them.” Under conditions immeasurably more difficult and repressive than those existing today, they tried to secure better conditions for themselves and each other. They were ultimately defeated, but their work had lasting effects. These same years saw the rise of mass education, a major contribution to democracy (hence, unsurprisingly, a main target of today’s assault on democracy). Emerging out of the ashes of Wilson-era repression, the militant labor movement of the 1930s led America to social democracy while Europe was succumbing to fascism (another process now being reversed under assault). Instead of fascism, they delivered Social Security and the guaranteed right to collective bargaining.[11]
During the 1960s, large groups of people chose to enter the political arena to press their demands rather than remain passive and apathetic. The movements they began—for Black civil rights, women’s liberation, LGBTQ rights, environmental protection, and an end to the Vietnam war—made the United States a better country, in ways that are permanent. Today, there is greater sensitivity to racist and sexist oppression, more concern for the environment, more respect for other cultures and for human rights. There is much to learn from studying the words and actions of those who launched the Mississippi Freedom Summer, the American Indian Movement, the Free Speech Movement, the Chicano Movement, the Movimiento Estudiantil in Mexico, and the other major global uprisings that tried to reapportion power.
We have also seen a significant effort to improve public understanding of the country’s history and present-day injustices. Out of the activism of the 1960s came Black Studies and Women’s Studies programs that drew attention to perspectives that had been entirely left out of mainstream scholarship. Major contributions like Howard Zinn’s A People’s History of the United States (and the companion Voices volume) lifted the veil on the standard patriotic histories and aired aspects of the country’s past that many would rather not discuss. Exposure of these truths creates backlash, with an effort to censor and purge this supposedly dangerous material. “Critical race theory,” for instance, is now used as a scare phrase to refer to any study of the systematic structural and cultural factors that gave this country a four-hundred-year history of racist repression. There is an organized effort to ensure that young people are only exposed to propagandistic narratives that uncritically celebrate and venerate the United States.[12]
Today, thanks to the efforts of activists, there is more popular revulsion at U.S. crimes around the world. For instance, in 1963, when the Kennedy administration launched a direct attack against South Vietnam, there was almost no protest in the United States. By the late 1960s, public outrage had grown so substantial that one reason the military hesitated to send more forces to Vietnam was that they were expected to be needed at home—to quell public uprisings. That greater public scrutiny of U.S. conduct has endured. Activist pressure helped limit, and ultimately end, U.S. support for South African apartheid. The Reagan administration’s support for atrocities in Central America was clandestine in part because it was known that there was little public support for the policy.
In 2003, when the Bush administration launched its criminal war against Iraq, it immediately sparked the largest antiwar protests in history. Protesters could not stop the war, but there was clear evidence of an increased unwillingness to tolerate atrocities—an example of the “civilizing effects” of the 1960s. To muster public support for the Iraq invasion, it was necessary for the Bush administration to launch a huge propaganda offensive depicting a weak country as the ultimate evil and an imminent threat to our very survival. Popular resistance in this country can impose certain constraints on state violence.[13]
We can recount the stories of plenty of individuals who saw matters clearly and summoned the personal courage to act, even at risk to their own freedom. In the United States, Chelsea Manning exposed U.S. war crimes in Iraq, landing herself in solitary confinement for years and being driven repeatedly to the brink of suicide. Edward Snowden knew that when he exposed the reach of the U.S. surveillance state, he would be driven into permanent exile. In Israel, nuclear technician Mordechai Vanunu endured nearly two decades in prison (including eleven years in solitary confinement) for blowing the whistle on his country’s secret nuclear program. Rachel Corrie, an American student, became a martyr for peace when she was killed in Gaza by an Israeli bulldozer that she was trying to stop from demolishing a Palestinian home. Berta Cáceres, a Honduran environmental activist and Indigenous leader, was murdered for organizing protests to stop the plunder and destruction of her community. (One of her murderers was, unsurprisingly, trained by the U.S. government.[14])
But the stories of heroic individuals give a false impression of how movements succeed. Necessary social change happens because of large numbers of dedicated people, most of whose names are never known, working together at all levels, day in and day out. History books, which pick out only a few famous leaders, mislead us. In reality, from the abolition of slavery to the democratizing movements of the 1960s, to Black Lives Matter and the democratic socialist movement today, as the late Howard Zinn put it, “what matters are the countless small deeds of unknown people, who lay the basis for the significant events that enter history.”
In our own time, there is much to be inspired by. The Palestinians who risked Israel’s (U.S.-funded) bullets to demonstrate on the Great March of Return showed incredible courage. The Kurds of Rojava have not just resisted a hostile (U.S.-supported) military, but have experimented with a remarkable new social model that emphasizes popular participation in government and women’s liberation. The Zapatistas of Mexico also offer an example of authentic democratic politics. There are extraordinary popular movements for justice across the Global South.
We have seen examples of how movements can achieve significant policy changes. The environmental movement of the 1960s succeeded in forcing a Republican administration to take important steps toward reining in pollution. Today, the Sunrise Movement is at the forefront of activism on climate and has engaged in civil disobedience. It successfully pressured the Biden administration to improve its climate policies. The popular movements of our time, many nourished by the Bernie Sanders campaigns of 2016 and 2020, have forced the Biden administration to adopt progressive stances that would previously have been out of the range of the politically possible. While Biden’s record on labor issues remains underwhelming, he is the first president since Franklin Roosevelt to take a strong public stand in favor of unionization. This is not just out of personal conviction, but because a newly energized and popular labor movement forced him to do so. This is how the New Deal came about, too—through a combination of militant labor action, CIO organizing, sit-down strikes, and a sympathetic administration.
The record of crimes can be numbing. It is easy to feel hopeless, to see an immovable hegemon. But there are ample opportunities to help create a more humane and decent world, if we choose to act upon them. Those who want to shift policy in a progressive direction must grow and become strong enough so that they can’t be ignored by centers of power. We can learn a great deal from the long and hard struggles for social justice in past years, and we can and must move forward to build on their achievements and to surpass them.
We live entangled in webs of deception—often self-deception. But with a little honest effort, it is possible to extricate ourselves. If we do, we will see a world that is rather different from the one presented to us by a remarkably effective ideological system. We will also learn that the system of thought control can collapse very quickly, as happened during the Vietnam War, with consequences that persist today. The main achievement of hierarchy is to get the “unpeople” to accept that oppression is natural. The first step toward making change is to recognize the forms of oppression that exist. The lessons of history teach us a good deal, but nothing more clearly than the fact that we often remain quite unaware of the forms of oppression of which we are victims, or sometimes agents, until social struggle liberates our consciousness and understanding.
We now need what the great antiwar activist A. J. Muste called “revolutionary pacifism.” Muste urged that “one must be a revolutionary before one can be a pacifist”—by which he meant that we must cease to “acquiesce [so] easily in evil conditions,” and must deal “honestly and adequately with…the violence on which the present system is based, and all the evil—material and spiritual—this entails for the masses of men throughout the world.”[15]
We citizens of democratic societies must develop critical thinking skills as a form of intellectual self-defense, to protect ourselves from manipulation and control. We can do it. There’s nothing in the social sciences or history or whatever that is beyond the intellectual capacities of an ordinary fifteen-year-old. We have to do a little work. We have to do some reading. But there is nothing too deep to grasp.
We are at a unique moment in history. Decisions that must be made right now will determine the course of our species’ future (if there is to be one). We have a narrow window to implement the measures necessary to avert the cataclysmic destruction of the environment. Unfortunately, the “masters of mankind” in the world’s most powerful state have been hard at work to close that window and to ensure that their exorbitant short-term profit and power will remain untouched as the world goes up in flames.
World-destroying nuclear weapons are being accumulated, and the countries in possession of these weapons are unable to cooperate, talking openly instead of the possibility of war with one another. The Bulletin of the Atomic Scientists’ “Doomsday Clock,” which provides experts’ best estimate of the risk of civilization-wide disaster, has recently been set to ninety seconds to midnight, the closest it has ever come. The analysts who set the clock cited the two most salient reasons: the growing threat of nuclear war and the failure to take the required measures to prevent global heating from reaching a point where it will be too late. Ninety seconds may be too generous an appraisal, unless those who want to save the world from worse horrors act quickly, firmly, and decisively.[16]
There is a lack of public understanding of the urgency of the situation. A Pew Research Center poll offered respondents a set of issues to rank in order of urgency. Nuclear war did not even make the list. Climate change was ranked close to last; among Republicans, only 13 percent said mitigating climate change should be a top priority.[17]
An extraterrestrial observer looking at our species would say that our primary trajectory is toward suicide, that we are collectively running toward a cliff. Human civilization, having started almost ten thousand years ago in the Fertile Crescent, may now be approaching its inglorious end. It may turn out that higher intelligence was a kind of “evolutionary mistake.” One of the theories put forward for why no intelligent life has so far been discovered elsewhere in the universe—the “Fermi paradox”—is that intelligent life may be a kind of lethal mutation that annihilates itself whenever it arises. We are a new species, having been around for a mere second in the evolutionary time scale, and so far we seem intent on proving the theory that intelligence leads to self-destruction.
We are now engaged in an experiment to determine whether humanity’s moral capacity reaches far enough to control our technical capacity to destroy ourselves. Unfortunately, the prospects look grim, and the observer might well conclude that the gap between moral and technological capacity is too immense to prevent species suicide.
But the observer could be mistaken. It’s up to us to prove this judgment wrong.
We don’t know that honest and dedicated effort will be enough to solve or even mitigate the problems we face. Still, we can be quite confident that the lack of such efforts will spell disaster. Freedom and democracy are by now not merely values to be treasured, but are quite possibly the prerequisite to survival. We therefore have only two choices. One is to say, “It’s hopeless. Let’s give up.” This guarantees that the worst will happen. The other is to say, “We want to make things better, so we will try.”
Given the urgency of the crises we face, there is no time to lose.