CHAPTER 5

Imperial War, Imperial Money

The Dollar’s Rise to Global Dominance

“The dollar is our currency, but it’s your problem.”

—US Treasury Secretary John Connally, 1971

It has been said that “one of the things that makes a nation a nation is its currency.”1 Consider in this light two of the earliest forms of money in the United States: Virginia’s tobacco receipts, and notes based on mortgaged land. The first, tobacco money, became legal tender in 1642 and persisted for nearly two hundred years. The second, currency issued by land banks, originated in South Carolina in 1712 and would soon be found in eight more American colonies. The first, of course, is a receipt on the product of slave labor, and the second is backed by lands appropriated from indigenous peoples. Across these monies, therefore, run traces of concealed sources of American capitalism: slavery and settler colonialism.

Typically, the historical study of money in the United States is posed in terms of a number of binary oppositions: paper money versus metallic coins; debtors versus creditors; state banks versus central banks. Undoubtedly, all of these oppositions are significant and important. But as explanatory matrices, they all elide fundamental questions of violence, expropriation, domination, and labor. They wash blood from money. This chapter reframes the origins of US currencies, beginning instead from their roots in war and subjugation.

Coined Blood: Scalp Bounties and the Violent Economy of Indigenous Displacement

“You white people get together and measure the earth and then divide it.”

—Too-schul-hul-sote, indigenous “dreamer”2

Land is, of course, foundational to capitalism everywhere. Primitive accumulation pivots on the theft, enclosure, and parcelization of the earth. But in the Americas, the metamorphosis of land from communal to private property was accomplished by centuries of war. Looting, scalping, raping, pillaging, shooting, burning villages—all the techniques of organized slaughter abetted settler colonialism. This New World marriage of war and economy was consummated through the displacement and elimination of indigenous peoples. “Territoriality is settler colonialism’s specific, irreducible element,” notes one scholar. “Settler colonialism destroys to replace”; its goal is the destruction of indigenous society and its replacement by settler colonies.3 To ground money in land banks, as did South Carolina, Pennsylvania, New York, and other states, was to root it in soil drenched with the blood of indigenous peoples.

Discussing banks based on real estate, Benjamin Franklin referred to their currencies as “coined land.”4 But before it could be coined—turned into a form of money—land had first to be privatized and commodified. And this was accomplished by way of what military historian John Grenier calls extirpative war. To extirpate, according to the Oxford English Dictionary, is to “eradicate or destroy completely.” Indeed, the Latin root is the verb exstirpare, meaning to rip out from the roots. Precisely this was the nature of the “Indian Wars” upon which the US state and economy were founded. This foundational violence was most frequently waged by militarized settlers, marauding groups of land-hungry Europeans. If anything, this heightened the murderous violence, as “rangers” and armed settlers had no respect for rules that were presumed to limit the targets or the tactics of military violence. Grenier’s description of the Indian Wars of 1607–1814 is apt: “For the first 200 years of our military heritage, then, Americans depended on arts of war that contemporary professional soldiers supposedly abhorred: razing and destroying enemy villages and fields; killing enemy women and children; raiding settlements for captives; intimidating and brutalizing enemy noncombatants; and assassinating enemy leaders.”5

An inflection point arrived with the French and Indian Wars (1755–64), part of the Seven Years’ War between Britain and France. As much as this conflict catapulted Britain to unrivaled global supremacy, it also consolidated unrelenting hatred toward Indians among Americans who fought alongside the British. War against indigenous peoples now became systematic, at the very time the independent commodity-producing society of the US Northeast was being subsumed into agrarian capitalist development. Settler colonialism now consolidated itself in the form of settler capitalism. All “primitive” accumulation of capital, as Marx reminds us, is based on the violent dispossession of people from the land.6 In the United States, this sort of primary accumulation was carried out by means of military expropriation of indigenous peoples, and became more intensive, extensive, and methodical following the French and Indian Wars.7 It was in this period that Virginia set out to seize lands as far west as the Mississippi by driving indigenous nations into the US interior, and when the Carolinas turned to a much more aggressive expansionism. As part of these processes, American troops destroyed the power of the Cherokee Nation in a campaign of indiscriminate carnage.8 The American revolt against Britain in 1776 only intensified these trends.

During the American Revolution and after, federal military forces moved to the forefront of “Indian removal,” taking over the role previously assumed by armed settler groups. In 1779, George Washington ordered his troops to undertake “the total destruction and devastation” of the Iroquois Confederacy.9 Forced relocation, bribery, exploitation of debts, and tribal animosities would all be tools to this end, deployed strategically by President Jefferson in a campaign of ethnic cleansing in the early 1800s.10 Monstrous chapters were written with the Louisiana Purchase (1803). Then, the War of 1812 against Britain drove militarized expropriation to ever-higher levels, creating the conditions for General Andrew Jackson’s murderous rampages through Georgia, Alabama, and Florida (1812–25), the conquest of Texas beginning in 1825, and that of New Mexico, Arizona, California, Nevada, Colorado, and Utah across the 1840s, followed by the systematic “cleansing” of the West after the Civil War.11 By 1887, indigenous peoples in the United States had been dispossessed of nearly three billion acres of land, or more than 98 percent of the land mass of the continental United States, in one of history’s most colossal and merciless processes of primary capitalist accumulation.12

The commodification of land was sealed in violence against indigenous bodies. And that violence, both material and symbolic, was directly monetized in the form of scalp bounties. Rewards for Indian scalps appeared in American colonial laws beginning in the 1670s. Massachusetts and South Carolina were among the most aggressive promoters, with the former offering ten pounds sterling for a scalp, about ten times the maximum day wage of a laborer. Even “pacific” Pennsylvania got into the act, regularly increasing scalp bounties as years went by.13 This grim commerce also excelled at something slavery pioneered: the reduction of persons to monetary sums. In the case of scalp bounties, this entailed a New World corpse economy, a morbid exchange between money and severed human body parts—scalps.14 Indeed, the currencies issued by colonial land banks were literally secured by redskins, as American settlers dubbed the bloody corpses they left to rot after scalps had been claimed. Land bank currencies were thus not merely coined land; they were also coined blood. And the same applied to most other monies issued by colonial states. A large number of colonial currencies, after all, were debt notes issued for war finance. And these debts were paid off with land sales once indigenous peoples had been pushed out of their environments. “In the final analysis,” remarks one commentator, “most Americans saw currency for what it was: a measure of the value of land … that states would sell once the shooting stopped.”15 It is instructive that America’s first major financial crash in 1792 was triggered by the defeat of the United States Army by Little Turtle and the Indians of the Western Confederacy in what is now northwestern Ohio. Since military defeat meant no new lands—and thus no land sales to pay off war debts—it immediately induced economic panic.16

These deep social connections between Indian wars, slavery, and early American banking are starkly revealed in the life history of America’s first president, George Washington, and of his friend Thomas Willing, probably the wealthiest man in Philadelphia from the mid-1790s through the War of 1812. These two men were not merely rapacious individuals. They were also “personifications of capital,” to use Marx’s term—individuals who personally embodied the social dynamics and behavioral norms of emergent capitalism.

In late 1755, Willing wrote from Pennsylvania to his cousin in London, inquiring about investing two thousand pounds in Bank of England stock. Often described as America’s first banker, Willing was fed up with the lack of vigor with which his colony’s leaders waged war against indigenous peoples. In the absence of a more aggressive military policy, he turned his eyes to England’s bank of war finance. A mere four months earlier, this future president of the Bank of North America (1781–91), and then of the Bank of the United States (1791–1807), had advertised for sale “a parcel of likely servant men and boys”—bonded laborers from Ireland, Germany, Wales, and England. It would not be long until enslaved Africans imported from the West Indies were added to his sales catalog.17

Biography rarely encapsulates the sweep of history with such clarity. Yet, in Willing’s case, banking, military contracting, slaving, and support for war against indigenous peoples were bound together in a personal trajectory that traced the path of capitalist development in colonial America. At the 1795 wedding of Willing’s daughter, attendees included a who’s who of America’s elite, including its sitting president, George Washington. If the first president had achieved glory on the battlefield, it was in large measure thanks to gunpowder, cannons, and thousands of arms provided by Willing to the troops Washington commanded against the British.18 The friendship between these powerful men also highlights the social connection between Philadelphia bankers and Virginia planters that underlay early American capitalism, a connection that was forged ever tighter through war and war finance. And for Washington personally, it was war that enabled his acquisition of land, and his entrée into the world of plantation production.

It seems fitting that the future president found his original profession as a surveyor, since military displacement of Indians generated a huge demand for the mapping of expropriated land. Like many in his trade, Washington found time to snatch up speculative holdings for himself, purchasing a thousand acres in the Shenandoah Valley in 1750. Then, before the decade was out, his marriage to Martha Custis made him one of northern Virginia’s largest landowners. To these estates he added twenty-five thousand acres, grabbed as reward for military service, much of it against indigenous peoples in the French and Indian Wars. Continued war service brought him forty-five thousand acres more in 1773. Yet, none of these lands were worth much without labor. And on Virginia’s large estates, the work was done by enslaved people of African descent. Always the social climber, Washington was not one to buck the practices of his class. At his death, the United States’ first president owned 277 bonded persons.19

Washington excelled in the two practices foundational to planter capitalism: indigenous displacement and African enslavement. We have observed his 1779 instructions, issued as commander of the Continental Army, that his troops should march on the Iroquois to bring about “the total destruction and devastation of their settlements.” His officers complied, destroying at least forty Iroquois towns.20 Just over twenty years later, Congress created the US Land Office, which financed real estate purchases. Measured in terms of its loan book to borrowers, the Land Office would soon be the world’s largest bank.21 As Indian wars and indigenous displacement took on continental dimensions, the United States rushed “like a comet into infinite space,” as one federalist critic put it.22 Land was now being coined on a scale Franklin would have found inconceivable. Yet, the inner secret of coined land and of American capitalism’s rush into infinite space was, as scalp bounties remind us, coined blood.

Revolution, War Finance, Capitalists, and Con Men

War pivots on finance. And it also breeds new forms of it. The American War of Independence against Britain was no exception. Revolutionary quartermasters, colonels, and treasurers in the colonies sought weapons, provisions, horses, and basic supplies on a tremendous scale. During the winter of 1777–78, American soldiers consumed almost 2.3 million pounds of flour and nearly as much beef. In the month of May 1778 alone, the army’s horses ate two and a half million pounds of hay and a quarter-million bushels of grain.23 Arranging the logistics of a war economy was the order of the day.

Inevitably, this pushed public credit to the fore, since wartime supplies were needed before they could be paid for. As they gathered weapons, food, ammunition, and supplies, colonial officials frantically issued promises to pay at a later date. These pledges, as we have seen, were ultimately backed by land that would be seized “once the fighting had stopped.”24 But many bills would come due long before then. To manage this, two-thirds of the cost of the Revolutionary War was funded by the use of bills of credit.25 To the good fortune of the Americans, a market in these promissory notes developed among investors in Paris and Amsterdam willing to bet on a victory for the colonial rebels (and on the land seizures that would accompany it). America’s government could thus receive cash up front in return for debt notes.

Massachusetts issued its first war bonds in May 1775. The following month, the Continental Congress started printing a national paper currency, known as continentals. Alexander Hamilton, the future Treasury secretary, was soon to propose a national bank with powers to print notes, coin money, receive deposits, and make private and public loans. A decade later, this proposal would come to fruition with the creation of the First Bank of the United States in 1791. But even without a national bank, the Americans found means of war finance in the form of “floods of paper money,” as future president John Adams was to put it.26 By the time the shooting had stopped, Congress had issued $226 million in notes, while an additional $100 million in paper currency had flowed from the states.

Americans had already demonstrated a unique fondness for paper money, much to the dismay of the imperial metropole, which repeatedly prohibited it (in laws of 1720, 1741, and 1751).27 But the Revolution scaled a new summit. Paper money would reign supreme in the United States for the next fifty years, notwithstanding widespread fetishism of precious metals as the only “true” money. And with proliferating paper currency came intensified monetization and precocious financialization. As in the ancient Greco-Roman world, war was a medium for monetizing social life—first, through the huge demand for market goods like provisions and weapons, and second, through wages paid to soldiers, many of them recruited from largely self-sufficient farms. A feedback loop developed in which government bought up foodstuffs for soldiers and horses; young men spent their army pay; and farmers, under pressure to pay off land loans or to acquire more land, pushed foodstuffs to the market. Meanwhile, rocketing demand for agricultural goods drove up wartime prices, just as wages rose due to labor shortages, brought on by the absence of farm boys due to military service. As monetary transactions expanded, so did financial institutions. As early as 1794, when four chartered banks could be found in the whole of the British Isles, the United States already hosted eighteen. By 1825, the United States had nearly two and a half times as much banking capital as did England and Wales. Riding this precocious financialization, bank assets as a share of aggregate US output rose steadily from 1785—hitting levels in the 1820s comparable to those that many countries reached only in the 1990s.28

This postrevolutionary surge of banks and paper currency provided fertile ground for an astonishing rise in the number of swindlers and con men, of the sort later depicted in Herman Melville’s novel The Confidence-Man (1857). Specialists in constructing “pawnshops for promises,” con men regularly bilked small investors in a plethora of get-rich-quick schemes.29 In so doing, they contributed to the mistrust of paper money that was such a powerful force in US financial history. Heading the pack of early nineteenth-century experts in the art of the con was Boston-based Andrew Dexter Jr., who began building his paper money machine in 1804. Dexter chose Rhode Island, the mecca of paper money, for the launch of his innocuous-sounding Farmers Exchange Bank. From there, he churned out tens of thousands of notes, which he sent as far from their source as possible, thus delaying, if not preventing, their redemption for specie.30 En route, he set up several more banks, including one in Detroit and another in Pittsfield, Massachusetts—each an instrument for the production of baseless banknotes. Amazingly, Dexter’s pyramid scheme thrived for five years before it all came crashing down. By the time it did, in 1809, the Farmers Exchange had issued over $760,000 in notes, backed by a mere $86 in specie.31

The panic associated with Dexter’s meltdown was one element in a larger reaction against paper money and national banks. So intense was the early nineteenth-century revulsion against paper money that it also brought down the First Bank of the United States (BUS), which had faced hostility from its inception. Created in 1791 as a component of Alexander Hamilton’s program for an activist state promoting capitalist development, the Philadelphia-based BUS was modeled on the Bank of England.32 From the start, powerful Virginia tobacco planters opposed Hamilton’s national bank. Immersed in an economy where they controlled finance as well as production of a global export commodity—tobacco—these planters mistrusted any shift of financial power out of the Old South. Republicans like Thomas Jefferson inflected this opposition with anti-centralist rhetoric, and in 1801 they created the US Land Office as an alternative. Then, in the wake of monetary scams and financial panics, an awkward coalition of agrarian populists, financial capitalists outside Philadelphia’s Chestnut Street, and working-class radicals all turned their enmity on the bank and the very idea of central banking. By 1811, the BUS could not muster enough support in the Senate to get its charter renewed. With the elimination of the BUS’s power to regulate banking, new banks could now multiply like weeds.

In a single legislative act, Pennsylvania created forty-one new banks in 1814. Three years later, Kentucky wished forty banks into being, with a nominal capital of ten million dollars—though in truth, they had not a coin in their treasuries. These were soon known as “caterpillar banks,” as they were said to gobble everything in their path. A country that had 114 banks in 1811 boasted 256 five years later. Despite official bullionism, the United States was blatantly a country of footloose paper currencies, a mecca of easy money. The more than two hundred banks to be found in 1815 reported a combined $82 million in capital, only one-fifth of which was backed by silver and gold.33 By this time, canal companies, railways, blacksmiths, and various academies were also issuing a dazzling array of notes that circulated as money. American capitalism had set off down the road of fragmented finance. Yet, down that road also lay the potholes of scams and panics that regularly provoked anti-banking fevers, and a fetish of precious metal.

Indian Hunting, Market Populism, and the Rise of Wall Street

No sooner had the First Bank expired than support for a strong central bank was renewed, following the trauma of the War of 1812—which saw the British burn down the White House, provoking a financial panic during which banks suspended conversion of notes into specie. With public expenditures running two to three times higher than government income, it required substantial sales of interest-bearing Treasury notes to finance the war. Accepted as legal tender for all government transactions, including taxes, these notes became a crucial part of the money supply.34 In 1816, shortly after the war’s end, Congress approved the launch of the Second Bank of the United States, though it remained little more than an accessory to the Treasury, the real central bank at the time. But the Second Bank soon found itself with half of all bank-held specie, and it began to assume many of the coordinating and regulating functions of a modern central bank. However, its history was plagued by crises, scandals, and intensifying political opposition. Particularly during the tenure of Nicholas Biddle as BUS president (1823–36), with the United States in the throes of intensifying capitalist transformation, the bank became a lightning rod for social grievances against moneyed interests and Washington officials. The acclaimed Indian killer, Andrew Jackson, shrewdly mobilized these sentiments on his road to the presidency, fostering a market populism that extolled economic individualism while condemning monstrous bankers and bureaucrats. Dynamic capitalist development in the United States was thereby joined to a fragmented, decentralized, and largely unregulated banking system.

It must be underlined that monetary fragmentation did not significantly hinder capitalist accumulation in the United States. There has been a widespread tendency, particularly since the onset of global financialization in the 1970s, to treat finance as the prime mover of capitalist development, a view that all too easily meshes with neoclassical conceptions of capitalism as a “money economy” fueled by individual property rights.35 Such perspectives miss the vital sources of capitalist growth in labor, exploitation, and accumulation of means of production. For it was as a powerful machinery for harnessing human labor that the US economy thrived in the decades after 1800, notwithstanding its highly localized financial system.

A growing body of research has demonstrated that this phase of vigorous capitalist growth had fundamentally agrarian roots. Far from “revolutionary” market forces overturning a “conservative” economy and culture based on land,36 American capitalism developed on the basis of landed production. Where dispossession of indigenous people had been largely completed, the transition from independent farming to agrarian capitalism occurred in large measure through household production, rather than via its eradication, as had been the story in Britain with the expropriation of small tenants. In the US case, family farms were rendered market-dependent via the effects of land prices, mortgages, and market pressures. Farmers were increasingly compelled to produce monetizable cash crops in order to make debt and mortgage payments to government and banks, and to purchase farm implements and household goods. All of these capitalist relations subjected petty commodity producers to the imperatives of the market. By the 1830s, farmers in the Ohio Valley, drowning in debt, were in a state of insurgence against banks in general, and the Second Bank of the United States in particular.37 If it is true, as historian Jonathan Levy asserts, that “after 1870, mortgage debt pressured farmers into growing the product that brought in the most cash” in states such as Kansas, Minnesota, Wisconsin, Nebraska, and the Dakotas, it is also the case that in Ohio and areas of the Northeast this transition had begun decades earlier.38

Well before the outbreak of the Civil War, a market-integrated, commodity-producing agriculture held sway in the dominant regions of the US economy, generating monetized surpluses and increasing demand for manufactured goods.39 This in turn stimulated the large-scale industrial manufacture of shoes and textiles, especially in Massachusetts; the development of water-driven mills and cotton-spinning machinery; the concentration of urban populations; and the expansion of roads, turnpikes, and canals—to be followed by steamboats and railroads.40 While powerful internal transformations propelled these developments, European wars again intervened—this time the conflicts of 1793–1815 provoked by the revolution in France—enabling US shipping to emerge as the world’s premier mover of world goods, and enticing European investors back to American markets.41 American growth further heightened the country’s attractive power to immigrants. Population soared from 3.9 million in 1790 to 9.6 million twenty years later, just as capitalist industrialization pushed the number of cotton mills from fifteen to eighty-seven in the space of four years, at the same time as the quantity of spindles increased tenfold. Beginning in the 1790s, the corporate form of organization emerged in the North, soon becoming widespread. By 1861, American states had incorporated over twenty-two thousand enterprises, making the United States the original “corporation nation.”42 The precocious rise of the corporation also fostered the growth of finance, as joint-stock firms took out loans and issued equities, bonds, and other securities. This was the stimulus for a wave of new banks, whose numbers jumped from four in 1791 to 250 by 1816.43 Banking in the United States was thus a major beneficiary of a feverish process of social and geographic expansion of commodity production and trade, alongside the corporatization of American business. None of these processes was unduly hindered by the fragmented character of US banking.

It would be easy to imagine that the localism of US banking owed much to peculiar forms of finance in the Southern slave states. But, as much as the market in enslaved people lent distinctive features to finance in the South, banks there were tightly connected with both Northern and British mercantile groups. In fact, given that cotton was the world’s most widely traded commodity by the 1830s, banking in the South was a force for financial integration, not fragmentation. The same was true for the commerce in enslaved people.

Enslaved people were the largest capital investment in the Southern economy, and slave trading was a powerfully rationalized business. Bonded persons were widely used as collateral in debt transactions, from the purchase of shares in Louisiana banks to the contracting of a mortgage. In the Louisiana parish of East Feliciana, enslaved people secured 80 percent of antebellum mortgages.44 In addition to collateralizing investments, enslaved people comprised one of the commodities most actively bought and sold across the South, and banks were keen to provide funds to businesses engaged in buying and selling bonded persons. In the case of the Bank of North Carolina, perhaps two-thirds of its loans were made to slave traders.45 Significantly, those states in the Deep South that imported the most enslaved people—and thus had the most active markets in bonded persons—were also the most monetized. So much did slave markets foster banking that by 1840, Louisiana, Mississippi, Alabama, and Florida were circulating more bank money per capita than any other US states.46 In this respect, as in many others, there was nothing premodern about the Southern economy.47

As indicated, Southern slave-based banking was thoroughly integrated into financial markets in the eastern United States, as well as the global market based in London. Baltimore’s premier merchant bank, Alexander Brown & Sons, eventually the nation’s second-largest mercantile exchange, connected investors in Liverpool, London, South America, Africa, and beyond to the purchase and sale of cotton and enslaved people.48 Lending in the Mississippi Valley Cotton Belt was dominated by the Second Bank of the United States, thus incorporating slave and cotton finance into the monetary circuits of America’s de facto central bank.49 Many banks originating in the South, like the Consolidated Association of Planters of Louisiana, issued loans backed by collateralized enslaved people (loans financed by selling bonds to Baring Brothers of London).50 Slave trader Jean Baptiste Moussier, working out of New Orleans, partnered with a Virginia bank to build an interlocking slave-trading network that featured branches in New York, London, Le Havre, and New Orleans.51 Banking in the American South was thus integral to international capital flows that dealt in financialized instruments secured by enslaved bodies.

Notwithstanding its regional specificities, therefore, Southern finance was not the source of the uniquely fragmented character of US banking. More significant was the way in which intra-elite regional conflicts converged with popular grievances against the “market revolution.” The latter was articulated into a market populism that channeled subaltern protest into the political insurgency of Andrew Jackson.

With Philadelphia bankers controlling both the First and Second Banks of the United States, financiers in cities such as New York, Baltimore, Boston, Richmond, and New Orleans frequently bristled at the privileges conferred on their competitors in the City of Brotherly Love. They eagerly joined the chorus declaiming the monopoly powers of Philadelphia bankers, a chorus that grew amid the heightened social tensions over gender, race, and class inequality that accompanied the “market revolution” of the early nineteenth century.52

Andrew Jackson would be the ultimate beneficiary of the anti-elite sentiments of this period. The first presidential candidate of non-gentry background, Jackson was the offspring of rough-and-tumble Scotch-Irish immigrants who populated the Carolina backcountry. Theirs was a culture of patriarchy and white supremacy, and Jackson himself owned fifteen enslaved people by the time he was thirty. An abject failure as a planter, merchant, and land speculator, the violent and short-tempered Jackson found his calling as a killer of indigenous people. The War of 1812 was a turning point for the Indian hunter, as it was part of a pivot by the United States toward more concerted practices of conquest, expansion, and ethnic cleansing. Jackson played a decisive role here, massacring indigenous peoples while seizing a fifth of Georgia and three-fifths of Alabama in the process—twenty-three million acres in all.53 Presidents Monroe and Adams would now turn Indian eviction, dispossession, and relocation into systematic state policy.54 But it was Jackson’s conquest of Florida (1813–18)—which he snatched in a frenzy of slaughter from the Creeks, the Seminoles, and defiant African Americans (as well as the imperial claims of Spain)—that raised militarized displacement to unprecedented heights, while cementing his political reputation. When a group of congressional leaders turned against him for effectively declaring war against Spain on his own, Jackson rode to Washington and rallied supporters to reject resolutions condemning him, galvanizing the movement that carried him to the White House in the election of 1828. Blatantly anti-Indian aggression and belligerent continental expansion were now the order of the day, symbolized in the decision of the new state of Mississippi, created on the ancestral lands of the Choctaws, to name its capital “Jackson.”

Jackson’s violently anti-indigenous sentiments were entirely wanting in originality. Even his wedding of bellicose expansionism to an anti-elite discourse that targeted the Second Bank of the United States lacked inventiveness. After all, hostility to banks was emblematic of democratic politics in the first decades of the nineteenth century. Farmers struggling with mortgages, working-class radicals protesting exploitation of labor, and Jeffersonian Republicans displacing their qualms about slavery—these groups and others singled out banks for special reproach. The six new states that joined the union between 1816 and 1821—Indiana, Maine, Illinois, Mississippi, Alabama, and Missouri—all adopted comparatively democratic constitutions (subject to predictable racial and gender exclusions). They also all noticeably constrained banking.55 No doubt, Jackson’s hostility to bankers was genuine. But it was also shrewd politics, for it enabled him to channel subaltern grievances of farmers and workers—who were undergoing the pressures of capitalist transition—into an expansionist program of indigenous displacement that glorified the virtues of the white male yeoman farmer and artisan. Class antagonisms could thus be mobilized against entrenched privilege of the sort represented by the Philadelphia bankers who dominated the Second BUS, while glorifying a market individualism based on white male producers. The strident campaign Jackson waged against the Second BUS during his second presidential term (1833–37), which included withdrawing all its federal deposits in 1833, effectively destroyed the institution, whose charter was not renewed in 1836. Yet, as much as this appeared as a victory for popular forces, the destruction of the bank was in fact largely “a blow at an older set of capitalists by a newer, more numerous set…. Destruction of the bank ended federal regulation of bank credit and shifted the money center of the country from Chestnut Street to Wall Street.”56

In addition to shifting the center of financial power, the demise of the Second Bank, like that of its predecessor, opened the floodgates to pell-mell creation of new banks and new paper monies. There was an irony here, since Jackson was a “hard money” man, as manifest in the “specie circular” he and his Treasury secretary issued in 1836, which directed federal land agents to accept only silver and gold in payment for relatively large parcels of public lands. Yet, an expanding US capitalism could not function on the limited monetary resources of gold and silver bullion and coin. So, having destroyed the effective central bank of the United States, the hard-money president unwittingly oversaw a manic proliferation of paper, much of it produced by so-called wildcat banks, operating on next to no capital and prey to counterfeiters (that is, when they were not generating funny money themselves).

A total of 379 banks had produced paper money by the time Jackson made his concerted moves against the Second Bank in 1832. The number leapt to 596 in 1836, the year renewal of the bank’s charter was denied, and it jumped again to 711 in 1840.57 By 1860, some seven thousand different banknotes could be found circulating in the United States, courtesy of one thousand six hundred state banks. Traveling alongside those were up to four thousand counterfeit issues.58 There may be some poetic license in describing the United States as a “nation of counterfeiters” at this point. But it is certainly fair to say that waves of frauds, scams, and wildcat banks, along with a dizzying proliferation of paper money, generated profound anxieties. Rather than providing a trusted means of exchange and social interaction, money inspired abiding uncertainties about financial institutions, bankers, politicians, the salesperson in the shop, and the medicine man and his elixir. In a society overflowing with people (and currency) that pretended to be something other than what they were, epistemological doubt flourished. As in Melville’s The Confidence-Man, everyone’s identity seemed in doubt, never mind the veracity of their stories or the authenticity of the wares they peddled. Systematic suspicion appeared to be an entirely rational attitude.

It is all too convenient, however, to conclude that the market magically conjured up a dynamic order out of this financial chaos. In reality, it was the antebellum US state that produced financial order, by providing a greater degree of monetary regulation than most historians have appreciated. True, American capitalism lacked a typical central bank at the time. But the independent Treasury fulfilled many of its functions. We have already seen the decisive role of Treasury notes during the War of 1812; and these were also widely deployed in response to the Panic of 1837 in order to stimulate the economy, and later, to cover deficits incurred during the Mexican-American War (1846–48). In providing monetary stimulus during panics and in financing wartime deficits, the Treasury performed key central bank functions. Congress authorized such practices in both 1840 and 1846, when it passed Independent Treasury Acts as an alternative to sanctioning a national bank.59

Notwithstanding the regulating role of the Treasury, financial fragmentation had more than a few defects. Not only was it a complicated business to properly assess the relative values of hundreds of different banknotes, but the plethora of largely unregulated banks also fostered scams and frauds. The latter contributed to speculative bubbles and painful panics like the 1837 crash, and its successor twenty years later.60 Moreover, market populism’s hostility to central banking and capitalist developmentalism—which promoted state-supported infrastructure, like roads, canals, and railways—was far from optimal for the common interests of capital. A bureaucratically centralized state capable of military defense and expansion, the management of economic crises, the construction of transport infrastructure, the domination of labor, and the suppression of oppressed groups is very much in capital’s general interests, so long as its interventions do not impinge upon the rights of property and the “liberties” of business. Such a state is an integral aspect of the capital relation, expressing and managing the forms of alienation unique to capitalist society and the conflicts it generates through institutions of impersonal power. Excessively fragmented state powers are generally suboptimal to this project, and they carry the risk that conflicts between regionalized part-sovereigns can become militarized.61 This was to be the story of the United States in the 1850s and 1860s, when aggressive expansionism on the part of the slavocracy provoked armed conflict. And from that conflict was born a new centralization of money and power.

Blood on the Fields: Civil War and the Making of American Money

“Success in crushing the rebellion and maintaining the Union is much more a financial than a military question.”

—Ohio Senator John Sherman, 186362

War makes states, just as it makes money. In the American case, it was the Civil War (1861–65) that “nationalized” the state and the banking system—a metamorphosis that began, predictably, with war finance.63

On April 13, 1861, Fort Sumter fell to Confederate soldiers from South Carolina defending the slavocracy. Full-fledged war was now unavoidable. Within weeks, President Lincoln ordered up 83,000 troops for the army and navy—numbers that would soon represent mere drops in the buckets of blood. When Congress met in July, the president requested “at least four hundred thousand men and four hundred millions of dollars.”64 No US government had ever imagined something on this scale, never mind implemented it. At the time Sumter fell, Lincoln commanded 16,000 troops. He was now requesting twenty-five times as many. Before 1861 was out, he would have 600,000 soldiers in uniform. Six months later he would add half as many again—all prior to the self-recruitment of nearly 200,000 African Americans into the Union Army.65 As new troops swelled the ranks of the army, war costs skyrocketed. During Lincoln’s first fiscal year, which closed on June 30, 1861, US government expenditures totaled $67 million. A year later they had climbed to $475 million, topping out at $1.3 billion in 1865—a level they did not reach again for over fifty years.66

While the Union government managed to improve and augment its revenue collection, taxes could not possibly keep pace with the mushrooming costs of war. Lincoln’s tax revenues comprised only about one-quarter of what his government spent. The rest had to be covered by borrowing (selling bonds), or by printing money and declaring it legal tender. And even this required significant increases in taxes, since bonds can be sold only if creditors believe in the government’s capacity to make its interest payments. But war could not have been waged, never mind won, without innovative debt instruments and new forms of money. And these were underwritten by a tremendous expansion of state powers.

It has rightly been said that two critical acts of 1862 “advanced the national government’s powers far beyond what had ever been ascribed to it before.”67 One, the Revenue Act of July 1862, enshrined federal powers of taxation, laid the basis for income tax, and radically expanded the powers of the federal government—all critical to the maintenance of the state’s credit in money markets. But it was an earlier bill, the Legal Tender Act of February 1862, that had inadvertently launched a revolution in money and finance.

From the early days of the Civil War, the Union had been issuing Treasury notes as a means to raise funds. One version of these, known as demand notes, did not pay interest and was used widely as currency for everyday payments. But the Treasury secretary, Salmon Chase (another gold and silver bug), resisted making these notes legal tender. Doing so would have made paper equivalent to bullion, fully usable in all monetary transactions, and would thereby have eliminated many difficulties of war finance. Unwilling to take this course—which would eventually become inevitable—Chase instead clung to the idea of gold as real money. In selling his Treasury notes, therefore, he insisted that banks buy them with specie. Yet, this meant no net expansion of the money supply. All it meant was that gold moved to the government from the banks, with the latter holding Treasury notes in return. In not making its notes legal tender, the government failed to increase the circulating medium with infusions of state credit money. The results were predictable: within months of war, commercial bank purchases of Treasury notes had exhausted their supplies of gold. With gold disappearing, government suppliers went unpaid, and Treasury notes were heavily discounted in money markets (since, in the midst of a gold shortage, sellers would accept amounts of gold much smaller than the face value of their notes). Something had to give.

Once again, war dictated monetary innovation. If the war was to be prosecuted, the sacred status of precious metal would have to be sacrificed, and paper made as good as gold—simply by fiat of the state.68 Unpalatable as this might have been to many politicians, it was either that or the collapse of war finance. Introducing debate on the Legal Tender Act on January 28, 1862, Representative E. G. Spaulding declared the bill “a war measure.” He continued, “In carrying on this existing war … it is necessary to exercise all the sovereign power of the government to sustain itself.”69 Contra Jefferson and the republican tradition, this meant the constitutional right of the federal government to create fiat money. In a further repudiation of the Jeffersonian vision, the war state ushered in an unprecedented enhancement of federal power. Supporting the Legal Tender bill, Radical Republican Thaddeus Stevens intoned, “If no other means were left to save the Republic from destruction, I believe we have the power under the Constitution and according to its express provision, to declare a dictator.”70 And if Congress possessed the right to appoint a dictator in times of national emergency, it most certainly had the authority to create a national currency. And so, the famous notes known as greenbacks were born, backed by nothing more than the state’s promises to pay—but enforced as legal tender by an act of government.

Lincoln signed the Legal Tender Act into law on February 25, 1862. All individuals were required to accept both the $100 million in new greenbacks the government was furiously printing and the existing $50 million in demand notes. While greenbacks had a predecessor in Treasury notes, their scale gave them a colossally greater impact. Treasury notes had been of marginal importance, fluctuating in volume between $3 million and $20 million. Before the war was over, more than $450 million in greenbacks were in circulation. Alongside those went $300 million in national banknotes, which were backed by US government bonds. At one level, the US government now had something equivalent to the currency issued by the Bank of England—legal tender notes backed by the credit of the state. But whereas pound notes were legally convertible into gold, Lincoln’s government went a major step further, in imitation of British wartime practice, by prohibiting convertibility of greenbacks into specie—a suspension that would in fact persist for seventeen years, long past the end of the conflict. The United States was thus operating with fiat money, pure and simple. Nevertheless, as the New York Times reported on April 14, 1862, greenbacks were immediately accepted with “universal confidence” and “esteemed in all respects the equivalent of gold.” Four weeks later, the paper announced that the new legal tender notes had won acceptance as “universal currency.”71

Many commentators were befuddled as to how this could be so. In May, the Economist declared the success of the greenbacks to be incomprehensible. Yet Karl Marx, observing the situation from London, was not in the least surprised. The triumph of “the paper operations of the Yankees,” he wrote, derived from three social factors: confidence in Lincoln’s government and its cause; the desperate need for currency in the US West; and the Union’s favorable balance of trade.72 He might have added that the latter—a result of the North’s advantage in international trade—was a product of the vitality of its agrarian and industrial capitalism. In 1860, for instance, 110,000 manufacturing enterprises were active in the North, compared to 18,000 in the South. The South possessed no machine shops that could build marine engines for its navy, while during the war the Union constructed 671 warships, 236 of them steam-powered vessels. Its industrial base enabled the North to manufacture 1.7 million rifles; the South produced barely any. Rather than being sent into disarray by civil war, the Northern economy surged industrially, as Congress provided massive land grants to launch the Union Pacific and Central Pacific railways. Railroad expansion stimulated domestic steel production, and the first commercial steel using the Bessemer process appeared in 1864. Output in industries such as iron ore, machine tools, wool, and lumber jumped two to three times between 1861 and 1865. And while the Southern economy sputtered, productivity rose in the North in both agriculture and manufactures, a key reason that “Northern soldiers were probably better fed and supplied than any army in history.”73

An expanding economy supplied the army and underpinned war finance, via taxes and secure borrowing. Indeed, the industrial and commercial power of the North (and its capacity to raise taxes) conferred on it the capacity to finance war with fiat money. Even a tripling of the money supply from 1860 to 1865 did not create monetary instability. In the Confederacy, on the other hand, the more than $1.5 billion in notes (“graybacks”) pumped into circulation plummeted in value rapidly—not only because they were not legal tender, but also due to economic dislocation and financial disorder. By 1864, one Confederate officer related that graybacks had “ceased to have even a nominal value.”74 In some respects, the North’s victory was sealed on its factories and farms, its railway lines and steel mills. Notwithstanding strategic bungling and political hesitations on the part of the Union, its economic resources enabled it to wage war for as long as proved necessary. As financial scholar Bray Hammond observed: “In the North an industrial system gave assurance of the necessary production of supplies, and required mainly a reform in the system of payments. In the South, a reform of the system of payments would accomplish nothing, the means of production being at want.”75 That industrial system was the economic foundation of Northern victory. In political-military terms, however, it was the self-emancipating activity of African Americans, including the “general strike of the slaves” and the entry of two hundred thousand black troops into the Union Army, that would prove decisive.76 But whereas black insurgency would be broken in the Reconstruction era, the key wartime monetary transformations that underwrote military success would persist, sometimes in modified forms, in the capitalist economic expansion that followed Union victory.

These transformations required, however, that the banking system be “nationalized,” and the state banks brought to heel. The federal government would thus have to become the regulator of the American financial system. Lincoln pushed this process in a letter of January 19, 1863, which advocated “a uniform currency”—rather than the cacophony of hundreds of contending bills and notes—to be supplied by a new system of federally constituted banks. This letter, intriguingly, was a reply to the “workingmen of Manchester,” in which the president felt obliged to assert the vigor with which his government was developing the state powers necessary to win the Civil War.77 The real heavy lifting on the banking front was done, however, by Ohio Senator John Sherman.

Nothing fosters state centralization like war, and it was not long before Sherman laid out a program to nationalize and centralize government authority. The Ohio Republican steered through the Senate a bill inaugurating a system of national banks, pronouncing, “The policy of this country ought to be to make everything as national as possible, to nationalize our country.”78 Commitment to defeating the Confederacy now made political and economic fragmentation anathema. And Sherman happily took the lead among the centralizers. When many state banks continued to print their own notes rather than switch over to greenbacks, he launched a tax offensive against them. When his initial 2 percent tax did not bring the state banks to heel, he pushed through legislation raising it to 10 percent. The government was now dictating what would and would not function as money. Determined to annihilate the notes issued by state banks, Sherman invoked constitutional powers. “It was the intention of the framers of the Constitution,” he urged, “to destroy absolutely all paper money, except that issued by the United States.”79 Early the next year, as the 10 percent tax on currency-issuing state banks took effect, large numbers of these banks quickly nationalized themselves, relinquishing their own currencies and agreeing to exclusive use of greenbacks and national banknotes.80 American finance thus revolved around two national currencies, both issued by government—United States notes (greenbacks) and national notes tied to government bonds. American money was now genuinely the product of the national state.81

Not Yet World Money: The Dollar, Gold, and the Creation of the Federal Reserve

If American money was now national, it was still a long way from being global. For the dollar to become a principal currency of international business, multiple transformations would be necessary—economic, political, and institutional. Over the half century that followed the Civil War, these would all be put in place. Yet the processes of change were typically piecemeal and confused, sometimes outright ludicrous. Having created a national fiat money, America’s rulers now tried to get rid of it by restoring convertibility of banknotes to gold. Yet so ham-fisted were the efforts that it took them nearly fifteen years to pull it off.

Immediately after the Civil War, the government began retiring and destroying greenbacks, just as economic growth turned up and state revenues leapt higher. The money supply thus contracted, while demand for money rose in response to the rising volume of transactions. The result was predictable: as the money stock declined by about 7 percent a year, so prices dropped by 8 percent annually from 1866–68.82 Declining prices are almost always disastrous under capitalism, as investors and individuals postpone purchases and investments in order to take advantage of the lower prices expected next week, next month, next year. The economic effects were worst in the South and West of the United States, which were cash-starved long before hostilities between the Union and the Confederacy had ended. These regions strenuously opposed the resumption of dollar–gold convertibility and frequently advocated for silver to become an additional component (sometimes the primary one) of the money supply. Yet, notwithstanding widespread public hostility, the party of resumption eventually prevailed, though it was not until 1879 that the convertibility of dollars for gold came into effect.

The ruling-class push for resumption owed something to a concern that government fiat money might get recklessly out of control—with the stuff being churned out at a rate that would fuel price inflation. But the most farsighted capitalists had an additional concern. They knew that while a state might impose whatever it liked as legal tender within its borders, it had no such authority in world markets. In short, they understood, to recall Hobbes’s words, that there had still to be some “measure of the value of all things else between Nations.”83 It followed that if the United States was to be a top-tier player in international markets, it needed a currency that carried global legitimacy, one that might become a recognized instrument of global finance.

We live in an age that is bedazzled by finance. It is essential to remind ourselves, therefore, that what brought about the internationalization of the dollar was a dynamic process of capital accumulation in industry, agriculture, and transportation. Agricultural expansion remained vital to these developments after the Civil War, although it was now increasingly integrated with finance and burgeoning manufacturing industries. During the 1870s, farm acreage grew by 44 percent. This extensive growth dovetailed with increases in the productivity of agricultural labor to generate often-staggering rises in output. “Between 1866 and 1886 the corn produced in Kansas rose from 30 million bushels to 750 million. In 1880 the wheat crop of North Dakota was not quite 3 million bushels. In 1887 it passed 60 million. These figures had no historical precedent.”84 Between the end of the Civil War (1865) and the 1898 Spanish-American War, US wheat production jumped by more than 250 percent, and the output of corn by over 220 percent.85 All of this was tied, of course, to industrial transformations of the landscape: canals and waterways, steamboats, the telegraph, and—most dramatically—the railways. Over the course of the 1850s, the national railroad system more than tripled in size, from nine thousand miles of track to thirty thousand. And no city benefited more than Chicago, by this time a hub of railways, grain trading, meatpacking, and finance.86

The expropriation of land from indigenous peoples also lay at the heart of Chicago’s rise. On the backs of that dispossession came a half century of truly remarkable growth beginning in the 1830s. Again, military conflict was crucial. During the Crimean War (1853–56), US wheat exports to Europe soared; Chicago’s wheat shipments tripled. The city’s board of trade had been organized in 1848, the same year that telegraphs reached the city, enabling rapid dissemination of price information and deeper financial integration with New York, the most important business center in the country. The city’s evolution into a railway hub expanded all its networks of industry and commerce, and by 1852, two rail lines linked Chicago to New York. Then, the Civil War worked its economic transformations. The Union Army’s insatiable appetite for oats and pork kindled huge rises in production and exchange of these goods. As demand soared, Chicago’s nine largest railroad companies joined with the city’s Pork Packers’ Association to develop the mammoth Union Stock Yard and Transit Company. Building thirty miles of drainage, and siphoning half a million gallons of fresh water per day, the consortium operated ten miles of feed troughs. This industrial system covered sixty acres of city land and funneled one hundred tons of hay (as well as corn) each day to five hundred pens. These titanic increases in the production and exchange of grain, animals, and packed meat also excited a plethora of speculative trades, as financiers gambled on the ups and downs of prices. This was the origin of futures trading, as the Chicago Board of Trade pioneered markets in financial pseudo-commodities based on bets about future prices (a development to which we shall return). Fittingly, it was in 1865, the year the Civil War ended, that the CBT introduced standardized rules governing futures trading.87

Complex financial markets developed in tandem with so-called financial banking, which enabled businesses to procure loans backed by securities—primarily stocks—rather than simple promises to repay, such as promissory notes or commercial paper. In the decade after 1896, loans by New York banks and trust companies secured by stock exchange collateral jumped by over 200 percent, rising three times as fast as loans tied to an enterprise’s commercial paper. Within a few years, roughly 60 percent of the loans made by New York banks were backed by negotiable securities.88 Not only did US capitalism witness, as we have seen, the most widespread adoption of the corporate form; it also developed some of the most intricately structured financial markets in the world.

What was emerging in the United States, as illustrated in the case of Chicago, was a dynamic symbiosis between agriculture and manufacturing, in which finance served as a leavening agent. Between 1869 and 1883, the American economy grew faster than ever before, at a rate of about 9 percent a year. Railroads were at the heart of this boom, with more than one hundred sixty-two thousand miles of track laid in the forty years after 1860.89 By 1890—twenty-five years after the end of the Civil War—US capitalism produced more steel than did Great Britain. And a decade later, its total manufacturing output had also overtaken that of Britain. Meanwhile, the American economy had entered a phase of furious concentration and centralization of capital. Huge conglomerates emerged, like the Standard Oil Trust (1892), the US Steel Company (1901), and General Motors (1908). By 1902, there were nearly one hundred industrial firms with capitalizations in excess of ten million dollars—a size that was extraordinarily rare just a decade earlier. So overweening was the power of giant corporations that by 1909 nearly two-thirds of all manufacturing workers labored for fewer than 5 percent of all industrial enterprises.90

The dynamism of post–Civil War American capitalism and its growing financial sophistication positioned the US economy for a jump to global status. Booming exports of manufactures and semifinished goods delivered a favorable balance of trade virtually every year after 1873. But this rise posed significant questions about the US monetary order. Among other things, global power in capitalist society works through world money. Capitalism involves a hierarchy of monies—from those with the most limited range of exchangeability (e.g., tokens accepted only at a particular store) to those of universal equivalence—in which the more limited have less social force than the most general. In Great Britain under the gold standard, for instance, country banks issued their own currencies backed by the private notes of London banks and Bank of England notes, while also holding gold in reserve. London banks, in turn, held a combination of Bank of England notes and gold, while the Bank of England’s official reserves consisted entirely of gold. Gold thus sat atop the monetary pyramid in Britain; and as world capitalism went to the gold standard, the yellow metal became the ultimate expression of world money. This is not to say that gold was the medium of everyday commerce; banknotes and coins typically played that role. Rather, gold was the measure of world value as well as its embodiment (as the store of global value). It was the means of comparing national prices and earnings and, thereby, of globally measuring what one nation-state owed another in the course of world trade and investment flows (tallied via each state’s balance of payments with every other). Great Britain had fixed the price of an ounce of gold at £4.247, while the US government set its value at $20.671. This determined a rate of exchange between dollars and pounds based on gold. Each major currency had a gold value, which provided for straightforward conversions of one currency into another through the medium of gold. “It was almost as if the gold-standard world possessed a single international money,” remarks one historian.91 There is in fact no “as if” about it. Gold was the ultimate means of payment in international transactions—the only one that was fully universal.92 Assessing the role of gold in 1870, the majority report of a US congressional committee pronounced, “For all purposes of internal trade, gold is not money … but for all purposes of foreign commerce it is our only currency.”93
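The arithmetic of those two parities is worth spelling out. Because both governments defined their currencies as fixed quantities of gold, the mint prices quoted above implied a fixed rate of exchange between the dollar and the pound; the following is a minimal worked calculation using only those two figures:

\[
\frac{\$20.671 \text{ per ounce of gold}}{\pounds 4.247 \text{ per ounce of gold}} \approx \$4.87 \text{ per } \pounds 1
\]

This is the familiar gold-standard par of exchange of roughly $4.87 to the pound; under convertibility, market rates could stray from it only within the narrow margin set by the cost of shipping bullion.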

However, at the end of the Civil War, the United States was not on the gold standard: it had a pure fiat money until 1879.94 And even after its return to gold, strong forces continually pressed for a silver standard, or a bimetallic standard (gold and silver). Indeed, the Silver Purchase Act of 1890 remonetized silver by requiring the government to undertake regular purchases of the metal as a way to augment the money supply. Yet, bimetallic standards are inherently fraught, as developments in private markets shift the relative values of each metal away from their official government values. This inevitably leads to a drain of one metal at the expense of the other, as speculators exploit deviations of market prices from the official rate of conversion (which is often set in law) between the two metals. If the market price of silver falls 5 percent below the official rate, for instance, then investors—including foreign ones—will trade silver for gold at government windows (and pocket the extra 5 percent in gold). In the 1890s, this would have meant a persistent drain of gold away from the United States, moving it, sooner or later, to a silver standard for its currency as gold reserves declined. However, those sections of American business that were eyeing a jump to global status knew that their project required a dollar tied exclusively to gold.
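A stylized example may help to fix the mechanics of that drain; the round figures are illustrative only, not the actual mint ratios of the 1890s. Suppose the official conversion rate values a given lot of silver at $100 in gold, while the open market prices the same lot at 5 percent less:

\[
\underbrace{\$95}_{\text{silver bought at market price}} \;\longrightarrow\; \underbrace{\$100 \text{ in gold}}_{\text{received at the official window}} \quad\Rightarrow\quad \text{profit} \approx \$5 \text{ per round trip}
\]

Each such round trip pulls gold out of official reserves and pushes silver in, which is precisely the slide toward a silver standard that advocates of gold feared.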

The urgency of committing the dollar to gold became more pressing in 1871, as the international gold standard gained momentum. Having militarily defeated France, German Chancellor Otto von Bismarck took his newly unified state onto gold. This quickly forced France onto the gold standard, as well. In rapid order, four other European nations converted to gold, as did Japan, India, Russia, and Argentina throughout the 1890s, followed in the next decade by Austria-Hungary, Mexico, Brazil, and Thailand. As the gold standard became genuinely international, Britain, its originator, established itself as the center of world finance. Indeed, governments and investors the world over had more confidence in the pound’s convertibility into gold than they had regarding any other currency—a confidence that was supremely practical, given Britain’s leading position in the global economy.

From early on, British capitalism had developed surpluses above its own domestic investment needs, and much of this could be profitably loaned to international borrowers. In addition, it had an empire for the circulation of its currency, and dominions that could be bullied into buttressing sterling when necessary. If London was the center of global finance, this also meant that its gold standard was the de facto foundation of world money—and thus, crucially, that gold was now the basis of war finance. For any government entering the late-nineteenth-century scramble for empire, or even simply building its armed forces for defense against aggressors, gold was essential as payment for weapons, ships, armored vehicles, aircraft, provisions, cotton and linen for uniforms, and iron—increasingly the world’s key industrial material. It is no accident, therefore, that the internationalization of the gold standard occurred in the decades leading to the First World War, when military budgets were soaring amid colonial scrambles. Total military expenditures for Britain, Germany, Russia, Austria-Hungary, France, and Italy more than tripled between 1880 and 1914, as the imperial powers lurched toward war. Observing these trends, while his own country engaged in a pell-mell military buildup, one astute Japanese analyst wrote: “Bismarck said there are only two things: iron and blood. I, on the other hand, believe that there are only iron and gold.”95

This substitution of gold for blood goes, of course, to the very heart of the argument I have been developing. As the blood of the modern state, gold was the medium of imperial life. It thus underwrote the sorcery of death-as-life—necropolitics—practiced by states as war machines.96 Gold, the means of war finance, had become the circuitry of world power. As lifeblood of the military commonwealth, it was the instrument for making real blood flow across the world’s oceans and battlefields.

Cognizant of this, US President Grover Cleveland warned Congress in 1893 that a monetary system orbiting around silver would mean the United States could “no longer claim a place among nations of the first class.”97 Congressman Josiah Patterson of Tennessee expressed the same idea, in the parlance of everyday ruling-class bigotry: to move to a silver standard, he pronounced, would position the United States “not with the enlightened nations of Christendom, but side by side with China, with the republic of Mexico, with the republics of Central and South America, and every other semicivilized country on the globe.”98 Adoption of the gold standard became the measure of all things wholesome—civilization, global standing, war, and empire.

To be sure, the gold standard served important domestic functions. Among other things, it limited just how expansionary central banks could be with the money supply, since the requirement to convert currency to gold upon demand constrained profligate tendencies, real or imagined. (For instance, an excessively rapid expansion of currency might provoke a bank run that could exhaust the gold reserves.) But most importantly, with the emergence of an international gold standard between 1873 and 1890, global capitalism had acquired a singular form of world money—and American capitalists were intent that their system should revolve around it. After passage of the Silver Purchase Act of 1890, which threatened to move America off gold, they spearheaded a campaign to secure the supremacy of the yellow metal. As much as they invoked domestic reasons for this, American business leaders were also insistent about its necessity for war finance. America’s extensive global investments, explained Treasury Secretary Lyman Gage, required “a well-supplied war chest with an impregnable credit”—and impregnable credit meant a dollar as good as gold.99

In fact, war once again proved a turning point. In 1898, US capital turned its sights on Spain, a colonial power in terminal decline. Offering little real resistance to American military aggression, Spain handed over its colonies in the Philippines, Guam, and Puerto Rico, while Cuba was made a US protectorate.100 American capitalism was now asserting itself as the rising imperial power of the era. But it still needed to establish itself as a heavyweight in the sphere of world trade, investment, and finance. Less than two years after Spain was vanquished, the Gold Standard Act of 1900 was passed. The dollar was now officially tied to gold and backed by a $150 million gold reserve at the Treasury.101 Expansionist dreams had carried the day. But they were far from exhausted.

While consolidation of the gold standard would be critical to the projection of power abroad, so would the institutional capacities to manage crises and panics. In fact, with the latter in mind, corporate America had been promoting a US central bank since the early 1890s, as the transition to mammoth firms exacerbated tendencies toward overproduction. The growth of output from giant corporations deploying new productivity-boosting technologies pushed prices steadily downward. Between 1875 and 1896, for instance, US prices dropped by an average of 1.7 percent each year.102 Compounding these trends were financial crises like that of 1893, in which four hundred banks collapsed; together, these pressures reinforced the big-business campaign for a central bank with powers of countercyclical crisis management.103 But progress toward central banking was slow, in part because some sections of capital distrusted any state initiative that seemed too readily influenced by popular forces. Then the Panic of 1907 struck, rocking markets and severely damaging the international reputation of America’s financial system.

The crisis broke out in October of that year, with a run on the Knickerbocker Trust Company, which then suspended all cash withdrawals (whether currency or gold). Within days, as panic spread, major banks in New York and Chicago did the same. Overnight, companies could not meet payrolls, and commercial and personal credit dried up. Cash was now at a premium, with some brokers receiving up to a 4 percent surcharge on every dollar they provided.104 As individuals and financial institutions hoarded cash, interest rates on stock exchange call loans—overnight loans secured by shares—leapt to 70 percent. Hoping to alleviate the panic, Treasury Secretary George Cortelyou pumped $25 million into New York banks. He may as well have been fighting wildfire with a water pistol. Only when stock prices had fallen far enough to attract large inflows of gold from European investors, eager to buy up cheap shares, was the downturn arrested. Six months later, Congress authorized the Treasury to create up to $500 million in emergency currency to relieve future panics.105 But progress toward fashioning a central bank remained erratic. And when it was finally accomplished with the Federal Reserve Act of 1913, it was too late to address the trauma that seized global financial markets with the outbreak of world war in August 1914.

Global Slaughter, the Second Thirty Years’ War, and the Triumph of the Dollar

“In Europe they mobilize armies and navies. In America we mobilize bank reserves.”

—Minnesota Senator Knute Nelson, August 1914106

By the time war erupted, the US economy had become the world’s largest, responsible for one-third of global industrial output—nearly as much as that of Britain, France, and Germany combined. Yet, notwithstanding its industrial heft, America was a lightweight in world financial markets. There, Britain reigned supreme. When countries needed to raise money, they sought out British pounds, which were effectively as good as gold. Even the German mark and the French franc received more use as a means of international payment than did the dollar. Certainly, some countries tapped US markets for loans, though rarely were the bonds issued in New York denominated in dollars, given the US currency’s lack of global reach.107 But this mismatch between industrial and financial power would not last. The game-changer, once again, was war—and war finance.

In fact, America’s rise to financial supremacy would take a mere six months of war. And this time, gold would work in its favor.

One after another, the belligerent states of 1914 were forced to suspend gold convertibility, as the demands of war finance exhausted their gold reserves. France, Germany, and Russia did so in August, the first month of the conflict. The reason was basic: gold was the most universally accepted means of paying for the weapons, raw materials, and foodstuffs essential to sustaining armed forces. Warring states would no longer part with the precious metal except for the purchase of war materials. Each and every soldier and sailor had to be trained, equipped, clothed, fed, and transported to bases and fronts. They needed to be supported by armored vehicles, ships, and planes. Once the shooting started, for instance, France required up to two hundred thousand shells daily, nearly seventeen times its prewar output.108 Given the scale of mobilization, which went on for nearly five years, the primary combatants were starved for cash, while America, which would wait three years before entering the conflict, was flush with loanable funds. This is why Minnesota Senator Knute Nelson could declare, “In Europe they mobilize armies and navies. In America we mobilize bank reserves.” And this time America’s leaders, intent on achieving global status, refused to be pushed off the gold standard. That would be the difference maker.

Adhering to gold was far from easy. As Europe moved to war in late July 1914, its governments feverishly stockpiled bullion. To do so, they began selling their US financial assets, like stocks and bonds, and cashing in the dollars they received for US gold, which was immediately shipped across the ocean (or sometimes to Canada, in the case of Britain). During the final week of July, $25 million in gold was drained from America’s reserves in this way. This seemed bad enough. But the drain had actually started earlier, with $9 million in gold exports in May, rising to $44 million a month later. Altogether, $83 million worth of US gold had been siphoned to Europe, even before the shooting started. Worse, in 1914 European investors owned $4 billion worth of US railway stocks alone—all of which could be liquidated and turned into gold in due time.109 Of course, the American government could take the dollar off gold for the duration of the war, but this posed two problems. First, America was not at war (and would stay out of the fighting for three more years), and war was the only accepted rationale for suspending dollar–gold convertibility. Second, notwithstanding its entry into the conflict, Britain was committed to maintaining the gold standard in order to preserve the unique role of the pound as a world currency, and that of London as the center of world finance. For the United States to close the gold window would be an admission that it could not compete with sterling as a world money, setting back New York’s emergence as a site of world finance. All of this compelled America’s leaders to cling to gold. Expressing a growing bourgeois consensus, Benjamin Strong, first president of the Federal Reserve Bank of New York, vowed to make the dollar “an international currency” by building “confidence in the redeemability of dollars in gold at all times.”110 Among other things, this also implied a growing global role for American banks, and to this end the Federal Reserve Act authorized US national banks with at least $1 million in capital to set up foreign branches, a privilege of which the largest quickly availed themselves.111 Yet, the global expansion of American banks rested on the stability of the dollar. And as gold exited the United States in record amounts, that stability seemed precarious.

Treasury Secretary William McAdoo had a plan, however. He would shut down Wall Street, rather than see the United States go off gold. This would close off the main avenue by which European investors could sell their American stock holdings for gold. If European investors had nowhere to sell their US financial assets, exports of US gold could be considerably constrained. So, he arranged for the New York Stock Exchange to close its doors on July 31, 1914. And closed it would remain for the next four months.112 At the same time, McAdoo knew that in shutting down Wall Street, he was simply buying time. Eventually, the market would have to be reopened. The key was to pump out exports, particularly of grain and cotton, to European states desperate to feed and clothe their troops. This would stop the gold outflow, since the belligerents would need the precious metal to pay for American goods. By autumn, US exports were booming, the value of the dollar was ticking upward, and European states were hurrying to New York to borrow money. Gold was no longer flowing out, while foreign money was pouring in. By the time the United States entered the global conflagration in 1917, foreign governments had raised more than $2.5 billion in wartime funds by selling dollar-denominated securities in New York.113 The dollar had now displaced sterling as the global currency of choice. By the early 1920s, interest rates were lower in New York than in London, which encouraged governments and investors to continue to do business there. At long last, American capitalism had mastered money’s second modal form—central bank currency backed by government debt and redeemable in gold.


By most measures, America had become the premier capitalist economy in the world by 1915. But its formal ascendance to imperial hegemon, the power that dominates global politics and economics, would not transpire for another thirty years. This reminds us that the organization of hegemonic power involves complex social processes. Between immediate events and the longue durée of centuries, argued French historian Fernand Braudel, there occur cycles or conjunctures that work themselves out over a decade or a quarter century.114 The thirty-year period consisting of the two World Wars (1914–18 and 1939–45), in whose interregnum occurred a global depression, might be seen as just such a conjuncture. Only out of the turmoil of trench warfare, blood-soaked fields, revolutions and civil wars, fascism, and the atom bomb would there emerge a new global hierarchy based on American power—and the US dollar.115

It is necessary to say a few things here about the total wars of the first half of the twentieth century. These were conflicts characterized by the industrialization of killing, the use of machine technology—tanks, modern battleships, bomber aircraft, chemical weapons, and the atom bomb—in order to murder and destroy. To rework Walter Benjamin’s expression, we have here the eruption into world history of mechanically reproducible death.116 More than sixteen million people probably perished in the first global conflagration, vindicating Rosa Luxemburg’s verdict that “shamed, dishonored, wading in blood and dripping with filth—thus stands bourgeois society.”117 Reflecting a decade later on “the nights of annihilation of the last war,” Benjamin contemplated the ominous mobilization of means of destruction that reshaped the earth: “Human multitudes, gases, electrical forces were hurled into the open country, high frequency currents coursed through the landscape, new constellations rose in the sky, aerial space and oceans thundered with propellers, and everywhere sacrificial shafts were dug in Mother Earth.”118

The shock registered by the brutalities of total war unsettled and traumatized its combatants as well as its victims. Even a dedicated imperialist like Winston Churchill registered its dreadful features in a speech given after World War I. “The Great War differed from all ancient wars in the power of the combatants and their fearful agencies of destruction,” he declared, continuing, “Every effort was made to starve whole nations into submission without regard to age or sex…. Bombs from the air were cast down indiscriminately. Poison gas in many forms stifled or seared the soldiers. Liquid fire was projected upon their bodies. Men fell from the air in flames, or were smothered, often slowly, in the dark recesses of the sea…. Europe and large parts of Africa and Asia became one huge battlefield.”119

Twenty years later, the science and technology of destruction had become frighteningly more powerful as the Second World War exterminated at least sixty million people, fully 3 percent of the world population. And now a second feature of total war—the systematic killing (genocide) and displacement of civilian populations—came fully into its own, producing the industrialized death camp (Auschwitz), and giving rise to the new human categories of refugees and the stateless.120

Total war was “the largest enterprise hitherto known to man,” wrote Marxist historian Eric Hobsbawm.121 It required colossal efforts to coordinate food production and distribution, armaments manufacture, rationing of goods, worldwide transportation and communication, financial management, mobilization of “manpower,” patrol of the oceans and the skies, movement of troops, and orchestrated destruction and killing on a planetary scale. In the interregnum between the two concentrated periods of global slaughter, capitalism underwent its most devastating slump, the Great Depression of the 1930s, reminding us why millions came to reject a system that seemed capable of nothing but murder and economic hardship.

The story of the Depression has been told many times. The drama is usually associated with the great stock market crash in New York, beginning in October 1929.122 In fact, “business was in trouble long before the crash.”123 Profits and output had started to turn down prior to the meltdown on Wall Street. But the latter thoroughly destroyed personal fortunes and traumatized the financial system. Four thousand banks collapsed in the United States. By 1933, US gross national product had dropped by nearly a third, and one person in every four in the labor force was unemployed. The American economy was particularly hard hit by the slump, but it was far from isolated. Seventeen other countries saw bank panics and collapses, and one after another, each went off the gold standard. The United States ended gold convertibility for domestic purposes with the Gold Reserve Act of 1934, though it retained the yellow metal for settlement of international transactions.124 Trade protectionism and currency devaluations induced a massive contraction in world trade, which underwent a decline of more than a third by 1932. Plummeting prices, particularly for raw materials and agricultural commodities, hammered nations like Canada, Argentina, and Australia. After more than five years of persistent decline, it seemed the bottom had been touched. Output and investment started to grow, unemployment eased a bit. Then, in 1937, the bottom fell out once more.125 But this new slump would not drag on like that of 1929–35. What ended it was war—total war to be precise, and human carnage on a planetary scale.

That capitalism should thrive on destruction comes as no surprise. The frantic production of tanks, fighter aircraft, jeeps, battleships, millions of rifles, and millions of uniforms—and the iron, steel, electrical goods, and more that all these required—catapulted economies out of depression. As economist John Kenneth Galbraith observed, “The Great Depression of the thirties never came to an end. It merely disappeared in the great mobilization of the forties.”126 Yet, it was not just troops who were mobilized; so was capital. And as with the troops, it was the state that did the mobilizing. World capitalism revived, in other words, on the back of a state-directed war economy.

Following the decade of depression, the American gross national product grew by 65 percent from 1939 to 1944. Industrial production expanded more rapidly than during any other period in US history. Driving all of this was military demand: production of war goods catapulted from 2 percent to 40 percent of all output over the same five-year period.127 By 1943, the US government was initiating 90 percent of all investment, and the arms sector was generating half of all output.128 Factories were humming, and farms were working flat out to meet military demand.

But as much as war revived world capitalism, its effects were highly uneven. And there was no greater beneficiary of this unevenness than Uncle Sam. To begin with, the United States did not enter the war until late 1941, more than two years after its future allies. Yet, while it remained outside the conflict, it nonetheless profited from it—producing ever more goods for its European trading partners, and providing them loans. Even when it did enter the brutal fray, the United States did not experience the massive physical and industrial decimation that accompanied the human slaughter elsewhere.

Japan, for instance, lost one-quarter of its factories and a third of its industrial equipment as a result of the carnage. Italy’s steel industry saw a quarter of its capacity destroyed, while Germany suffered damage to 17 percent of its stock of fixed capital. Meanwhile, one-tenth of France’s industrial stock was wiped out.129 While economies like these were ravaged by war, the United States boomed. At the commencement of hostilities in 1939, the US economy was about one-half the combined size of those of Europe, Japan, and the Soviet Union. Six years later, when war at last ended, it was larger than all of them combined. By that point, the United States accounted for half of global industrial production and held almost three-quarters of the world’s supplies of gold. Lifted ever higher on oceans of blood, the United States now dominated the international economy. And the dollar was its unrivaled world money.

The Great Boom, Vietnam’s Unraveling, and Global Fiat Money

With the bombs no longer falling, the losers’ countries occupied, and the victors’ lust for conquest (temporarily) satisfied, world capitalism entered a sustained quarter-century expansion (1948–73). The contrast with the previous period—three decades of war and world depression—could scarcely have been starker. To be sure, there were cyclical fluctuations across the Great Boom, with periodic domestic recessions. But overall, global capitalism experienced a period of high growth rates, robust profits and investment, low unemployment, expanding world trade, and generally rising living standards. This “golden age” for Western economies had its brutal undersides, of course: imperial interventions, racist violence, the Cold War arms race, an intensified regulation of gender and sexuality. Yet, Western economies just kept humming.

World manufacturing output quadrupled between the early 1950s and the early 1970s, while world trade in manufactures leapt ten times higher. The stock of plant, machinery, and equipment per worker more than doubled, driving labor productivity forward at record pace. Food production rose more quickly than did world population, with grain yields doubling over a thirty-year period. Rates of economic growth hit record highs in every major capitalist nation, with the exceptions of Great Britain and the United States.130 As labor militancy was contained and pro-business policies consolidated, the era of high profits created the space for governments to expand social provision (“the welfare state”) without compromising capital accumulation. A myth of social harmony triumphed, according to which capital, labor, and the unemployed could prosper together in a new managed capitalism. John Kenneth Galbraith, not known for recycling conventional wisdom, suggested, in his 1958 book The Affluent Society, that since the problem of wealth-creation had been solved, the social challenge was now to manage affluence and share the wealth.131

Although often described as the “Keynesian era,” the boom had considerably less to do with the influence of British economist John Maynard Keynes than is often supposed. What fueled expansion was not deft fiscal and monetary management so much as the maintenance of high rates of profitability. Through a complex algebra—whose variables included the impacts of wartime destruction of capital, the stimulating effects of permanent arms budgets, the mobilization of new technologies, the containment of labor insurgency, and the reconstitution of reserve armies of labor132—capitalism seemed to have found a new growth trajectory that would forever eliminate crises and depressions.

Floating atop the boom, coordinating its flows, was the almighty dollar. Having climbed into the top tier during the First World War, the greenback became incontestably supreme after the second global conflagration. Dollar hegemony was enshrined when Western leaders gathered at Bretton Woods, New Hampshire, in the summer of 1944 to hammer out a new monetary regime.133 Contrary to a present-day myth, this conference was anything but a harmonious gathering of responsible leaders seeking the common good. Bretton Woods was another exercise in great-power politics, with the United States refusing proposals from Keynes, the British representative, in order to impose dollar supremacy. The global economy would henceforth revolve around the dollar as the world’s primary reserve currency. All major currencies would be pegged at fixed rates to the dollar, which in turn was to be tied to gold at a ratio of thirty-five dollars to an ounce. Like the British monetary system, this was a gold exchange standard. The dollar was to be the basis of monetary payments, albeit with the understanding that central banks could exchange dollars for gold. To stabilize the global structure, exchange rates of other currencies would be held fairly steady against the dollar by the use of capital controls, which would prevent so-called “hot” flows of money. The major powers also agreed to finance a new institution, the International Monetary Fund, empowered to provide loans should a member country encounter a currency crisis or severe balance-of-payments difficulties.
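The arithmetic of the pegs can be sketched with a deliberately round figure of 4 units of a pegged currency to the dollar (an illustrative assumption, not the parity of any particular member state). Because only the dollar was directly convertible at thirty-five dollars per ounce, every pegged currency acquired an implicit gold value by transitivity:

\[
4 \ \text{units per dollar} \times \$35 \ \text{per ounce of gold} = 140 \ \text{units per ounce of gold (implicit parity)}
\]

Gold thus remained the nominal anchor of the whole structure, but member states held and settled in dollars rather than in bullion itself.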

Ironically, in light of what would later transpire, the main challenge to this arrangement in the early years of the postwar boom was a dollar shortage in Europe. By mid-1947, the United States was running a twenty-billion-dollar export surplus—goods that could only be paid for by America’s trade partners with dollars or gold, each of which was in short supply due to their massive concentration in US hands. Without an outflow of dollars to Europe and Japan to support international trade and capital flows, the dollar-based system risked seizing up. The conundrum was at first evaded when the American government announced the Marshall Plan and its program of aid to Europe. But it was the decision to rearm Europe, and to provide military aid to this end, that really turned the tide. And this decision crystallized into strategic policy when war commenced in 1950 on the Korean Peninsula. This policy was accelerated in the face of the election of left-leaning governments in France and Italy, worker revolts in countries like Japan, and fears that the end of war would induce another wave of working-class and socialist insurgency. As arms spending soared, and the American state shifted to “Military Keynesianism” at home and continuous military assistance to allies abroad, an outflow of dollars lubricated the gears of the world economy.134 But US corporations were soon to augment these outflows by increasing their offshore investments in order to build multinational operations and thereby profit from sales in rebounding European economies. The “solution” initially provided by militarism was now amplified by corporate globalization. The result was a consolidation of the new regime of world money, as a persistent US balance-of-payments deficit (driven by overseas military spending and aid), alongside foreign investment, generated the dollar liquidity essential to the smooth working of the world economy. Yet, so long as America had a surplus on its merchandise trade, foreign-held dollars were welcomed as payment for US-produced goods.

But this postwar balance would not last. It was no longer the case, after all, that the United States mobilized bank reserves while others mobilized armies. As the imperial hegemon, America was paying for troops and tanks, bombs and aircraft, and for ever more expensive ballistic missiles. Simultaneously, it was exporting dollars to maintain foreign bases, and to feed, clothe, house, and pay wages to soldiers overseas—all costs that rose with the escalation of the war in Vietnam. In 1964, America’s debts to foreign central banks exceeded its gold reserves for the first time in the postwar period, due to a balance-of-payments deficit that stemmed from Vietnam-driven foreign spending. Official government figures estimated that overseas military spending by the United States equaled $3.7 billion during the following three years (1965–67). Yet these figures excluded a $2.1 billion outflow due to military grant agreements—consisting of military goods and services paid for by the American government and delivered as gifts to client states—during these years.135 These “gifts” would grow by more than a billion dollars by 1971, when the dollar crisis erupted. By that point, US military outlays abroad exceeded its foreign sales of armaments by almost $3 billion, with some analysts estimating that the foreign costs of empire amounted to $8 billion per year.136 The dollar outflow—and the weakening position of the greenback that accompanied it—was thus directly connected to the costs of war and empire. To be sure, the US state believed in the necessity of such imperial expenditures. But, the empire itself appeared to be in decline at the time, nowhere more so than in Vietnam, where it was losing the war despite ongoing military escalation. As the rest of the world watched the gyrations of the dollar and the humbling of America’s war machine at the hands of Vietnam’s national liberation movement, the US empire underwent a credibility crisis.137 A vice president of Citibank insightfully informed a Congressional committee in 1970, “Hegemony, not liquidity, is what the dollar problem is all about.”138 While that may not have been the whole story, it was a crucial part of it.

Underlying this challenge to hegemony was American capitalism’s relative economic decline. While the US government spent extravagantly on empire, its domestic economy performed much more sluggishly than did the very economies that had seemed so weak and broken in 1945. America’s listless progress can be seen in growth rates of the “capital stock” in manufacturing—the accumulation of plants, machines, and equipment. During the critical fifteen-year period from 1955 to 1970, US capital stock in industry grew by 57 percent. In Europe, it grew twice as rapidly (116 percent), while in Japan it rose an incredible nine times faster (500 percent). By 1972, American business had spent several years at the bottom of the rankings of industrial nations for reinvestment of corporate earnings.139 The United States was thus among the least dynamic of the major capitalist economies. As a result, it now began to lose markets, especially for manufactured goods, to up-and-coming rivals. To make matters immeasurably worse, the so-called Great Boom was winding down, ordaining the downfall of the Bretton Woods system.

The decline of the Great Boom was driven by the overaccumulation of capital and a downward movement in the general rate of profit that commenced around 1968.140 But as with all economic slowdowns, the effects were borne unevenly. This time, having lost ground to its main competitors, American capitalism endured some of the harshest blows, among them a full-fledged dollar crisis.

As early as 1960, while Europe and Japan recovered, the reserves of dollars held overseas exceeded America’s supply of gold. Had all those dollars been converted into the precious metal, the United States would have been abruptly forced off the gold exchange standard. Instead, US governments enforced a series of stopgap measures, some of them at odds with the spirit of Bretton Woods. As early as 1961, Americans were prohibited from holding gold outside the country. Then, European governments were strong-armed into contributing to a Gold Pool, and pressed to refrain from converting dollars to gold. Shortly after, American residents were barred from collecting gold coins. But none of these ad hoc moves could offset the structural trend: the United States was importing ever more relative to what it exported, and shipping out billions to finance overseas military spending, while covering the shortfall with a currency in which its trade partners were drowning.

It was one thing for the United States to have a deficit in its overall balance of payments; but this was now joined by a deficit in merchandise trade. By 1969, the American economy was running a four-billion-dollar shortfall on trade in consumer goods. And two years later, its overall trade balance turned negative.141 The world was now awash with billions of dollars that many of its holders simply did not need. So long as they bowed to US pressure and abstained from demanding gold, America’s trading partners were accepting useless IOUs in exchange for the real goods and services they supplied to the United States. And by exchanging inconvertible IOUs for goods and services, as one economic analyst noted, the US was effectively appropriating “from the dollar receiving countries an equivalent amount of their surplus value for its own use.”142 To make matters worse, the era of escalating war in Vietnam was also one of steadily rising price inflation. The real buying power of the dollar—the actual material goods it could purchase—declined persistently. To hold dollars was thus to hang on to a depreciating asset, one that could buy less and less. French President Georges Pompidou openly grumbled about having to use as a global measure of value “a national currency that constantly loses value.”143 The dollar was no longer as good as gold—but it could still be redeemed for the precious metal by foreign central banks. Notwithstanding US threats, foreign governments and investors rushed to do just that. By 1968, more than 40 percent of the US gold reserves had left the country.144 On a single day in March of that year, some four hundred million dollars was presented for conversion.145

As the precious metal exited its vaults over the course of 1968, the American government ceased providing gold to private dollar holders at the official price of thirty-five dollars per ounce. But the real bombshell exploded in 1971. On August 15 of that year, President Richard Nixon slammed shut the gold window. No longer would the US Treasury provide bullion for dollars, even to foreign central banks. The foundations of Bretton Woods had collapsed. Initially, this move sent shock waves through government and financial circles. Told that Nixon was contemplating going off the gold exchange standard, one Treasury official “leaned forward, put his face in his hands, and whispered, ‘My God!’”146 Yet Treasury Secretary John Connally was prepared to play hardball. When warned by Federal Reserve Chairman Arthur Burns that other nations would retaliate against the United States’ unilateral move, Connally replied: “Let ’em. What can they do?”147 Very little, as it turned out.

The initial effect of the US abdication of gold convertibility was to volatilize currency values. The Bretton Woods system of fixed exchange rates between currencies crumpled as the United States bullied its trade partners—particularly Germany and Japan—into revaluing the deutsche mark and the yen. (The former appreciated by 13.6 percent; the latter by nearly 17 percent.) This was simultaneously a devaluation of the dollar—one that was meant to assist American exports and the balance of trade, as US-based goods would now be cheaper in Germany and Japan (since every dollar for which a US good was sold would require fewer deutsche marks and yen). Over the first two years after Nixon closed the gold window, the dollar declined by 25 percent relative to the yen, deutsche mark, British pound, and French franc.148 For a time, there was an effort to restabilize currencies at a new series of fixed rates of exchange. But with ever-larger global flows of funds, particularly of dollars held outside the US (known as Eurodollars) that eluded control by central banks, every attempt to fix exchange rates broke down, as speculators bet against currencies they perceived as overvalued, while gambling on those they expected to rise.
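The export effect of those revaluations can be illustrated with a deliberately round starting rate of 4 deutsche marks to the dollar (an assumption for the arithmetic only, not the actual pre-1971 parity). A revaluation of the mark by 13.6 percent means that each dollar now costs a German buyer roughly 12 percent fewer marks:

\[
\text{before: } \$100 \times 4.00 \ \text{DM per dollar} = \text{DM } 400
\qquad
\text{after: } \$100 \times \frac{4.00}{1.136} \ \text{DM per dollar} \approx \text{DM } 352
\]

The same hundred-dollar American export thus becomes about 12 percent cheaper in marks, which is exactly the competitive relief the devaluation was meant to deliver.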

In earlier days, governments might have used capital controls to block the movement of speculative finance in and out of currency markets. But those days were long gone. The stunning growth of the Eurodollar market (to which we return shortly) meant that billions moved in a monetary space outside the sphere of state regulation. The buying and selling of currencies now became a world growth industry. In 1973, the daily turnover in foreign exchange (forex) markets amounted to $15 billion; by 2007 it had grown more than two hundred times, to $3.2 trillion per day. Meanwhile, the daily turnover in nontraditional forex markets also exploded, reaching $4.2 trillion in 2007.149 With financial movements this massive, there was no way that governments could set the value of currencies. Having abandoned a world money anchored to gold, capitalism found itself in a new era of floating exchange rates—which changed every day under the influence of massive flows of finance across global markets. So unstable were monetary values that one world leader referred to the new monetary regime as “a floating non-system.”150 And this nonsystem, as I have argued elsewhere, fostered a proliferation of new financial instruments—particularly financial derivatives—that were meant to hedge the risks associated with volatile money, but that actually had the effect of heightening instability by producing even more complex means of speculation. The global financial crisis of 2008–2009 was in part a dramatic expression of these very volatilities.151 Lurking behind this monetary instability was the reality of a dollar that had become a global fiat money.

It is important to emphasize here that fiat refers only to the ability of the state to enforce the acceptance of its currency—not to impose its value. The latter is determined in the long run by the relative productivity of the capital within the nation-state in question. And this is approximated by the rate of exchange a national currency maintains with others (and with the world of commodities).

The inherent paradox of a currency that serves as world money is that it is both a credit money produced by a single nation-state and a global means for measuring value and making international payments. The system works most effectively when the imperial hegemon spends extensively outside its borders—thereby furnishing the system with liquidity—while also maintaining decided advantages in the production of vital goods and services, so that holders of the world money require it for ongoing transactions. This was the story of the dollar for the quarter century from 1945 to 1970. But once the United States lost its decisive strength in world manufacturing, dollars became little more than inconvertible IOUs accumulating in the hands of its major trading partners. Henceforth, the outflow of dollars in order to police world capitalism became an economic liability. This is what produced the run on US gold reserves—better, after all, to get gold for your dollars if you don’t need American-produced cars or electronics. Yet the run on US gold could only be a stopgap. By the summer of 1971, gold was exiting the United States at a rate of thirty-five billion dollars per year. Sooner rather than later, America’s gold reserves would dry up. Nixon’s decision to close the gold window simply recognized the inevitable. Foreign governments objected vociferously to America’s unilateral abdication of dollar convertibility, to which Nixon’s Treasury secretary retorted, “It’s our currency but it’s your problem.”152 Indeed it was.

Reconstituting Imperial Money after Gold

The overwhelming assumption during the 1970s was that the US dollar was in unambiguous decline. On the Left, this was generally joined to the view that the downfall of the dollar was a manifestation of the breakdown of the American empire.153

Nearly half a century later, it is clear that such predictions were skewed. Yes, the dollar-gold standard collapsed, and the global economy endured a decade of profound turmoil. But something quite novel—and with which theory has generally not caught up—emerged in its place: imperial fiat money. Consider that, more than four decades after it was taken off gold,

• the dollar is used in 85 percent of all foreign exchange transactions;

• it is the medium in which the world’s central banks hold nearly two-thirds of their currency reserves;

• it is the currency in which over half of the world’s exports are priced;

• and it is the money in which roughly two-thirds of international bank loans are denominated.

In short, notwithstanding its detachment from gold and America’s enduring and massive payments deficits, a reconstituted dollar dominates the world economy. And with that domination we find ourselves in the age of the third modal form of money—one in which a national fiat money, untethered to any sort of physical commodity, operates as world money. I refer to this as global fiat money.154

We can readily understand this changed modality by returning to the earlier discussion of Bank of England notes. These, as we observed, were essentially private monetary instruments based on government debt. What prevented them from being pure fiat money was the bank’s obligation to redeem them for gold or silver at a guaranteed rate. During wars, however, the Bank of England often suspended convertibility, just as Lincoln’s government did during the Civil War. Throughout those periods, money was backed by nothing more than state debt and the government’s injunction that these currencies had to be accepted as legal tender (the latter making them fiat monies).

What distinguishes the period after 1973, when Nixon announced that there would be no return to a gold-based dollar, is that inconvertible fiat money was made permanent. The emergence of this third modal form has sometimes been described as the de-commodification of money, which is both formally correct and substantively misleading. Officially, the dollar is no longer exchangeable with gold at a fixed rate guaranteed by the US state. In this sense, we have moved beyond commodity money. At the same time, however, the dollar can be readily exchanged for gold or any other commodity at prevailing market rates. In this sense, the dollar is entirely convertible with gold (as it is with any other commodity). What is described as the “de-commodification” of money thus actually refers to its de-anchoring from a fixed rate of exchange with gold. Rather than money becoming inconvertible, then, it is more accurate to say that it has become destabilized in a regime of floating exchange rates.

Nevertheless, the shift to a floating regime for world money was a system-altering transformation, and it continues to pose significant theoretical challenges for critical political economy, not least because the complete de-linking of world money from a fixed commodity is without precedent. For instance, one mainstream economist, writing in the late 1970s, argued that “commodity money is the only type of money that, at present time, can be said to have passed the test of history in market economies … it is only since 1973 that the absence of any link to the commodity world is claimed to be a normal feature of the monetary system. It will take several more decades before we can tell whether the Western world has finally embarked, as so often proclaimed, on a new era of non-commodity money.”155 Those several decades have now passed, and non-commodity money rules the world. All critical and radical theory is obliged to come to terms with this metamorphosis. Yet, it must be said that many efforts to make sense of this mutation in world money have been distinctly incapacitating.

Perhaps the most significant of these is a decidedly unhelpful position often associated with deconstruction and poststructuralism. Recall two texts with which I have taken issue in previous chapters. One intoned that “the economic base or mode of production” of modern societies is “a financial abyss.” Another urged that finance “is a particularly interpretive and textual practice.”156 In the same vein, a recent work declared that “money is in itself nothing but representation.”157 Whether acknowledged or not, all of these formulations are responses to the severing of fixed convertibility of the dollar for gold. Ostensibly freed from any referent (such as gold), money is proclaimed to be purely self-referential. Rather than epitomizing something else, national currency is said to reflect nothing more than itself, leading us into a monetary house of mirrors.

These analyses are products of an influential line of commentary on the post-1973 era that has diluted critical thinking. I am referring, in particular, to “economic” analysis that has gone under the banner of postmodern criticism, emblematized by Jacques Derrida’s Given Time, whose central arguments about money and (post)modern economics have been widely reproduced. Derrida informs us in that text that we now live “in the age of value as monetary sign.” He does not mean this in the commonplace sense that all money has a symbolic form—as coin, paper currency, or digital inscription. Derrida intends us to understand that money represents nothing but itself, that there is nothing that sustains it or to which it refers. It follows that capital (accumulated money and its effects) is infinitely self-reproducing because it issues “from a simulacrum, from a copy of a copy (phantasma).”158 There is no “real” here; there is no sphere of labor and capital to which money is tethered. We are simply inside the simulacrum, trapped by an image capable of endlessly proliferating. Derrida’s argument dovetails with the pronouncements of postmodern theorist Jean Baudrillard who, in addition to declaring that the First Gulf War did not happen, has also proclaimed that we live in a “virtual economy emancipated from real economics,” one that has arrived at “the end of labor. The end of production. The end of political economy.” In this virtual economy without labor or production, “money is now the only genuine artificial satellite,” endlessly orbiting around itself.159 While it might be tempting to pass over ludicrous pronouncements of this sort, their pervasive and disabling influence demands critical engagement. After all, if these were accurate depictions of our world, then it would make very little sense to be concerned about sweatshop labor, precarious work, or bonded migrant laborers, as these would comprise outmoded references to a now-obsolete world—one that vanished with the end of labor, production, and political economy.160

Equally unhelpful is the Lacan-inspired claim that, as much as we are all exploited by it, money in fact “does not exist.”161 Weirdly, this argument dovetails with right-wing libertarian attacks on modern “funny money,” so-called because it lacks a commodity basis. Referencing the pledge on a British five-pound note that “the Bank of England promises to pay the bearer on demand the sum of five pounds,” one pundit asks, “Five pounds of what?” The answer, post–gold standard, is entirely circular. Since the Bank of England is not offering a fixed exchange with gold or any other commodity, all it can really do is replace one five-pound note with another.162 Of course, this commentator ignores the simple fact that the note in question is legally validated by the state as exchangeable for five pounds worth of any market good, including gold. It has, in other words, infinite convertibility in the world of commodities, which is no small thing for a hungry person.

A central problem with such analyses is that they tend to describe the post-1973 monetary order in terms of a dematerialization of money. These tendencies have been strengthened by the digitization of money, which resulted, by the early 2000s, in actual cash constituting merely 3 percent of the money stock.163 Yet, intangible money still has material force. To think otherwise is to adopt a philosophically naive view in which something must be tangible and/or visible to be material—a position that would make gravity an immaterial force.164 Running through such interpretations, as I have argued elsewhere, is a mode of thought that uncritically reproduces the fetish character of commodities and money by taking the immediate form of appearance of a phenomenon—in this case non-commodity money—as the basis for knowledge claims. Yet, in an alienated social world, things appear in forms that systematically mystify their grounding in human practical activity. The responsibility of critical theory is to de-fetishize by deciphering the practical foundations of mystifying social phenomena, including money, in alienated forms of social praxis—not least because this opens the possibility that they might be transcended by way of dis-alienating praxis.165

In invisibilizing the world of production and reproduction, the positions I have been canvassing abdicate the work of critique and eliminate any coherent basis for a transformative politics.166 By occluding the world of labor, analyses of this type obscure the immense expansion of wage labor on a world scale over the neoliberal period (since the late 1970s). The global paid working class effectively doubled over about a quarter century, from roughly 1.5 billion to 3 billion wage workers.167 At the heart of this great doubling was the drawing of hundreds of millions of newly dispossessed laborers in Asia, especially China, into capitalist production.168 Indeed, the new forms of money and finance we see today are integrally related to this expansion of capital accumulation and the working class on a world scale. To note this, however, is not to invite radical theory to pretend that nothing has changed, and that all is well with inherited concepts. These are major challenges—ones we cannot shirk as we build comprehensive accounts of the changed and changing shape of late capitalism.

At the same time, it is equally unhelpful to treat such changes as simple negations. Particularly disabling in this respect are ostensibly Marxist accounts that conclude that in the absence of commodity money, the classical laws of motion of capitalism are no longer in play. In this perspective, across the twentieth century—and particularly since 1973—capitalism has been in “managed decline” as a consequence of the detachment of money from gold.169 This, however, is to confuse transformation with decline. Rather than decipher the historical renovations of late capitalism, such positions confuse mutations with exits from capitalism’s laws of motion. In place of explicating system-wide transformations, which would involve working through the dialectics of continuity and discontinuity they exhibit, such positions instead declare that the latest stage does not conform to a frozen image of an earlier one. In so doing, such analyses abdicate the dialectical injunction to grasp phenomena in their becoming.170 As an organic system, capitalism is in incessant motion; it repeatedly sheds one historical form in its movement into another. And since the starting point for dialectical theory is that which has actually happened, rather than formal models, our challenge is to illuminate as best we can the actual social-historical process.171

World Finance and Imperial Debt

Let us instead explore the ways in which global fiat money expresses a specific matrix of international capitalist relations. Put in the analytical framework we have been developing, this requires a delineation of the multidimensional configuration of classes, states, empire, war making, and world finance characteristic of the post–Vietnam War era. If the dollar has been reconstituted as a new modal form of world money, this speaks to a dynamic nexus of capitalist power in which an imperial fiat money can operate as the regulator of a historically specific arrangement of the capitalist mode of production on a world scale. To be sure, all such arrangements are inherently contradictory; they manifest fractures and fault lines (some of which I discuss below). At the same time, imperial fiat money has been sufficiently functional as to regulate the global reproduction of capital. With this in mind, let us return to the historical situation after 1973.

Writing about the post-1973 advent of “pure paper money,” radical geographer David Harvey observes that when money supply is “liberated from any physical production constraints … the power of the state then becomes much more relevant, because political and legal backing must replace the backing provided by the money commodity.”172 There is a valuable insight here. After all, what backs the US dollar today is not gold, but government debt (US Treasury bills and bonds, in particular) and government fiat, the state’s declaration that a given currency is legal tender. Of course, this is not entirely new. Nevertheless, even the British pound, which also rested atop state liabilities, was simultaneously governed by the “metallic barrier” described by Marx, particularly in its functions as world money. Convertibility of pounds to gold limited the production and circulation of private credit money, from notes created by country banks to corporate IOUs. During a crisis in credit markets, holders of IOUs and banknotes would rush to exchange them for gold—the highest and most universal form of money. As much as a Bank of England note was a claim to a share of future state taxes that would repay its borrowing (and thus “anchored” by fictitious or futural capital), such notes were also officially linked to past labor in the form of bars of gold or metallic coins. This dual temporality of money tied notes based on future wealth to existing products of past labor (gold bars and coins). Within the hierarchy of money, gold loomed above pound notes, as became clear every time a panic induced a stampede for the precious metal. Bank of England notes thus possessed a hybrid structure characteristic of money’s second modal form, linked as they were to future returns on loans to the state, which could generally be converted into gold as product of past labor.

What is distinctive of money in its third modal form is that, released from its formal ties to gold, it obeys neither of its former masters: not the credit system nor bullion. With the constraints imposed by the latter now removed, we move in a world of full-fledged credit money. “In contemporary economies, then,” economist Duncan Foley points out, “a fictitious capital, the liability of the state, rather than a produced commodity, functions as the measure of value.”173

When I receive a US dollar, I accept a note based exclusively on future payments derived from government revenues. Of course, since the state has made this note legal tender, within the United States I am obliged, like everyone else, to take it as a means of payment. This gives the dollar a universal validity within the national economy. But why should US government debt be a functional basis of world money? Why, in other words, should foreign central banks and international investors find an exclusively debt-backed dollar an acceptable and legitimate means of regulation and coordination of world payments and finance?

One tempting answer is that the United States has largely forced the dollar on the rest of the world. And this cannot be entirely disregarded. After all, the United States reaps an enormous advantage in being able to provide “decommodified” money for the world’s goods and services. Over $500 billion in US currency circulates outside the United States, for which foreigners have had to provide an equivalent in goods and services. In addition to well over half of all dollars circulating outside the United States—and thus representing cost-free imports to the US economy (that is, IOUs that are never cashed in)—as of the late 1990s, three-quarters of each year’s new dollars stayed abroad.174 Equally significant, by early 2018 foreign governments had accumulated $6.25 trillion in US Treasury securities. Dollar-receiving countries, in other words, unable to convert their dollar holdings into a higher form of money—like gold—have often used them to purchase American government debt, as well as other US assets. In essence, they have loaned back to the US state the very dollars Americans have spent to import goods and services, or to purchase foreign assets. The effect of this arrangement is to exempt the United States from a balance-of-payments constraint. Rather than having to boost exports or cut imports (and domestic consumption) in the event of sustained deficits in its payments with the rest of the world, the United States can simply issue IOUs that are redeemable primarily for its own government debt.175 This is indeed an “exorbitant privilege,” as a former French finance minister complained.176 It amounts to allowing the United States—and it alone—to issue as means of global payment IOUs that in principle never have to be repaid. As a former top official at the Bank of France complained in the 1960s, “If I had an agreement with my tailor that whatever money I pay him he returns to me the very same day as a loan, I would have no objection at all to ordering more suits from him.”177
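To make the circuit just described concrete, the following is a minimal sketch in Python, with invented figures, of how a US trade deficit can be recycled into US government debt. The numbers, the function name recycle, and the two stylized balance sheets are illustrative assumptions rather than data from this chapter; the sketch simply traces the flow of claims described above.

```python
# A stylized sketch (hypothetical figures) of the dollar-recycling circuit:
# the US pays for imports with dollars, and the exporter's central bank
# recycles those dollars into US Treasury securities rather than converting
# them into any "higher" form of money.

def recycle(imports_bn):
    us = {"goods_received": 0.0, "treasuries_owed_abroad": 0.0}
    exporter = {"goods_shipped": 0.0, "dollars_held": 0.0, "treasuries_held": 0.0}

    # 1. The US imports goods and pays with dollar IOUs.
    us["goods_received"] += imports_bn
    exporter["goods_shipped"] += imports_bn
    exporter["dollars_held"] += imports_bn

    # 2. The exporter's central bank parks those dollars in US government debt,
    #    lending the money back to the US state.
    exporter["treasuries_held"] += exporter["dollars_held"]
    us["treasuries_owed_abroad"] += exporter["dollars_held"]
    exporter["dollars_held"] = 0.0

    return us, exporter

us, exporter = recycle(100)  # a hypothetical $100 billion deficit
print(us)        # the US keeps the goods and owes debt denominated in its own currency
print(exporter)  # the exporter ends up holding claims on the US state, not goods
```

On this stylized accounting, the deficit never forces the United States to curtail imports; it simply swells the stock of Treasuries held abroad, which is precisely the "exorbitant privilege" at issue.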

Because of this arrangement, since the mid-1980s the United States has been the world’s largest capital importer, taking in anywhere from two-thirds to three-quarters of global investment funds. For the United States as a national unit, dollar inflows are really refluxes of dollar outflows meant to cover its payment deficits with the world. IOUs thus finance America’s deficits with the rest of the world while subsequently returning to sustain its public debt (Treasury bills and bonds). Yet, we should pause before simply characterizing the United States today as an imperialist debtor country that reproduces itself by sucking up enormous inflows of foreign capital. An overriding focus on the “exorbitant privileges” that come from issuing dollars invites the too-hasty conclusion that we live in the age of US “super imperialism.”178

This sort of analysis seems to gain credence from the fact that among those states recycling dollars into US Treasury securities are many that rely heavily on American military aid and protection—Japan, South Korea, and several Gulf states. Significant as these arrangements are, we lose critical depth if we imagine a part, the American state, as greater (i.e., more determinant) than the whole, the global capitalist system. Notwithstanding the power of the US state, it remains ultimately determined by the world system of which it is an integral part.179 In the capitalist world economy that emerged from the downfall of Bretton Woods, the enduring dominance of the dollar, and of US financial markets, is itself a function not just of the needs of the American state, but of the general needs of global capital, however much these involve dynamic contradictions. With respect to the operation of the law of value on a world scale—the ultimate regulator of all capitalist production and profitability—the system requires benchmarks that facilitate measures of value, risk, and profitability, and that enforce monetary discipline over labor. The dollar and the US Treasury securities that underpin it have proved crucial in these regards.

And all of this has much to do with two key points we have made: first, that the decline of the Bretton Woods system had its roots in the internationalization of production and investment driven by US multinational corporations (more on this below); and, second, that these processes, combined with heightened capital mobility, led to a geographic expansion of capitalist accumulation and the emergence of a much more global working class. Rather than capitalism having witnessed a global slowdown of investment and accumulation—a misapprehension based on a focus on domestic data in the historic capitalist core—there has instead been a geographic restructuring of investment and accumulation, with the highest rates taking place in so-called newly industrializing countries. Moreover, once we introduce foreign direct investment flows and the earnings they generate into the picture, the United States no longer appears simply as an imperialist debtor country.180 For, as Brett Fieberger and Cédric Durand have shown, the income the US economy earns from its overseas investments more than outweighs the payments it makes on other deficits with the rest of the world—and the same is true for Germany, France, and Japan.181 It is the flow of surplus value from overseas investment that “rebalances” the finance of the main imperialist powers. In this context, one of the key things the US dollar has done is to provide a highly liquid world money in an age of exceptional globalization of production, investment, and exchange. It is precisely this that has made the dollar significantly “functional” for world capitalism. Nevertheless, its role in providing a world measure of value and a highly liquid means of payment and exchange also involves contradictory dynamics—indeed, ones that might ultimately upset the very global structure it is meant to support.

World Money in the Age of Floating Currencies and Financial Turbulence

The new role of the dollar was prefigured in the growth of Eurodollar markets across the 1960s. These Euromarkets represented dollarized spaces outside the control and regulation of the US state, or any other. Greenbacks deposited overseas, especially in banks in the City of London, were beyond the authority of the American state and also evaded regulation by foreign central banks. At first, these markets were meant as a place where Russian-bloc governments could park dollars. However, it was the financial needs and operations of American-based multinational enterprises that really fueled Eurodollar growth. These transnational firms, driving a boom in foreign direct investment, regularly raised funds outside the United States, just as they often deposited profits in offshore dollar accounts. Across the 1960s, as industrial capital broke out of the largely national forms it had assumed since the Great Depression, US banks followed them into these offshore markets. Indicative of this globalizing trend was the growth of international trade, which expanded 40 percent more rapidly than did world output. Ever-larger shares of commodities were thus being produced for global markets, rather than for local ones. But even more striking, as corporations outgrew national markets, was the growth of foreign direct investment, which expanded twice as fast as gross domestic product.182 But as much as dollar outflows for foreign investment by US multinationals sparked the new era of capital mobility, it was America’s Vietnam War–fueled deficits that caused the offshore dollar market to explode. As US deficits soared, requiring the American economy to ship dollars overseas, total deposits in the Eurodollar market expanded more than fifty times over between 1960 and 1970, rocketing from roughly $1 billion to $57 billion. Following the breakdown of the dollar–gold exchange system, a further explosion pushed Eurodollar holdings over $1 trillion by 1983.183

When the Bretton Woods system of dollar–gold convertibility and fixed exchange rates collapsed after 1971, the global monetary regime essentially adapted itself to the deregulated norms of the Eurodollar market. In so doing, it caught up with spatio-structural transformations in productive capital—in particular, the rise of a global manufacturing system and the foreign direct investment that sustained it. This process of adaptation was piecemeal, haphazard, and largely unplanned. Indeed, it seems clear that the managers of the US state did not grasp how the reconfiguration of money and finance might rebuild US financial power.184 But the legal and institutional adjustments introduced by the US state in the post–Bretton Woods era obeyed a clear logic: the deregulation and internationalization of finance in conformity with the new world of global manufacturing and production.185 Viewed as an overarching historical process, what the US state did in the decade after 1973 was to redesign American finance in keeping with the already-multinational configuration of industrial capital. To be sure, new forms of imperial hegemony were constructed in the process. But, and this is a point downplayed by those who incline to a theory of American super-imperialism, US rulers were able to do this in large measure because the arc of transformation was conducive to global capital in general. As economic geographer Neil Smith observed, “However powerful US capital and the American state are … globalization is not the same as Americanization. Ruling classes around the world are heavily invested in globalization for their own interests.”186

If the rise of globalized manufacturing was the dominant economic story of the 1960s, financial globalization was the saga of the 1970s. Fueling this transformation was an explosion in foreign exchange trading. As currencies became unmoored from gold and fixed rates of exchange, new levels of risk and uncertainty entered financial calculations. For instance, currency devaluations could cause profits made by a multinational enterprise in one country to effectively disappear when repatriated to banks in the home country. For this reason, multinational corporations, banks among them, sought to protect their earnings by shifting out of depreciating currencies and into appreciating ones (or by trying to correctly anticipate such movements). Once currencies began to float, monetary volatility produced an eruption in foreign exchange (forex) trading—which jumped ten times over (from $15 billion to $150 billion per day) between 1973 and 1985. By this time, however, currency trading had become a profitable business as an end in itself, and all kinds of institutions developed financial models designed to profit from the smallest of currency fluctuations. For most of these, forex trading became a mode of gambling, the placing of currency bets in the roaring markets of casino capitalism. As table 5.1 indicates, daily forex trading zoomed to $1.2 trillion in 1995, adding another $2 trillion a day by 2007 (when it hit $3.2 trillion), only to surpass $5 trillion a day by 2016. Throughout this rise, speculative activity utterly eclipsed the buying and selling of currencies for the actual needs of business.187 Whereas 80 percent of forex transactions were tied to regular business activities in 1975, and merely 20 percent to speculation, by the early 1990s speculative trading had come to account for 97 percent of forex transactions—a level it has sustained since then.188

Table 5.1: Daily Turnover in Foreign Exchange Markets (selected years, 1973–2016)

Year      Amount
1973      $15 billion
1980      $80 billion
1985      $150 billion
1995      $1.2 trillion
2004      $1.9 trillion
2007      $3.2 trillion
2016      $5.1 trillion

Source: Bank for International Settlements, Triennial Central Bank Survey, multiple years

While all this was transpiring, one government after another recognized the writing on the wall and signed on to the so-called “financial revolution” of the 1980s and 1990s. After all, if governments endeavored to regulate the national space, firms could simply enter “stateless” zones like the Eurodollar market, where borrowing was often cheaper and less constrained by regulations. In order to retain financial business, nation-states bowed to the new rules of the game, mimicking these stateless spaces by deregulating finance and eliminating capital controls. As financial deregulation gathered pace, gross capital outflows from the fourteen largest industrial economies jumped from an average of about $65 billion a year in the late 1970s to $460 billion per year by 1989.189 Capital now flowed more readily across borders than at any time since before the Great Depression of the 1930s, as the global investment activities of multinational firms were complemented by globalizing finance. Increasingly, the US domestic economy was a chief beneficiary. During the 1980s, after the American economy had been restructured and stabilized, capital flows into the United States, especially for purchases of US bonds and equities, grew twenty times over in real terms.190 Where capital outflows driven by US multinationals had propelled economic globalization in the 1950s and 1960s, capital inflows now joined continuing outflows as part of the complex financial architecture of the dollar-based regime of global fiat money.

It wasn’t just that capital was flowing more freely around the globe, however. It was also that financial assets were increasing as a share of world gross domestic product. As the McKinsey Global Institute has shown, the global stock of financial assets relative to world GDP began mounting steadily in the 1990s, from a level equal to 15 percent of world gross domestic product in 1995 to 103 percent of global GDP by 2007.191 This is one aspect of the phenomenon often described as financialization. Yet, we need to remind ourselves that much of this international movement of finance was for productive investment in the global South, not simply for speculative activity.192

Nonetheless, there has been an increase in speculative movements of capital, as well. Too often, however, pundits lose sight of the degree to which this was fueled by tremendous growth in the world money supply once the dollar was delinked from gold. Figure 5.1 illustrates the growth of the US money stock (as measured by MZM, or “money of zero maturity”) since the late 1950s. Particularly evident is the formidable expansion of the money stock after 1980, when the American economy was stabilized following the “dollar crises” of 1968–73 and the inflationary decade of the 1970s.

Figure 5.1: MZM Money Stock

Freed from its “metallic barrier,” as figure 5.1 demonstrates, the money supply could now grow without the constraining effects of a fixed exchange rate with gold. Monetary growth on this scale also fostered new dynamics of price formation: no longer tied to the market value of gold, prices became much more responsive to the quantity of money and the velocity with which it circulated.193 This generates a tendency toward permanent price inflation under money’s third modal form, at least during periods of expansion—and this is one reason why central banks moved toward “inflation targeting,” in hopes of achieving predictable yet modest price increases.194
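The relationship between prices, the quantity of money, and its velocity can be stated with the textbook equation of exchange. To be clear, this identity is a standard shorthand offered here only for illustration, not a formula the chapter itself invokes:

\[
M \cdot V = P \cdot T
\]

Here M is the money stock, V the velocity with which it circulates, P the price level, and T the volume of transactions. With no metallic anchor on M, sustained growth of M and V in excess of T tends to register as a rising P, which is the inflationary bias just described and the pressure that inflation targeting tries to keep within modest bounds.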

Not only do governments and big business wish to avoid excessive volatility, which can render pricing and investment highly unpredictable. They also seek to avoid explosive strike waves of the sort that occurred between 1965 and 1976 in response to mounting rates of inflation, which were eroding real wages. Under the gold standard, labor could be disciplined when, in the face of currency decline and a rush to gold, the central bank raised interest rates in order to draw gold back to its coffers. Rising interest rates in a moment of financial crisis tended to induce deep slumps that, in pushing hundreds of thousands out of work, acted to reduce wages and worker militancy. In the post–Bretton Woods monetary order, however, central bank policy has substituted for the disciplinary effects of the gold standard. Through control over the interest rate charged to commercial banks (the discount rate), central bankers assume the enforcement role previously performed by the treasury in its obligation to convert notes into gold. This took one of its most dramatic forms in the punishing deployment of record-high interest rates by the Federal Reserve under Paul Volcker in the late 1970s and early 1980s. Draconian levels of interest drove down the annual inflation rate in the United States from 14 to 3 percent, and restored global investor confidence in the dollar—all at the price of a bruising world recession.195

More recently, the German central bank, the Bundesbank, has attempted to impose a similar sort of financial discipline on the Eurozone, notwithstanding the anti-stimulative effects of such policy in the midst of a global slump.196 The irony is that since the onset of the 2007 downturn, central banks have struggled to avoid deflation of the sort that has ailed Japan since the 1990s, where it has produced chronic stagnation. Generating inflation, rather than curbing it, became the order of the day in a period of slump. The sort of obsession with financial discipline displayed by German governments is not, in these circumstances, geared toward fighting inflation. Instead, it is about the exercise of class discipline over labor—the use of austerity as a means to compress wages in the interests of profitability.197

Notwithstanding the commitment of central banks to modest inflation and economic discipline, however, the international monetary stock has in fact exploded since the break with gold (as shown in figure 5.1), and this has been one source of continued financial turbulence. At issue here is the inability of central banks to control the production of money. This reality has exposed the monetarist fallacy about controlling the money supply. For the fact is that private banks today create about 95 percent of all money, with central banks issuing merely 5 percent.198 Contrary to most economic theory, the bulk of money originates as bank credit, in loans to borrowers (for investments, mortgages, credit card payments, student loans, etc.). When my local bank agrees to provide me with a line of credit or mortgage of one hundred thousand dollars, it does not actually go and find already-existing dollars somewhere to lend me. Instead, it creates the money ex nihilo. It simply registers that amount digitally in an account bearing my name. I can now proceed to spend that money by promising to make payments from future earnings. This means that my bank has pre-validated my debt as full-fledged money.199 In other words, it has increased the money supply by one hundred thousand dollars—an amount I have merely promised to repay—with the stroke of a few keys on a keyboard. Of course, my mortgage or line of credit is small potatoes compared to the billions that are being created every day to satisfy the borrowing needs of corporations (financial and nonfinancial), investors, and governments.
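The loan-creates-deposit mechanism just described can be rendered as a minimal double-entry sketch. This is a hypothetical illustration in Python, not a model of any actual bank: the figures, the dictionary layout, and the function extend_credit are assumptions made only to show both sides of the balance sheet expanding at once.

```python
# A minimal double-entry sketch (hypothetical figures) of bank credit creation:
# granting a loan does not transfer pre-existing dollars; it expands the bank's
# assets (the loan) and its liabilities (the borrower's deposit) simultaneously.

bank = {
    "assets": {"loans": 0.0, "reserves": 50_000.0},
    "liabilities": {"deposits": 0.0},
}

def extend_credit(amount):
    """Create a deposit ex nihilo by registering a matching loan."""
    bank["assets"]["loans"] += amount          # the bank's claim on my future earnings
    bank["liabilities"]["deposits"] += amount  # new, immediately spendable money

extend_credit(100_000)  # the mortgage or line of credit in the text's example
print(bank)  # broad money has grown by $100,000; reserves have not moved
```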

All of this private credit-money creation typically proceeds smoothly until a downturn in the economy or a financial shock induces a credit crisis. At that moment, it becomes clear that much of the bank credit money that had been created (such as my line of credit), alongside much of the stockpile of private nonbank credit instruments (corporate bonds, commercial paper, etc.) is as worthless as the IOUs passed by a penniless person. It then turns out that pre-validation—the treatment of a loan as full-fledged money—has involved a considerable amount of pseudo-validation, and much of this debt capital is exposed as largely fictitious, as mortgage-backed securities and other collateralized debt obligations (CDOs) were in 2007–2009. At such moments, there ensues a stampede to “safety,” represented by the world’s most valued currencies—and gold. And where financial institutions have been buying and selling toxic fictitious capitals to one another, the effects of a credit crunch can be calamitous, as they were in 2007–2009, when a global financial meltdown rocked the international banking system.200

When such crises occur outside the imperial centers of the system, local economies are frequently forced to endure the devastations of “structural adjustment,” in order to borrow to pay off creditors and thus exit the crisis. Mexico, Brazil, Argentina, Thailand, Malaysia, Iceland, Latvia, Greece, and many others have had such programs inflicted on them by international financial institutions. But when panic shakes the core of the system, the rules abruptly change. Rather than cutting its debt, as most states are compelled to do, the US imperial state massively expanded its borrowing throughout the crisis that commenced in 2008. Operating as the world central bank, the US Federal Reserve intervened to monetize trillions of dollars of pre-validated credit money created by private banks, alongside trillions of dollars in toxic assets produced by banks, “shadow banks,” and other institutions. Figure 5.2 captures the dramatic growth of assets owned by the US Federal Reserve banks, which more than quintupled between 2008 and 2014 as the central bank monetized holdings throughout the global credit system.201

Figure 5.2: Total Assets of US Federal Reserve Banks

To understand the economic processes involved in the monetization of private credit monies like this, let us compare the current modal form of money with its second form. To be sure, money was also produced as private credit notes under the second modal form. The Bank of England, after all, was a private bank at its inception. And outside London, country banks were also empowered to print banknotes. But that second modal form of money was hybrid, as Marx argued. Not only did it obey the dictates of the credit system; it had another master: gold. A credit crunch thus induced a scramble to convert all other forms of money to gold, which was even more valued than Bank of England notes. What had been pre-validated (private bank credit money) had now to be post-validated; it had to be subjected to the test of conversion into the most universal form of money, gold. At such moments, fictitious capitals had to find their equivalent in past labor—a test that much credit money failed. Given the relative scarcity of gold (the product of past labor), the financial system invariably underwent a forced deflation, involving price declines, bankruptcies, wage cuts, and so on, as a range of fictitious capitals (paper claims to future wealth) were repudiated and suffered massive devaluation. The money supply would then contract, as it did during the bank failures of the 1930s, and would be brought into a more appropriate proportion to the existing reserves of gold. Of course, this often took place by way of a devastating slump.

Removed from the gold constraint, however, central banks today can easily reflate in the face of a credit crunch. This means pushing down interest rates in order to encourage borrowing (which, as we have seen, is a form of monetary expansion) and directly expanding the money supply. (Often this is accomplished by giving commercial banks “high-powered money”—central bank cash—in exchange for their toxic paper assets.) Put slightly differently, central banks can act to preserve the values pre-validated by private banks (as credit money) since they are under no requirement to convert them to gold. Removed from the metallic barrier, a central bank can offer high-powered money for debased credit money virtually without limit—which is why the money supply just keeps expanding. And this means, of course, that financialization continues in the face of financial crisis. In fact, the panic of 2008–2009 proved just how malleable central bank policy could be in that regard, as trillions upon trillions were pumped into a financial system in the throes of a global meltdown, and “quantitative easing” was used as if there were no limit to the production of money.202 By November 2011, the US Federal Reserve alone had already pumped over $13 trillion into the rescue of the global banking system—even apart from its formal quantitative easing programs.203 And China’s massive bailout and stimulus program, along with more modest interventions by the Eurozone, added many trillions more to the global rescue package.
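A companion sketch, again with invented figures, may help with the reflation mechanism described above: the central bank swaps newly created high-powered money for a commercial bank's devalued paper, so that credit money pre-validated by private banks survives rather than being destroyed in a forced deflation. The entities and amounts below are assumptions for illustration only.

```python
# Hypothetical balance sheets: a central bank monetizes a commercial bank's
# impaired assets by issuing fresh reserves ("high-powered money") in exchange.

central_bank = {"assets_purchased": 0.0, "reserves_issued": 0.0}
commercial_bank = {"toxic_paper": 500.0, "reserves": 50.0, "deposits": 500.0}

def monetize(amount):
    """Swap central-bank reserves for the commercial bank's devalued paper."""
    commercial_bank["toxic_paper"] -= amount
    commercial_bank["reserves"] += amount        # replaced with safe central-bank money
    central_bank["assets_purchased"] += amount
    central_bank["reserves_issued"] += amount    # no gold reserve caps this issuance

monetize(400)  # say, $400 billion of mortgage-backed securities taken onto the books
print(central_bank, commercial_bank)
# The commercial bank's deposits (credit money) survive intact; the central bank's
# balance sheet has simply grown, as in figure 5.2.
```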

Fault Lines and Histories as yet Unwritten

As we have seen, the Great Depression of 1929–39 was the paradigmatic expression of crisis under the gold standard, where slumps were deep and enduring, involving massive destruction of capital—in the form of widespread business bankruptcies—and mass unemployment. Global capitalism drifted off the gold standard during that crisis, never to fully return (the gold–dollar exchange standard devised in 1944 was already a half step in the direction of a post-gold monetary order). The world financial meltdown that broke out in 2007, triggering a global slump, is symptomatic of the new pattern of capitalist crisis under the third modal form of money.

Unconstrained by a metallic barrier, the third modal form enables central banks to monetize private credit monies, in ways discussed above, in order to avert a 1930s-style slump. However, this tends to block the purge of inefficient capitals on the massive scale necessary to open vistas for new waves of investment and accumulation.204 As a result, the decade of economic recovery after 2009 was the slowest on record, due, in particular, to sluggish capital investment.205 While profitability was restored in the United States after 2009 thanks to the squeezing of workers—by way of layoffs, precarious forms of employment, speedups, and wage compression—followed by corporate tax cuts, no sustained investment boom has been generated, nor is one likely in the absence of a deep slump that destroys the least efficient capitals.

A classic capitalist downturn induces widespread industrial restructuring, with plant closures, bankruptcies, and job loss on a tremendous scale. But when central banks drive down interest rates and offer high-powered money for devalued assets, then slumps are arrested—and severe restructuring forestalled. The “cleansing” function of a depression—Joseph Schumpeter’s “creative destruction”206—is thereby stalled. But without the disappearance of the weakest capitals, the stage is not cleared for new rounds of deep and sustained investment. The result is a sluggish economy awash in cheap money: relative economic stagnation combined with repeated bubbles in financial investments like stocks, collateralized mortgage obligations, emerging market debt, “junk” bonds, and so on. Late capitalism under the third modal form of money thus experiences both ongoing financialization and a recurring pattern of speculative fevers and bursting bubbles. Indeed, that pattern is now well established: the so-called Third World debt crisis of 1982, the Mexican meltdown of 1994–95, the Asian crisis of 1997, Russia’s credit crunch and the collapse of Long Term Capital Management in 1998, the bursting of the dot-com bubble in 2000–2001, and the global financial slump of 2007–2009, to name the most significant.

All of this returns us to the role of the dollar as world money. In the midst of these shocks, as financial institutions tumble and billions worth of fictitious capital evaporates, there is a rush to safe assets. One of these is gold. But more than anything else, it is the world’s most liquid dollar-based assets—US Treasury securities—that serve as capital’s safe haven. For this reason, says economist François Chesnais, “US treasury securities come the nearest to being what Marx called ‘hard cash’, i.e., they are the closest thing today to what gold was in the past.”207

But US government securities serve as more than a shelter from the storm. They also function as a critical measure of value and profitability. Having the deepest and most liquid market in the world, along with low credit risk (the US state has never defaulted on its debts and it can always produce more dollars), US Treasury securities provide a benchmark against which all other risks can be measured. This is not to say that the interest rate on Treasuries determines profit rates.208 Far from it. But once those rates are structurally determined, the degree of assessed risk a financial investment carries can be expressed in the premium that must be paid above the interest rate on Treasury assets. So long as US public debt instruments play these essential roles—as safe haven and as benchmark measure of value, risk, and profitability—then a dollar-based international monetary system will have a rudimentary functionality for global capital, notwithstanding the system’s inner tensions and contradictions. To be sure, other currencies could operate to fulfill these roles, and the euro, in particular, has demonstrated some traction in this regard, albeit on a more limited (and largely regional) scale. But the Eurozone’s financial markets are not at all as deep and liquid as those of the United States, and the European Central Bank is a long way from playing the part of a world central bank; that has been unique to the US Federal Reserve.
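The benchmark role described here can be illustrated with the standard decomposition used in financial markets (a textbook pricing convention, not a formula drawn from this chapter):

\[
r_{\text{asset}} = r_{\text{Treasury}} + \rho
\]

Here r_Treasury is the interest rate on US Treasury securities and ρ is the premium demanded for the asset's assessed risk relative to US government debt. On this convention, a corporate bond yielding 7 percent when Treasuries yield 3 percent is being charged a risk premium of 4 percentage points; the Treasury rate anchors the measurement without determining the underlying rates of profit.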

Yet this should not mislead us into assuming that the dollar-based world monetary order is solid and impervious to change. On the contrary, this regime is characterized by two axes of conflict that might yet undo dollar hegemony in the third modal form. The first of these is inter-capitalist tensions that provoke rival blocs to search for alternatives to the dollar as world money. The second, and ultimately the most crucial, is the set of class tensions running through the financialized regime of late capitalist dollar hegemony. While these contradictions are inscribed within the very relations of the global fiat money regime, their resolution is not. Their working out will be determined only in the course of real social-historical contestations. With that in mind, let us take each of these axes of conflict in turn.

The “exorbitant privilege” that accrues to the US state as producer of global fiat money frustrated other major capitalist states, even before Treasury Secretary John Connally declared, in 1971, “It’s our currency, but it’s your problem.” The formation of the euro as a global currency was in large measure about escaping that problem. By creating another transnational currency—the euro is now the second-largest reserve currency in the world—and one that dominates trade in Europe, the Eurozone reduced the volume of formally inconvertible American IOUs it is forced to accept in the course of international trade and finance.209 Notwithstanding all the economic and institutional turmoil of the Eurozone since 2009, the social logic of the euro project is to reduce the bounds of dollar hegemony.

More recently, China has started to move down a similar track, albeit one “with Chinese characteristics.” In 2016, the yuan was recognized as a world currency by the International Monetary Fund, which incorporated it into the basket of currencies that make up IMF “special drawing rights.”210 Two years later, the Chinese government relaxed restrictions on banking and finance, making it much easier for foreign banks and nonfinancial corporations to buy and sell yuan and invest in China’s banking sector. At roughly the same time, China’s leaders launched the Shanghai oil futures market, where all prices are denominated in yuan.211 All of these moves are designed to position the yuan as a more genuinely global currency.212 And while the dollar will not be dethroned in short order, it is significant that the yuan bloc, as measured by analysts at the International Monetary Fund, is now the world’s second-largest currency zone.213

There can be little doubt that the global financial crisis of 2007–2009 gave a fillip in this direction. In the throes of the panic, Chinese holdings of US financial assets began to melt down. China’s leaders then directly intervened with US officials to ensure that mortgage lenders Fannie Mae and Freddie Mac would be backstopped by the American state. This was the context in which Gao Xiqing—manager of China’s enormous sovereign wealth fund (with over $200 billion in foreign assets), alumnus of Duke University Law School, and a former Wall Street lawyer—issued a powerful rebuke to the United States:

This generation of Americans is so used to your supremacy…. It hurts to think, Okay, now we have to be on equal footing to other people…. The simple truth today is that your economy is built on the global economy. And it’s built on the support, the gratuitous support, of a lot of countries. So why don’t you … be nice to the countries that lend you money. Talk to the Chinese! Talk to the Middle Easterners! And pull your troops back!

Rhetorically charged as this statement was, it also implied a deliberate program: the redesign of global financial institutions. Discussing foreign ownership of US assets, Gao continued, “If China has $2 trillion, Japan has almost $2 trillion, and Russia has some, then … get all the relevant people together and think up what people are calling a second Bretton Woods system.”214 Such statements suggest that China’s financial liberalization is part of a campaign to push world capitalism toward a new set of monetary arrangements in which at least three major currencies—the dollar, the euro, and the yuan—would coexist as world monies, effectively sharing the monetary throne in the interests of pluralizing global finance. At the time of writing, China has cut deals with oil companies in Russia, Iran, and Venezuela to accept the yuan in payment for China’s foreign purchases of oil.215 Because China is the world’s largest importer of oil, this move will incrementally reshape global currency markets. So will its $1 trillion One Belt, One Road initiative, which will link it more powerfully with economies across Europe, Africa, and Asia. The program has also been reinforced by the Shanghai Cooperation Organisation, which is integrating the Chinese economy ever more closely with those of Russia, India, and Pakistan.

China’s track toward monetary diversification is fueled by the enduring dilemma posed for capitalism’s most dynamic trading nations by a dollar-based regime of global fiat money: in payment for goods, they are forced to accumulate dollars that have little use other than to purchase American financial assets. What’s more, those very assets are inherently unstable under a world monetary regime rife with endemic financial speculation, asset bubbles, and financial crises. Every panic threatens the very dollar-based assets that US trade partners purchase with America’s IOUs—and such panics are endemic to a world awash with formally inconvertible greenbacks. This is what China discovered in 2008 when the debt it purchased in government-sponsored enterprises like Fannie Mae and Freddie Mac began to implode.

However, it is one thing for China to want a new Bretton Woods–type agreement to redesign the global monetary system, and quite another for it to get it. As discussed above, American representatives used their global dominance to force through that “agreement.” And if there is one lesson of the history we have tracked across this book, it is that wars are often the fearsome engines of “regime change” in the domain of world money. As one analyst rightly notes, “What will resolve the current tension is a power grab by a new stakeholder determined to have its way.”216 It is difficult to foresee a scenario in which the American government would tolerate such a power grab and accept the dethroning of the dollar without extremely bitter conflict, involving at least the threat of war. Certainly, that threat might forestall “a new stakeholder determined to have its way.” But that simply leaves us with a bloody US empire, intent on belligerent assertion of its state power. Whatever the future holds in this regard, it can only be a repetition of the vicious cycle of blood for money—that is, unless the system of money as second nature is overturned.

To be sure, the prospects for such an upheaval often appear remote. Indeed, late capitalism not only threatens humankind with an intensification of violence and war, but also with catastrophic climate change. Understandably, many are filled with despair in the face of such dangers. But, as Marxist philosopher Ernst Bloch observed, this is in part because capitalist society drapes the future in its own incapacity to promote human development.217 “On bourgeois ground,” he wrote, “change is impossible anyway even if it were desired. In fact, bourgeois interest would like to draw every other interest opposed to it into its own failure; so, in order to drain the new life, it makes its own agony apparently fundamental, apparently ontological. The futility of bourgeois existence is extended to be that of the human situation in general, of existence per se.”218 Invoking the principle of hope, Bloch reminds us that critical-revolutionary thought seeks out forces of world transformation that are struggling to form themselves on the current terrain of violence and domination. This gesture is in tune with Marx’s insistence that in capitalist society, radical powers of subversion are always burrowing, often undetected, beneath the surface. In seeking out traces of those forces, he reminds us, “we recognize our old friend, our old mole, who knows so well how to work underground, suddenly to appear: the revolution.”219

As of this writing, intimations of the old mole’s burrowing can be detected on a variety of terrains. One can sense them in the growing restiveness of China’s massive industrial working class.220 They can be discerned in the recent, if short-lived, seizures of city squares—from Occupy Wall Street, to Tahrir Square in Cairo, to Taksim Gezi Park in Istanbul, to the insurgent streets of Sudan, Chile, Ecuador, and Lebanon—in the name of the struggle against austerity and inequality. And they can be seen in community uprisings like those in Ferguson, Missouri, with their insistence that Black Lives Matter, and in the International Women’s Strikes and climate justice rebellions in country after country. The mole’s subterranean movement can be glimpsed in the new global waves of feminist, anti-racist, queer, environmental, and migrant justice struggles. And it can be detected in the revival of teachers’ strikes and the rebirth of socialism in the United States.221 It cannot be known if our old friend, the revolution, will triumph in the coming decades. If not, the socialist Left will have to remake itself for new conditions. It may seem at times that this is a thankless task. But in a world entrapped in the rule of war and money, it is helpful to recall the words of a central character in Tony Kushner’s play A Bright Room Called Day:

Pick any era in history, Agnes.

What is really beautiful about that era?

The way the rich lived?

No.

The way the poor lived?

No.

The dreams of the Left

are always beautiful.

The imagining of a better world

the damnation of the present one

This faith,

this luminescent anger,

these alone

are worthy of being called human.222