When the president does it, that means it’s not illegal.
Richard Nixon (1977)1
The president can’t have a conflict of interest.
Donald Trump (2016)2
Here’s the $64,000 question: What’s legally wrong with the foregoing statements? It’s actually hard to say, just as it’s hard to say whether a sitting president can be indicted, forcibly deposed, required to turn over critical information to Congress, or be held in contempt.
Depending on the context, there may be nothing radically incorrect about the Nixonian/Trumpian statements, but this raises the question of what the context was. For Nixon, it (supposedly) involved national security, a legal nether-region over which presidents have almost unquestioned authority, although it took several court cases, a special prosecutor, and one of the biggest scoops in journalism to figure out where “almost” runs out. For Trump, the context involved demands that the president-elect divest from his ramshackle empire. Again, it takes legal digging to figure out that Trump’s statement was at least partially true, and will take even more digging to figure out where the “partially” runs out. In the face of neo-absolutist assertions of executive power, it feels like the burden of proof is all wrong. The people shouldn’t have to guess on what theory presidents exercise their powers; presidents should tell us. That presidents don’t feel the need to do so, and instead force the public to flip through law books, demonstrates that the presidency deems itself a special case.
Other branches of government routinely state the legal bases for their activity. When Congress enacts laws, it provides a “Constitutional Authority Statement” specifying the Constitutional basis for writing a given law.3 When courts decide cases, they state the precise reasons for their jurisdiction. When agencies issue regulations, they reference their statutory authority. But when presidents act, they formulaically intone that their authority derives from “the Constitution and the laws of the United States.” This is zero-degree thought, the emptiness of “because I said so.” Which part of the Constitution and which laws? Presidents decline to say.
Evasiveness grants presidents an enormous head start, allowing them to do as they please, while everyone else scrambles to assess legality. The burden of proof has landed on the people. Presidents seem free to act unless someone can provide incontrovertible proof to the contrary, and doing so takes time—long enough for presidents to get the nation into messy wars, establish special tribunals, spy on citizens, arrange slush funds, or cut deals with foreign powers. As a political fact, presidents can act unilaterally and forcefully, and when presidents entrench, they can only be moved by the full weight of the law, deployed by the other two branches of government, and usually with considerable public support. In Nixon’s case, the combination probably still would not have sufficed without reinforcements furnished by the fourth estate, itself equipped with information provided by leakers from within the executive—and even then, Nixon was not removed by law, but by his own resignation. The modern presidency reduces America to a nation of legal Lilliputians, who can only bind the giant when acting in unison, and perhaps not even then.
The Founders would have unhesitatingly condemned the modern presidency as a tyranny. A quick comparison between the Declaration of Independence and contemporary presidential powers suffices as proof. The Declaration condemned the “absolute Tyranny” of George III, who, among other sins, had: refused to grant assent to laws, failed to support a robust judiciary, interfered with immigration, used kangaroo courts to try subversives, employed his military and “swarms” of officials to meddle with American life, and invoked sovereign prerogative to abrogate traditional liberties.4 And now? Presidents issue signing statements that deny the force of properly enacted laws, resist judicial orders, issue decrees restricting immigration, convene “military commissions” to try “enemy combatants,” commence war without Constitutionally required declarations, and preside over an enforcement regime that enjoys sovereign immunity when it transgresses against the people. How wide is the distance between George III and Bush II? Wide, but perhaps not wide enough, and getting narrower by the year. True, the people have pushed back against specific Constitutional overreaches. But resistance to an overbearing executive is mostly episodic; the trend among judges, Congress, and the public has been toleration of an increasingly powerful executive.
Over the past century, almost all countries have concentrated power in their national executives, primarily at the expense of national legislatures and regional governments. In countries with fairly new constitutions, where civic norms have not fully entrenched, executive consolidation has been especially sharp. To a substantial degree, Hungary, Poland, the Czech Republic, and Russia have unwound the gains won in 1989–1992. But even countries with older constitutions or more deeply felt republican norms have drifted. In 2017, responding to Catalonia’s burgeoning and largely democratic independence movement, Spain’s (now ex-) prime minister Mariano Rajoy invoked emergency powers, deployed the Guardia Civil (once upon a time, Franco’s pet goon squad), suppressed the region’s independence referendum, and made political arrests.5 To the northeast, Emmanuel Macron embraces a “Jupiterian” presidency—i.e., the president of France likens himself to the father of the gods (or to de Gaulle, which amounts to the same in certain quarters of France), and acts accordingly. In the many nations that use Britain’s Westminster system, which has long granted prime ministers enormous powers, executive ascendancy is harder to spot, but the trend exists all the same. Britain, for example, stands just a slip of paper away from one-man rule: thanks to Tony Blair, the Civil Contingencies Act of 2004 allows a single minister in an emergency to suspend, in theory, virtually any act of Parliament save the emergency act itself and the Human Rights Act (which incorporates the European Convention on Human Rights).6 The only, and completely unironic, constraint on this power is royal assent. Across the world—including America—the human sovereign has returned.
It should not (but apparently does) need to be said that concentrating so much power in individuals carries great risks. A terrible executive may provoke constitutional crisis or popular revolt—indeed, this is why Americans are American, and not British. Which raises the question: Why has this, or any other, nation tolerated the concentration of so many powers in one fallible person? An elective monarchy is a monarchy all the same, and prone to all its evils and weaknesses, as the Holy Roman Empire and Papal States demonstrated.
The response from executive apologists—many of whom have landed on the Supreme Court—is that a powerful presidency is a “necessity.” As evidence, they point to the globalization of executive power, which they claim represents universal recognition that only executives have the agility to confront a world besieged by complexity and crisis. We’ll deal with the historical obscenity of that in a moment, but for now, let’s consider the arguments for executive power in the abstract. First, there’s no a priori reason to think that modern executives are particularly nimble, much less so nimble as to warrant the upending of Constitutional balance. Executive power, after all, is wielded almost entirely through bureaucracies—hardly visions of lightning action, as we saw in Chapter 5. Equally, there’s no reason to assume that a nimble executive (assuming such exists) will be competent, ethical, or judicious. Certainly, the division of powers embedded in many constitutions recognizes the possibility of the contrary. As James Madison noted, “[e]nlightened statesmen will not always be at the helm.”7 Anyway, the “nimble executive” proves too much: if contemporary life truly requires nearly unlimited executive discretion on matters of any real importance, the republican experiment is over.
Sensing this, champions of executive imperium assert that executives are not really destroying republican constitutionalism, so much as stretching its bounds in order to save it. Wunderbar, what could go wrong? Nothing, it seems; on the contrary, everything will go quite right. As proof, executive extremists curate a museum of blockbusters to foster the impression that presidents routinely transcend the Constitution to achieve the remarkable. Law professor John Yoo (best known for providing Bush II with rickety justifications for torture) provides a typical enumeration: Lincoln’s Emancipation Proclamation, Teddy Roosevelt’s trust-busting, and FDR’s New Deal. Let’s ignore the patent insincerity of a hard-right Republican justifying executive power based on anti-monopolism and the New Deal, and hear Yoo out. Here’s what he wrote in 2008: “Our greatest presidents” expanded executive power, “for the most part… [in] a natural evolution which seemed then and now to be necessary.…”8 Now, that’s not quite right. It’s true that the executive actions on Yoo’s list worked out pretty well, but with the exception of the Emancipation Proclamation, there were decent moral arguments for and against presidential action in the other cases. This is why the decisions Yoo lists were deeply controversial in the moment, hardly the “necessary” and “natural evolution” Yoo claims existed “then and now.” Yoo also downplays or ignores that many of the executive actions he lists were confirmed by Congress and/or upheld by the judiciary—admittedly, sometimes after a tussle and after the fact, but confirmed all the same, and thus not unilateralism. Even accepting Yoo’s premise—which basically boils down to the “Good King” hypothesis—one has to take monarchs as they come. After the regime change of January 2017, Yoo found himself possessed of new reservations.9 But that’s the problem with reading history backward, as Yoo does: there’s no way to know in the moment whether a particular executive action is in fact necessary, nor whether it will better the republic and win history’s sanction. All we can know in close-to-real-time is whether a particular action is a suspension of the ordinary legal order or not. As it turns out, many of the extraordinary measures employed by Yoo’s boss, Bush II, were troubling and extralegal—as Yoo himself came close to admitting years later.10 Perhaps future scholars will discover that Bush II’s actions prevented catastrophe, but after almost twenty years, history is underway and unconvinced. Indeed, in 2014, the Senate described the “enhanced interrogation” Yoo facilitated as “brutal” and “not [] effective.”11
Because ardent supporters of executive power dip into history to make their selective case, it’s worth understanding the origins and catastrophes of extreme executive power. The original model derives from the Roman Republic, whose Senate could grant extraordinary powers to a single person, the dictator. English borrowed the Latin word, but not its legal subtleties. Under Roman law, dictatorship could exist only when declared by the Senate (not the dictator), for a limited period (generally six months), to achieve a specific purpose stated in advance (“fend off Carthage”). Though dictators had considerable latitude, laws did not entirely dissolve; within Rome, citizens still retained legal rights to appeal—a nicety Yoo’s former boss tried to jettison during the first phase of his War on Terror.12 Though Rome appointed many dictators in its long history, most relinquished power in due course; they operated, to use an anachronism, constitutionally. The notable—indeed terminal—exception was Julius Caesar. Caesar had no legal claim to the dictatorships he sought, but the Senate, fearful and debased, went along, eventually appointing Caesar dictator for life. Had the Roman Senate resisted, things might have come out differently, at least for a time. Having ceded power once, the Senate found it easier to do so again when Caesar’s adopted heir, Octavian, won authority to rule as princeps civitatis, first citizen of Rome. Triumphalists like Yoo might respond that Rome, whose civic institutions were wobbling in the late Republic, received a great empire and peace when it cashed in its republicanism. Except that isn’t quite true: Egypt, Rome’s richest province, became the emperor’s personal property, and Rome’s subsequent history was punctuated by tyranny, rebellion, civil war, and decline. This might be of purely antiquarian interest, were not the idea of dictatorship revived repeatedly and with decreasing subtlety. After the French Revolution, the old requirements (a genuine threat to the homeland, continuation of normal judicial process, and fixed terms) lapsed. Dictatorship, born in legal chains, had shed them.
Worse, dictatorship became a constitutional feature. The notorious modern example is Article 48 of the Weimar Republic’s constitution. Article 48 permitted the Reichspräsident to suspend certain civil liberties and govern by dictate “if the public safety and order… [were] considerably disturbed or endangered.”13 (These words will recur at home, I assure you.) From the start, Weimar Germany faced so many pressures, pulling its political parties in such radically different directions, that the Reichstag managed normal majority government only sporadically. Friedrich Ebert, Weimar’s first president, repeatedly invoked Article 48 to steady the postwar economy, which worked for a time. Despite the Reichstag’s ongoing instability, a better economy permitted a remarkable civic efflorescence from 1924 to 1929, but Weimar always teetered on the edge of dictatorship. As polarization grew and legislative deadlock persisted, many Germans believed that government-by-decree was the only practicable solution. When the Depression hit, Article 48 again became the prime instrument of governance, constantly employed by President Paul von Hindenburg to sidestep a Reichstag torn between left and right. In 1932, von Hindenburg and Chancellor Franz von Papen also used Article 48 to seize control of the Prussian state government, arguing that it was unable to maintain order. That maneuver was transparently fraudulent; Prussia had experienced some civil disturbances, but its real sin was resisting von Papen and von Hindenburg’s policies.14 German federalism therefore collapsed into mere fiction.
After further dysfunction in the national legislature, von Hindenburg and the conservative establishment decided to do unto Germany as they had done unto Prussia. In their view, they had only three options: continued legislative disorder, bolshevism, or Nazism. The establishment theorized that Hitler would be the most biddable, and tried to co-opt him into forming an extreme, but constitutional, government. Arguably, the establishment got a perverted version of its wish: Hitler gained power constitutionally, and his government was indeed extreme. From there, the legal status of the Nazi catastrophe became less clear; Hitler used intimidation and strategic arrests to cow the legislature into passing the Enabling Act, and what the Act enabled was dictatorship. As Weimar courts lacked robust powers of judicial review and the historical prestige of American courts, challenges to the Enabling Act went nowhere, and Germany descended into tyranny. Bizarrely, the Weimar Constitution continued on paper until 1945, after which Germany became an un-country under Allied rule. In 1949, Germany received a new Basic Law, which radically curtailed presidential power.15
Unlike Weimar or Rome, America does not have a constitution that allows rule by decree. Before the twentieth century, only one president consistently exceeded his lawful powers in a semi-Caesarean way: Abraham Lincoln. Indeed, Lincoln’s presidency was something of a model to some rightist Weimar jurists, though they drew the wrong lesson—as do advocates of American executive power today. For some people, the salvation of the Union begins and ends the argument for the strong executive. But Lincoln’s story is subtler and more reassuring than the ends justifying the means. While Lincoln did operate extra-Constitutionally, he faced a genuine emergency on home soil whose legal contours were not well defined, and did not seem keen on retaining emergency powers beyond the duration of the war. Even during the war, Lincoln usually tried to obtain legislative and judicial consent to his Constitutional freelancing. Most importantly, Lincoln did not pretend that everything he did was unquestionably legal and beyond account.
For example, during the first eighty days of the Civil War, Lincoln called up troops and spent without Congressional appropriation, actions beyond his Article II powers. However, Congress was then out of session and, once legislators returned, Congress “approved, and in all respects legalized and made valid” Lincoln’s military actions “as if they had been issued and done under the previous express authority and direction of the Congress.”16 This was no Caesarean toadying. While Lincoln engaged in some dilatory gamesmanship to ensure that Congress would convene only once all of his supporters could be present in Washington, he never recomposed or physically threatened Congress to get his way, and often sounded Congressional leaders before taking executive action.
Lincoln wanted the other branches, properly emplaced and of their own will, to validate his actions. Even Lincoln’s suspension of habeas corpus, a serious Constitutional breach, was eventually tidied up by Congress, this time in its own name.17 For the most part, Lincoln’s administration also litigated novel Constitutional issues in normal courts, and continued to hold free elections in the North, including Lincoln’s own reelection in 1864, which he expected to lose.18 Democracy continued, and the other branches remained active partners; the president did not feel he could act alone. So while Lincoln trod on the Constitution several times, he never denied that the Constitution should prevail, and the sooner the better. That, as much as the nobility of his cause, is what distinguishes Lincoln from true tyrants—Lincoln crossed the Potomac, not the Rubicon.
Even through the World Wars and Depression, the presidency was not entirely unbound. FDR accomplished most of his work through Congress. Whatever its Constitutional infirmities, the New Deal was substantially passed (or confirmed) by Congress, not imposed by naked fiat. While FDR believed he had legal authority to proceed almost unilaterally in matters of urgency, he acted otherwise, returning time and again to Congress and voters for support. When World War II erupted and Britain begged for aid before America entered the war, FDR did not ship munitions on some unfounded theory of wartime powers, but asked his lawyers to ensure that the Destroyers-for-Bases swap, achieved by executive agreement, fit somewhere in the nooks and crannies of the law (a punctiliousness that greatly irritated Churchill).19 Throughout his presidency, FDR’s public statements carefully noted that his emergency economic directives were issued “under the Constitution and under Congressional Acts” (people could usually tell which ones), explicitly recognizing the Capitol’s role. After Pearl Harbor, FDR traveled to Congress, and specifically obtained its authorization for war. During the conflict, FDR also largely accepted that he would have to litigate matters through normal courts, accepting losses as they came. So in two of the greatest wars America ever fought, one of them existential and the other potentially ruinous, presidents bent the Constitution but never declared themselves beyond it.
All that changed in 1950, when Truman entered the Korean War without Congressional approval, based solely on a UN resolution.20 There is nothing more dangerous than a nuclear state going to war, especially when that state is led by the only person in history to have actually used atomic weapons. War is a decision that requires contemplation and formality. Of course, in the event of an immediate and overwhelming attack on home soil, presidents may be forced to respond—though it proved no challenge to FDR to travel the 1.2 miles from White House to Capitol to receive Constitutional sanction to enter World War II. Apparently, this journey was beyond Truman, though the Korean War quite obviously involved no direct or imminent threats to American soil requiring swift, unilateral action.
If anything, American lives were lost because Truman acted. Some 36,574 American soldiers died in this undeclared war, but that’s post hoc; in 1950, there was no way to be sure that the butcher’s bill would not be vastly higher. North Korea was, after all, not just a client state of China (which fought directly), but also of the newly nuclearized Soviet Union, which covertly participated in the war. And though the Soviets did not then have the means to nuke the continental United States, they would shortly acquire them and could already strike America’s European bases and allies. If there were a time for Congress to insist on its right to debate a war, rather than accede to presidential instinct, 1950 was it. But Congress did not declare war; indeed, it’s never properly declared war since, despite constant military interventions, including the simultaneous occupations of two substantial nations.* Instead, Congress coughed up funds to support whatever interventions presidents had begun on their own initiative, garnished with the odd, Constitutionally ambiguous authorization.
Superficially, Congressional reluctance to declare war seems to mark no great break from prior practice; Congress has only formally declared five general wars, despite dozens of armed interventions since 1789. But qualitatively, 1950 was a watershed. The undeclared wars in Korea, Vietnam, Afghanistan, and Iraq bore no resemblance to the small, short, and near-riskless interventions of the nineteenth century. Does anyone remember American interventions in Sumatra, Samoa, Fiji, or Smyrna? Not really, and that is why these actions, legal or not, posed no great Constitutional risk.
Even as undeclared wars grew larger, their Constitutional moorings loosened almost completely. The War Powers Resolution of 1973, which authorizes military intervention without a proper war declaration, tried to offer a patina of legislative legitimacy.21 But presidents have declined the invitation, stating only that their actions are “consistent” with, but not “pursuant” to, the Act.22 The goal of this word game is to avoid conceding even an inch of presidential authority; to invoke the Act might concede that presidents do not enjoy an unrestricted Article II power to fight whenever, wherever they please. As statements of executive power go, this is about as potent as it gets. Yet, Congress remains complaisant, because norms have changed, just as they did in Caesarean Rome. Even when American interests aren’t in immediate peril and when hostilities risk open-ended and bloody quagmire, presidents believe they can act without much legislative pushback. And since 1950, they’ve been largely correct.*
In addition to usurping Congressional war powers, presidents have intruded substantially into other legislative domains. Article II provides that the president “shall take Care that the Laws be faithfully executed”—i.e., the executive must enforce laws on an as-is basis.23 However, presidents have taken to issuing “signing statements,” offering commentary on whatever law they have just made official. Well into the twentieth century, signing statements contained little more than unobjectionable glosses or mild grousing.24 On the few occasions where nineteenth-century presidents dared go further, suggesting that laws might be unwise or unconstitutional (and by implication, go unexecuted), Congress provided harsh rebukes. The memory of royal prerogative and its abuses remained comparatively fresh. Thus, when Andrew Jackson issued a statement about limiting the scope of a road bill, Congress denounced it as an improper line-item veto; when John Tyler issued a signing statement, legislators condemned it as a “defacement of the public records and archives.”25 As late as 1875, President Grant described signing statements as “unusual,” an apt description of a Constitutionally suspect practice.26
In the twentieth century, presidents began to dabble with more, and more substantial, signing statements. Here again, Truman marked something of a turning point, issuing more than 100 statements, some offering vigorous objections to legislation. Since then, the numbers have fluctuated, but the long-term trend is toward more aggressive use: Reagan issued 250 signing statements; Bush I issued 228; Clinton, 381; and Bush II, 161.27 Given contemporary bill-bundling, a single signing statement can now affect many different issues; for example, Bush II’s 161 statements offered more than 1,000 challenges to distinct provisions of law.28 These statements have also risen from hollow press releases and modest nitpicking to attacks on the very constitutionality of laws. This is bizarre because “signing statements” obviously indicate that presidents have signed laws, thereby granting them legal life, yet statements challenge the very laws to which they are attached. Presidents try to justify the practice on the seemingly unobjectionable grounds that their statements merely clarify that the executive will not enforce laws that breach the Constitution, the document presidents swear to uphold. If that were the real purpose, presidents wouldn’t bother with statements at all—grooms do not issue signing statements with their “I dos.” But upholding, it transpires, is a selective business and therefore a false one; the only part of the Constitution signing statements deign to “uphold” is Article II. Any law that treads on the most expansive reading of presidential powers in that Article risks a statement and potential nonenforcement.29
Bush II’s statement binge prompted concerns about whether the executive was fulfilling its duty to faithfully execute the laws. As a candidate, Obama deplored Bush II’s signing statements as unlawful legislation, saying that “[w]e’re not going to use signing statements as a way of doing an end-run around Congress.”30 For the most part, Obama kept his word, issuing only 41 statements, though sometimes on important, multipart bills. As of mid-2018, Trump was issuing signing statements at roughly the same pace as Obama.31 It’s premature to conclude that signing statements have had their day. If Obama was comparatively circumspect about his powers, perhaps it was because he was cautious by disposition and by vocation, as a former Constitutional law professor. As for Trump, his slow pace almost certainly reflects the 115th Congress’s near-total inability to produce significant legislation rather than some newfound executive restraint.
Even if the pace of signing statements continues to moderate, a problem remains, because the correct frequency for signing statements is functionally zero. As we’ve seen, executive enforcement is discretionary, but there’s an important difference between making enforcement decisions on the merits of a specific issue and declining to enforce laws as a matter of presidential policy. To decline to enforce a law is, in effect, to nullify it. A special American Bar Association panel rightly condemned signing statements as undermining Constitutional principles and the rule of law.32 It’s easy to see why. Signing statements violate the separation of powers by allowing presidents to selectively torpedo laws without resort to the official veto power. They also undermine the jurisprudential goals we saw long ago: How can laws be clear, predictable, and amenable to compliance if presidents enforce them depending on mood, using documents whose legal status is uncertain and subject to revocation at any time? How can the rule of law be in place if presidents place themselves above law’s rules?
In riposte, Congressional research agencies say they have been unable to detect widespread nonenforcement attributable to signing statements. Not only does this fail to remove the jurisprudential concerns; one would also not expect much direct evidence of executive malfeasance. The fog of executive discretion always makes it difficult to identify the real reasons why executive employees do, or don’t do, anything—though it seems unlikely (in most administrations, anyway) that executive branch staff simply ignore their boss’s view. Moreover, some signing statements are specifically designed to make any probing of illicit nonenforcement impossible. For example, in 2002, Congress passed a law requiring the executive to provide information to legislators, including instances when DOJ declined to enforce laws it deemed unconstitutional.33 Here was a perfect opportunity to collect information about presidential discretion. While Bush II signed that law, he promptly affixed a statement to “construe” the law to preserve maximum presidential latitude and instructed agencies to “withhold information… which could impair foreign relations, the national security, the deliberative processes of the Executive, or the performance of the Executive’s Constitutional duties.”34 In other words, when Congress wanted to know what laws presidents declined to enforce as unconstitutional (the basis for virtually all important signing statements), the president refused on the grounds that this would impermissibly intrude on his Constitutional duties, reducing federal oversight law to Constitutional toilet paper. Bush II went even further by describing laws—which he had signed—as merely “purport[ing]” to direct executive agencies (most of which operate by delegation of Congressional power) to do this or that.35 In legalese, “purporting” means “pretending,” i.e., the executive denies that laws are laws except when and as the president chooses.
If a president actively disfavors a law, he can veto it in toto, and that’s all the Constitution permits. When Congress tried to give Clinton pick-and-choose powers via a “line-item veto,” the Court smacked the effort down as violating the Presentment Clause.36 If the Court squarely confronts signing statements, it’s hard to see principled Justices accepting that a president can universally decline to enforce a law (amounting, in substance, to a line-item veto) of his own volition and against Congressional wishes, given that the Court specifically ruled that Clinton could not wield such power when Congress tried to give it to him. The principles are the same, even if the precise legal grounds will differ. As for the objection that presidents must sometimes sign giant must-pass bills while preserving the option to object to hidden and unconstitutional provisions within them, the answer is that the executive can register his objections by participating in normal litigation.37 But presidents don’t want to do that, because the judiciary remains largely independent and the executive has more craven and clientelist resources to manufacture the required fig leaves. These helpers are the counsels’ offices at the White House and DOJ, which exist largely to provide a legal basis for any action the president may want to take. These offices are supposed to advise presidents about where the edge of the Constitutional cliff lies, but often encourage presidents to take a running jump.
Indeed, it was the DOJ’s Office of Legal Counsel (OLC) that birthed the modern, steroidal signing statement. In a 1986 memo penned by a young Samuel Alito (yes, that Alito), OLC argued that if signing statements proffered more substantive glosses, a “chief advantage[]” would be to “increase the power of the Executive to shape the law.” With reason, Alito noted that this was a “novelty” that Congress would be “likely to resent.”38 Therefore, Alito laid out a cynical boil-the-frog strategy for rolling out increasingly powerful statements, reaching deeper into the validity of laws themselves. However, in articulating a grand vision of executive power, Alito laid various landmines for himself. Nine years later, a rather different president was in charge, issuing a torrent of signing statements to thwart Alito’s ideological ally, Newt Gingrich—and therein lies the problem with signing statements: they are unstable. Alito was eventually compensated by his elevation to the Court, where he joined John Roberts. As it happens, Roberts trained at OLC and its sister institution, the office of the White House Counsel, where he disported himself by writing memos on how to dismantle the Voting Rights Act (about which his patron Bush II might say “mission accomplished”). The OLC and the White House Counsel are supposedly checks against hasty overreach, but in reality, they are more gas pedal than brake.
The presidential arsenal includes executive orders, which command the bureaucratic apparatus to effect presidential policy. Obviously, presidents enjoy inherent authority to direct subordinates, and just as obviously, those powers are not unlimited, else America could have saved itself the trouble and kept Lord North and George III. Until the twentieth century, executive orders tended to confine themselves to bureaucratic management, with the Emancipation Proclamation as a momentous exception. From Teddy Roosevelt on, presidents issued more consequential orders, quasi-legislative in character and affecting private persons and rights.
As usual, champions of executive power cite the Emancipation Proclamation as the shining example of presidential fiats used for good, but presidents have been drawing against emancipation’s moral credit without making many deposits. FDR’s Japanese–American internment order of 1942 was a major withdrawal, and Trump’s travel/immigration/border security orders from 2017 onward have made their deductions. Perhaps the largest class of debits is the Constitutional tension created by an overreaching executive. Truman’s actions precipitated the crises of Dollar and Youngstown, and some of FDR’s orders were not Constitutionally pristine (though most enjoyed Congressional sanction). Eventually, the books will have to be balanced, and the ensuing receivership is sure to be unpleasant.
The typical euphemism for presidential centralization is the “unitary executive,” which supposedly refers to the president’s power to control the entire executive branch. In reality, it signals movement toward unitary government by the executive—legislative, executive, and, in the cases of military commissions and bureaucratic hearings, judicial. This degrades the law, whose overarching purpose is to foster social organization, a purpose made harder to achieve given the uncertain legal status of signing statements and the tendency of executive decrees to be made and changed on the fly (Trump undoing Obama’s orders, which undid Bush II’s, which undid Clinton’s, and so on). Nor is unitary government compatible with separated powers, though so long as Congress exercises budgetary authority and the judiciary has the final say over Constitutional interpretation, and the people support them in doing so, there will be no plenipotentiary Dear Leaders. None of this means that it’s impossible for America to experience a series of Constitutional failures that concatenate into political darkness. But these failures, while likely to be triggered by executive action, will always come from outside the Constitution: American presidents can’t point to some Weimar-like Article 48 to claim the right to govern by decree. This has historically been important, because gross violations can’t be tidied up as some part of the executive’s “inherent” powers, however much OLC muses to the contrary.
This is one area in which the Founding Fetish provides helpful clarity. Almost everything about the American Constitution’s history and text forbids executive tyranny, providing a bulwark that other, less successful constitutions have lacked. Nevertheless, the extraordinary powers now wielded by American presidents—to make war without declarations, control functional legislation via agencies and signing statements, establish military “commissions” to try suspected terrorists, and so on—push the nation toward the alarming condition of one-man rule. Unlike Lincoln, modern presidents do not acknowledge that they exceed their Constitutional powers, but implausibly assert that everything they do is flatly Constitutional (without pointing to any specific clause granting them such powers). The contemporary White House is much more powerful than the Nixonian morass that prompted Arthur M. Schlesinger, Jr. to write The Imperial Presidency in 1973. This is especially so as terrorism blurs the traditional boundaries between internal and external affairs. Foreign policy and military action have historically been the areas in which presidents enjoy the greatest deference by other branches, and terrorism refracts elements of both.39 The other branches can push back, but Congress’s habit of facilitating wars after they’ve begun, and the judicial tic of evading “political questions” and deferring to the executive whenever the Solicitor General incants “national security,” open considerable space for executive action. Presidents have filled the vacuum accordingly, abetted by legal pyrite like John Yoo’s torture memo, flowing out of the ever-accommodating OLC/OWHC.
In the moment, presidents often act with popular consent, but contemporaneous opinion polls are to republican government what the Billboard Top 40 is to the Pulitzer Prize for Music. Transient popularity is merely one (historically very troubled) indicator of legitimacy. More importantly, the public lacks all the facts in the moment and may rue swift actions of which they once approved. After all, since modern polling began, no president has been more popular than Bush II was immediately following 9/11, when he enjoyed a 90 percent approval rating.40 As the facts emerged, Bush II’s approval slid, never breaching 50 percent after 2005 and hitting 34 percent when Bush II left office. But by January 2009, the whole apparatus of domestic spying had been erected, American troops were in their eighth year in Afghanistan and their sixth in Iraq, and Gitmo, the Black Hole of Cuba, had received almost 800 long-term visitors (including at least one American citizen) with something less than Genevan courtesy.41
Notwithstanding subsequent exposés, the odd judicial smackdown, and voluntary discontinuation of failed and palpably illegal policies, the cloak of secrecy and executive privilege means that the full scope of legal errors remains unknown. Indeed, Senators Kyl and Graham argued that courts couldn’t look into the issue. Those senators filed a freelance amicus brief with the Court, arguing that the Detainee Treatment Act of 2005 stripped the Court of its jurisdiction to review Guantanamo cases, an argument unsupported by the record (and fairly reeking of the scandals around Marbury), and duly rejected.* 42 The Act did, however, expand sovereign immunity for officials, including those performing “enhanced interrogations” where such were “authorized” and “determined to be lawful at the time,” with determinations being made by executive branch lawyers.43 In case this wasn’t enough, Bush II issued a signing statement invoking the “unitary executive” and noting that the legislation would be construed to be “consistent with the constitutional limitations on the judicial power,” a mysterious phrase not clarified until Kyl and Graham submitted their brief about jurisdiction-stripping.44 And there, in one vignette, the endgame of signing statements, executive orders, emergency powers, OLC memos, and the rest, lies exposed: the evisceration of judicial review and the immunization of the executive branch. And for what? In Bush II’s case, a rather dubious set of wars waged in the name of democratic norms.
Indeed, the aftermath of Bush II’s “unitary executive” suggests why “nimbleness” may not be a virtue. After 9/11, the president surely needed to protect the nation with force, potentially pushing toward Constitutional frontiers as they were reasonably understood. No president need pore over First Amendment case law before shutting down a website offering a formula for anthrax and a map of the Pentagon’s ducting; as has been said many times, the Constitution is not a suicide pact. But nor is the Constitution, not even Article II, a blank check. Nevertheless, during national security crises real and imagined (e.g., the dubious Gulf of Tonkin episode), presidents have repeatedly pushed past all sensible limits, launching undeclared wars. Few wars require immediate, almost unthinking, action. But as the most visible responders, presidents can whip up irresistible popular support in the moment, saddling Congress and the courts with faits accomplis and, eventually, the public with its regrets. That the greatest consequences have been borne by non-Americans is, in a legal sense, partial mitigation; presidents have few legal duties to noncitizens except those explicitly imposed by law or incorporated by treaty. But mitigation is not exculpation; a Constitutional sin is still a sin.
These considerations may seem somewhat abstract, literally far removed from the United States. Unfortunately, there’s no reason to believe that legal wrongs will always occur at a comfortable distance. The essential premise of counterterrorism is that an attack will occur in America. Presidents may be tempted to deploy the federal military domestically, and indeed, a 2006 revision of the Insurrection Act provided that if “the President determines” that a natural disaster, terrorist attack, or “other condition” (as determined by the president) leads to “violence” such that local authorities “are incapable of maintaining public order,” then he may “employ the armed forces.”45 These provisions, strikingly similar to those in the Weimar Constitution, were walked back by the following Congress. But loopholes still exist; for example, the president may use the armed services to enforce federal law, on his own initiative in a self-determined emergency, if it is “impracticable” to proceed through the “ordinary course of judicial proceedings.”46 Well, what does that mean? A border crisis in California, where the governor is alleged to defy federal dictates and where legal resolution will be furnished by “so-called judges” ostensibly disabled by ethnic affinity? State regulation of recreational marijuana leading to a “crime spike”? Few judges would countenance such intrusions, but then again, the statute presupposes that judges will not be able to opine in the moment.
A particular challenge with emergencies, especially those involving terrorism, is that the executive must rely on secrecy, so illegal practices can be insulated from easy judicial and legislative review, or scrutiny from the public and the media. Secrecy has its purposes, but like attorney-client privilege, there are limits: systematic Constitutional abuses are not secrets; they’re scandals. Congress could investigate, of course, but Congress is fairly supine when it shares a party with the president, and presidents are fairly obstreperous when Congress belongs to the other side. Courts, meanwhile, can take only live cases—which means that they can’t act until someone leaks, or unconstitutional actions become painfully overt.
The complacent response is that America is not some banana republic. The executive doesn’t “disappear” citizens or expropriate property willy-nilly, as Latin American dictatorships did. (Leaving aside denaturalizations, deportations, and civil forfeitures, to say nothing of the CIA’s probable involvement, at White House direction, with the juntas and Operation Condor.) Nor does the executive maintain secret police or condone arbitrary detention and torture. (Leaving aside the NSA’s warrantless wiretapping, the noncitizens dumped into Gitmo and various black sites, stop-and-frisk, and police brutality.) The executive doesn’t detain people simply because it doesn’t like the way they look or think. (At least, not anymore, or anyway, not as a matter of stated policy, except when it does.47) But that still leaves the executive with extraordinary powers. Surely, the executive has the burden of proof as to why it must bend the Constitution; just as surely, the unmagical words “national security” do not discharge that burden. Why hasn’t law established more robust constraints on the executive, clarifying the status of war powers, executive orders, states of emergency, and signing statements? The practical answer is that the costs of executive power so far have been diffused into abstractions like civic erosion, the national debt, and indignities visited on other countries. No violation has been so palpably grave as to prompt a wholesale rethinking of the presidential powers that have accreted over the past century.
With so much authority, so many things to hide, and the great political tradition of scandal, presidents can trigger Constitutional crises even without troops on the streets. The most likely causes are presidential management of unpopular wars and foreign policy—specifically, the efforts to disguise their practical and Constitutional failings. Variations on these themes helped bring down Truman (Korea), Johnson (Vietnam), and Nixon (Vietnam/Laos/Cambodia), and badly damaged Reagan (Iran-Contra) and Bush II (Iraq/Afghanistan). Even Clinton suffered from the perception that his missile strikes in Sudan and Afghanistan were employed as a distraction from the Lewinsky case (strikes that were suspiciously well timed and that, judging by subsequent evidence the Clinton Administration labored to conceal, including by fiddling with UN inspections, lacked a strong factual basis). That most presidents since FDR have tripped over the same issue and been called on it stands simultaneously as a symbol of Constitutional (and journalistic) strength and the imperial presidency’s specific weakness.
Whatever subject drives the crisis, when it comes, the key players will almost certainly be the president and the judiciary. The most direct route to conflict is presidential defiance of a major judicial order. Of course, presidents defy lesser courts on small matters constantly, but since the Civil War, no president has dared defy a firm ruling by the Court on any matter of great and independent significance. Presidents do push the limits, however, and in recent years, the unifying goal has been to protect the ever-growing body of secrets hoarded by the executive.
The cardinal case here is United States v. Nixon, wherein the president asserted sweeping executive privilege to resist a subpoena of the Watergate tapes. Tellingly, Nixon’s personal attorney informed a lower court that the president had wanted him to argue that Nixon was “as powerful a monarch as Louis XIV,” essentially inviting judges to reject the argument, which they duly did.48 When the case reached the Court, it was clear at oral argument that Nixon would lose, but the Justices appreciated the stakes and wanted a unanimous decision. While the opinion itself was a bit muddled by the effort to achieve unanimity, the Court did hand down an 8–0 decision.* 49 That was good enough; Nixon resigned sixteen days later. However traumatic, the Constitution held.
The Lewinsky scandal provided the next major confrontation, where executive branch employees invoked various executive privileges to resist grand jury questions and other legal process. On March 20, 1998, President Clinton himself invoked executive privilege. Courts processed the lower-stakes claims first, and the privilege asserted by Secret Service agents seemed especially weak (the consensus being that they had been acting more as butlers/procurers than agents on official business). On July 17, 1998, Rehnquist allowed lower courts to compel the agents’ testimony. Ken Starr’s grand jury issued a subpoena around July 25, setting up a confrontation, as presidents had long considered themselves immune to subpoena. But the political writing was on the wall, and Starr underlined it by granting immunity to the Lewinsky family. On July 29, the president agreed to be voluntarily deposed, and Starr withdrew his subpoena, averting a Court case.50 On August 17, Starr deposed Clinton; that same day, Clinton went on national TV and admitted his affair. The House subsequently impeached Clinton and the Senate conducted a trial, presided over by the Zelig-like Rehnquist. Clinton wasn’t acquitted until February 12, 1999, but the Constitution had been out of danger long before, when Clinton decided not to challenge Starr’s subpoena. Ultimately, Clinton was cited for contempt of a lower court, fined, and had his law license suspended.
Crucially, in the Nixon and Clinton dramas, both presidents stood down instead of escalating into open defiance. It’s not irrelevant that both presidents were lawyers trained to view the Court with the same level of deference that older parish priests reserve for the Vatican, and may have found defying the Justices emotionally difficult—especially given that both presidents were, legally and factually, guilty of one or more sins that had been publicized to the point of undeniability.51 As a result, the law of executive privilege is not well developed—the Court had not even articulated a real doctrine until Nixon—and only a few cases have been litigated seriously since.52 Outside the White House, the general view is that executive privilege is not absolute, and applies with decreasing strength as matters get further from presidents and their core duties. Within the White House, the position varies by president, from maximalists (e.g., Nixon, Clinton, Bush II) to moderates (e.g., Bush I, Obama). The only plausible minimalist in the past forty years has been Reagan, whose administration declared that executive “privilege should not be invoked to conceal evidence of wrongdoing or criminality on the part of executive officers”; Reagan even turned over parts of his diaries during the Iran-Contra investigation.53
Presidents with fewer scruples or legal finesse may assert privileges that do not really exist. A particularly dangerous temptation may be to blend executive privileges with the pardon power; what information cannot be buried will be pardoned away. This combination could trigger a meltdown, and no sitting president has yet been foolish enough to try it on a matter of any consequence, though matters mixed rather uncomfortably in the ethical murk that was Clinton’s Pardon Office. Judicial Watch, a right-leaning organization, was particularly curious about Clinton’s pardons of donors like Marc Rich (a decision condemned by Jimmy Carter and rued by Clinton himself). It sued for records, but the case didn’t arrive in court until after Clinton left office, and when DOJ lost one of its privilege arguments in 2004, the political moment had passed.54
With a sitting president, however, the stakes would be immeasurably higher. Depending on the gravity of the offense, the president’s popularity, and remaining years in office, courts might reasonably choose to delay proceedings. That might not be particularly inspiring or just, but it could be pragmatic. The alternative, after all, would be to invoke the uncertain power of contempt, risking breakdown.
In 1958, President Eisenhower declared that the nation would have an annual celebration of the “principle of government under laws.”55 It took three years for Congress to actually pass the legislation formally establishing Law Day, but never mind.56 Presidents can proclaim (nonbinding) days of observance as they please, and what pleased Eisenhower was to deposit Law Day on May 1, the same day that the Eastern Bloc celebrated international labor by rolling tanks down the streets. As a former five-star general, D-Day hero, Supreme Allied Commander, and since 1953 the civilian leader of the world’s greatest military, Eisenhower could certainly have arranged his own parade, and rather better than Nikita Khrushchev, the jumpy ex–metal worker with a vocational school education then ruling Moscow. But gaudy displays of missiles and jets would be unbefitting a free people; anyway, if the point was to compare arsenals, better to brandish the greatest weapon in America’s possession: the rule of law. It was a slap that glowed vividly, nowhere more so than on Khrushchev’s cheek. After all, the Soviet tanks in the 1958 May Day parades had crushed dissent in Budapest just eighteen months earlier, when Hungary dared defy its imperial master. And when not on parade duty in the spring of 1958, those same tanks were deployed around Moscow, to cow the Presidium into sanctioning the final consolidation of all state power into Khrushchev’s person. So those were the options on May 1, 1958: dictatorship or democracy; the rule of man or the rule of law. Eisenhower made his choice, and was certain he would be vindicated.
Thirty-one May Days later, it seemed Eisenhower’s vision would be realized. The Warsaw Pact was in disarray and even Moscow would soon enjoy its first direct election. There was a hitch, though; even as the Eastern Bloc inched toward Western ideals, the West had migrated toward an Easternized executive—“oriental despotism,” as a different generation would have tactlessly put it. And so Washington increasingly became what it beheld, a place where executives wield power over all areas of life, beyond normal account. The predictable consequences have been instability and power struggles within the executive branch—with the palace comes the intrigue. In the Trump Administration, plagued by leaks and in open conflict with the FBI and DOJ, the “unitary” executive is already at war with itself.
The greater executive power becomes, the larger the possibility for error. After decades of expansion, the presidency has become a near-impossible job, reposed in one beleaguered and often unstable person. The question is not whether executives will fail—many already have—but how serious and frequent future failures will be, and whether wrongdoers will have the grace to withdraw after making a serious mistake. A president unwilling to do so might unravel the system from above. But this assumes the system is not already unraveling from below, as the public slowly recoils from a legal system that falls short of its promises.