PIOUS PAIR
WHAT MAKES SENS. MCCAIN AND LIEBERMAN SO APPEALING IS ALSO WHAT MAKES THEM SO ANNOYING.
Slate, Jan. 23, 2003
Back when I was a co-host on CNN’s Crossfire, Joe Lieberman and John McCain were known as “7:15 guys”—meaning that the producer could call either of them up at 7:15 P.M., and they’d be on time for a live show at 7:30. (At least, unlike another current senator, they asked what the topic was before dropping important affairs of state and rushing over.) McCain once even came back for a partial rebroadcast at 1 A.M. when, for reasons of verisimilitude, they needed the same senator wearing the same shirt.
To say that two members of the Senate are publicity hounds because they like to be on television is a bit redundant (find a shy senator) and a bit unfair (nobody had a gun to my head either). But even among the self-promoters of Washington, Lieberman and McCain stand out for their enthusiasm and their skill. An important part of that skill, of course, is making enthusiasm look like reluctance. Both are fond of the conceit that they are saddened or alarmed or deeply disturbed by whatever matter impelled them toward the microphones that particular day. The image in your mind, though, if you are an irritated fellow senator or even just a lay cynic, is of Joe or John perusing the newspaper over breakfast coffee as if it were a shopping catalog, looking for something to be saddened by today.
Many a colleague must have read a headline like the one in the Times the other day, “McCain and Lieberman Offer Bill to Require Cuts in [Greenhouse] Gases,” and thought, “Gasbag, heal thyself.”
By any objective standards, Lieberman and McCain are among the very best of our national politicians. They are smarter, more interesting, and probably more honest than most of their colleagues. On the issues they choose to spotlight, they’re usually right, often first (or at least ahead of the horde), and occasionally even courageous. It’s not surprising that Lieberman is now a front-runner for the 2004 Democratic presidential nomination while Al Gore, who put Lieberman on the map, is gone. Nor is it surprising that dreamers of both parties imagine McCain at the head of their tickets, rescuing the country from a second George W. Bush term. Yet there is a mystery to solve about both these virtuous politicians: Why, despite their virtues, are they so annoying?
Obviously it is in part because of their virtues, not despite them. Or rather, it is because of the way they wear their virtues on their sleeves. They are, in a word, pious. If hypocrisy is the tribute vice pays to virtue, piousness is virtue paying tribute to itself.
Lieberman is literally pious—a devout Orthodox Jew—and that is admirable, especially in a politician with the highest ambitions. But he also has the hectoring, bromidic high-rhetorical style reminiscent of an especially pompous clergyman. (“These are not ordinary times for our country. Therefore those of us who seek our highest office or hold it cannot practice ordinary politics.” When exactly were these ordinary times when ordinary politics, whatever that means, would have been OK by Joe Lieberman?) His jokes are labored and dutiful. All this melds unattractively with the hair-trigger indignation of a more recent but increasingly familiar social type: the ambulance-chasing state attorney general, always scanning the horizon in search of a reason for a press conference. Greenhouse gases today, violent video games tomorrow, some other alliterative outrage the next.
McCain, by contrast, is the naughty boy who gets too much pleasure out of his reputation for naughtiness. While Lieberman always plays it straight, McCain’s performances come with a bit of a wink for those who are looking for one. He makes clear that he gets the joke, which is flattering when you first feel the warmth of his conspiratorial embrace but less so as you come to reflect that the joke may be on you.
Both men are hooked on cheap iconoclasm. How many times can a politician be the rare member of his party who takes the position of the other party on some issue or other before this stops being such a wonderful surprise? McCain and Lieberman have stumbled (perhaps) on a brilliant formula. By being dissidents toward the center, rather than toward the extreme, they get to luxuriate in two of the press’s most popular (and, you would have thought, mutually exclusive) categories simultaneously: courageous outsider and moderate voice of reason.
But moderation, far from courageous, can be too easy. Lieberman opposes President Bush’s tax plan but “said he was intrigued” (the Washington Post) by the idea of tax-free dividends, which is the plan’s centerpiece, even though it “doesn’t do anything” to help get the economy “out of the rut.” Under the nutty conventions of the media, this kind of talk gets you points for statesmanship and sophistication, rather than a penalty for having it every way and a general lack of any meaning whatsoever. McCain, during the Clinton years, used similar techniques to develop a reputation for statesmanship and foreign-policy expertise. His views on the use of American power are easier to admire than to parse.
On the other hand, despite their annoying piousness, either McCain or Lieberman would make a better president than the incumbent or the other obvious alternatives. Now, that’s really annoying.
MORALLY UNSERIOUS
Slate, Jan. 29, 2003
The second half of President Bush’s State of the Union speech Tuesday night, about Iraq, was a model of moral seriousness, as it should be from a leader taking his nation into war. Bush was brutally eloquent about the cause and—special points for this—about the inevitable cost. It may seem petty to pick apart the text. But logical consistency and intellectual honesty are also tests of moral seriousness. It is not enough for the words to be eloquent or even deeply sincere. If they are just crafted for the moment and haven’t been thought through, the pretense of moral seriousness becomes an insult.
In his most vivid passage, Bush listed practices of Saddam Hussein such as destroying whole villages with chemical weapons and torturing children in front of their parents. “If this is not evil, then evil has no meaning,” he said, telling “the brave and oppressed people of Iraq” that “the day he and his regime are removed from power will be the day of your liberation.”
This is a fine, noble reason to wage war against Iraq. It would have been a fine reason two decades ago, which is when Saddam destroyed those villages and the United States looked the other way because our bone of contention back then was with Iran. It would be a fine reason to topple other governments around the world that torture their own citizens and do other despicable things. Is the Bush administration prepared to enforce the no-torturing-children rule by force everywhere? And what happens if Saddam decides to meet all our demands regarding weapons and inspections? Is he then free to torture children and pour acid on innocent citizens without fear of the United States?
If Saddam’s human-rights practices morally require the United States to act, why are we waiting for Hans Blix? Or if the danger that Saddam will develop and use weapons of mass destruction against the United States justifies removing him in our own long-term self-defense, what does torturing children have to do with it? Bush was careful not to say explicitly that Iraq’s internal human-rights situation alone justifies going to war—though he was just as careful to imply that it does. But Bush has said clearly and often that Saddam’s external threat does justify a war all by itself. So, human-rights abuses are neither necessary nor sufficient as a reason for war, in Bush’s view, to the extent it can be parsed. Logically, they don’t matter. That makes the talk about the torture of children merely decorative, not serious.
And tell us again why we’re about to invade Iraq but we’re “working with the countries of the region” to pinion North Korea, which is further along the nuclear trail and can’t even be bothered to lie about it. Bush’s “axis of evil” coinage last year and recent flagrant North Korean nose-thumbing made it almost impossible for Bush to avoid addressing this logical conundrum. His solution was artful but mysterious: “Our nation and the world must learn the lessons of the Korean Peninsula, and not allow an even greater threat to rise up in Iraq.” He seems to be saying here that the United States should have invaded and conquered North Korea years ago. But as Bush sets it out, the “lesson” of Korea seems to be that if you don’t go to war soon enough, you might have a problem years later that can be solved through regional discussions. That doesn’t sound so terrible, frankly. Regional discussions can be grim, no doubt, but they’re more fun than a war. So, what exactly is this lesson the Korean experience is supposed to offer?
There are actually plenty of differences between the situation on the Korean Peninsula and the one in the Middle East, and good reasons why you might decide to bring Iraq to a crisis and steer North Korea away from one. But all these reasons cut against the Manichean notion of an absolute war against an absolute evil called terrorism. Bush is getting terrific credit for the purity and determination of his views on this subject. But either his own views are dangerously simplistic or he is purposely, though eloquently, misleading the citizenry.
Proclaiming the case for war as the second half of a speech that devoted its first thirty minutes to tax cuts and tort reform also makes the call to arms seem morally unserious. Why are we talking about cars that run on hydrogen at all if the survival of civilization is at stake over the next few months? Bush declared that the best thing to do with government money is to give it back to the taxpayers, and then put on his “compassionate conservative” hat and proposed billions in government spending on the environment and on AIDS in Africa and on a program to train mentors for children of prisoners and on and on. The dollars don’t exist to either give back or spend, of course, let alone both, so we’ll be borrowing them if Bush has his way, a point he didn’t dwell upon.
This orgiastic display of democracy’s great weakness—a refusal to acknowledge that more of something means less of something else—undermined the moral seriousness of the call to arms and sacrifice that followed. Sneering at the folly of tax cuts spread over several years instead of right away, Bush failed to note that those gradual tax cuts were part of his own previous tax bill. Bragging that he would hold the increase in domestic discretionary spending to 4 percent a year, Bush probably didn’t stop to wonder what that figure was under his tax-and-spend Democrat predecessor. Short answer: lower. These are venial sins in everyday politics, but Bush was striving for something higher. He had the right words for it. But words alone aren’t enough.
DESERT SHIELDS
Slate, Feb. 27, 2003
Saddam Hussein, it seems, is not just a dictator and mass murderer. He is a bounder as well. While we amass hundreds of thousands of troops and billions of dollars of military equipment near his borders, with the frank intention of removing him from power and probably from life, he is welcoming a few dozen scraggly Western war protesters to act as “human shields” by planting themselves next to potential bombing targets such as power plants. It’s just not cricket, complains Secretary of Defense Donald Rumsfeld. Using civilians as human shields “is not a military strategy.” It is “a violation of the laws of armed conflict.”
Rumsfeld’s indignation is fey. Since the premise and justification for our imminent invasion of Iraq is that Saddam is evil and ruthless, which is certainly true, it would be remarkable if he played the game of war according to Hoyle. Why should he? It’s not going to improve his reputation and will do nothing for his life expectancy either. Indeed one of the big surprises of the build-up to Gulf War I was Saddam’s sudden decision to release the Western civilians he had initially forced to live near military targets. That certainly made America’s job easier. And as a practical matter, it may have cost more civilian lives than it saved, by giving us more freedom to bomb.
Like “terrorism” and like “weapons of mass destruction,” the anathema on the use of human shields is an attempt to define certain methods of war as inherently illegitimate, whether the cause for which they are used is legitimate or not. It’s a noble effort, but one that is difficult to sustain and may require more intellectual consistency than the current American administration, at least, is capable of. There have been well-documented reports during the past year, for example, that the Israeli army has used Palestinian civilians as human shields. The U.S. reaction has been muted and generalized mumblings of disapproval and calls for all parties to resolve their differences by negotiation in good faith. No high horse to be seen.
Then, too, it is a bit problematic to be invoking international law and insisting on your right to ignore it at the same time, in the same cause, and with the same righteous indignation. International law says, “Thou shalt not use human shields.” It also says, “Thou shalt not use military force without the approval of the Security Council—even if thou art the United States of America and some idiot long ago gave veto power to the French.” The test of a country’s commitment to international law—and the measure of its credibility when it accuses other countries of flouting international law—is whether that country obeys laws even when it has good reasons to prefer not to.
Just like specific instances such as the rule against using human shields, the general regime of international law depends on a willingness to sacrifice short-term goals that may even be admirable for the long-term goal of establishing some civilized norms of global behavior. It sounds naive, and maybe it is. But you’re either in the game or you’re not. You can’t pick and choose which rules to take seriously.
Supporters of the coming war find it maddening that so many people say, “I’m for it if we have U.N. approval, but not if we act unilaterally.” This is an awfully convenient resting point for bet-hedging politicians. It also seems to be the most popular position in opinion polls. (And it was the conclusion of a thunderously ambivalent full-page editorial in last Sunday’s New York Times.) For heaven’s sake, this is war we’re talking about. And even if we do get international approval, this is overwhelmingly an American show: our initiative, our insistence, our leadership, mostly our money and our blood. Surely, these irritated hawks say, making the right decision is more important than how that decision is made. Putting procedure aside, are you for this war or against it?
But “only if it’s multilateral” is not the copout it may seem. Not just because of concern about an anti-American backlash. And not just because obeying international law has an independent value in its own right. In the specific circumstances of this particular war, multilateral procedures can alleviate our substantive doubts.
Like generals, anti-war protesters are always fighting the last war. Or in this case, depending on how you count, the war-before-last. The methods, the style, the arguments, the very language of objecting to war are still stuck in Vietnam. That’s why the protests of the past couple of weeks have seemed so lame and retro. The Vietnam debate was primarily a moral one. Although the cost of victory became an important factor as the years went on, it was not the main factor turning people against that war. Americans ultimately decided it was a victory we shouldn’t even want. In the case of Iraq, by contrast, few people think the goal of overthrowing Saddam Hussein is immoral. If we knew for sure it would be as easy and cheap as the administration hopes, few folks would object.
It is often thought that moral questions are inherently fuzzy and uncertain while factual questions are concrete and sharp-edged. But there is always at least the possibility that your strongly held moral view is the right one even when most other folks disagree. Factual predictions about the future, by contrast, will ultimately turn out right or wrong, but meanwhile they are fogged by a more fundamental unknowability. The case for democracy among nations, like the case for democracy within nations, depends in part on this particular human failing. Even if Saddam Hussein were well-meaning, he still wouldn’t be all-knowing. The United States actually is well-meaning, but we’re not all-knowing either.
J’ACCUSE, SORT OF
YOU NEVER KNOW WHERE YOU’RE GOING TO FIND ANTI-SEMITIC PROPAGANDA.
Slate, March 12, 2003
Rep. James P. Moran of Virginia, already a locally famous foot-in-mouther, went national last week by declaring at an anti-war rally that “if it was not for the strong support of the Jewish community,” the war against Iraq would not be happening. He said that Jewish “leaders” are “influential enough” to reverse the policy “and I think they should.”
The thunderous rush of politicians of all stripes to denounce Moran’s remarks as complete nonsense might suggest to the suspicious mind that they are not complete nonsense. Moran himself almost immediately denounced his own words as “insensitive.” He said he was using the term “Jewish community” as a shorthand for all “organizations in this country,” which would certainly be a first if it were at all plausible.
As others have noted, Moran’s words are less alarming for their own direct meaning than for their historic association with some of the classic themes of anti-Semitism: the image of Jews as a monolithic group suffering from “dual loyalty” and wielding nefarious influence behind the scenes. When someone touches even lightly on these themes in public, it’s only natural to wonder whether his or her actual views are a lot darker.
Nevertheless, Moran is not the only one publicly exaggerating the power and influence of the Zionist lobby these days. It is my sad duty to report that this form of anti-Semitism seems to have infected one of the most prominent and respected—one might even say influential—organizations in Washington. This organization claims that “America’s pro-Israel lobby”—and we all know what “pro-Israel” is a euphemism for—has tentacles at every level of government and society. On its Web site, this organization paints a lurid picture of Zionists spreading their party line and even indoctrinating children. And yes, this organization claims that the influence of the Zionist lobby is essential to explaining the pro-Israel tilt of U.S. policy in the Middle East. It asserts that the top item on the Zionist “agenda” is curbing the power of Saddam Hussein. The Web site also contains a shocking collection of Moran-type remarks from leading American politicians.
Did you know, for example, that former President Clinton once described the Zionist lobby as “stunningly effective” and “better than anyone else lobbying this town”? Former House Speaker Newt Gingrich has gone even further (as is his wont), labeling the Zionists “the most effective general interest group…across the entire planet.” (Gingrich added ominously that if the Zionist lobby “did not exist, we would have to invent” it.) House Minority Leader Dick Gephardt is quoted saying that if it weren’t for the Zionist lobby “fighting on a daily basis,” the close relationship between America and Israel “would not be.” Sen. John McCain has said that this lobby “has long played an instrumental and absolutely vital role” in protecting the interests of Israel with the U.S. government. There is a string of quotes from leading Israeli politicians making the same point.
According to this Web site, the Zionist lobby is, like most political conspiracies, a set of concentric circles within circles. The two innermost circles are known as the “President’s Cabinet” and the “Chairman’s Council.” Members allegedly “take part in special events with members of Congress in elegant Washington locations,” “participate in private conference calls,” and attend an annual “national summit.” In the past members of these groups have met “in a private setting” with President Clinton, with Vice President Gore, and with the president of Turkey, among others. If this Web site is to be believed, these Zionist-lobby insiders have even enjoyed “a luncheon with renowned author and commentator George Will.”
And who is behind this Web site? Who is spreading the anti-Semitic canard that Jews and Zionists influence American policy in the Middle East, including Iraq? It is a group calling itself the American Israel Public Affairs Committee, or AIPAC, and claiming to be “pro-Israel.” They all claim that, of course. But in this case, AIPAC actually is considered to be the institutional expression of the amorphous Zionist lobby. All the foregoing quotes and assertions about the huge Zionist influence with the U.S. government and the lengths to which Zionists go to protect and expand it actually refer to AIPAC itself.
This doesn’t make it all true, of course. AIPAC, like any organization, has an institutional interest in exaggerating its own importance. This is especially true of any organization that must raise money to support itself. The “President’s Cabinet” and “Chairman’s Council” are both fund-raising gimmicks, intended to give donors the feeling that they are in the thick of government policy-making. It’s more about being able to say, “As I was saying to Colin Powell” than about trying to say anything in particular to Colin Powell. Another element in AIPAC’s braggadocio is rivalry with other Jewish organizations. The American Jewish Committee also has a page of quotes on its Web site about how influential it is. (“We know that yours is the most important and powerful Jewish organization in the United States,” says President Jacques Chirac. Maybe it sounds more like a compliment in French.) This evident rivalry undermines any notion of a unified Jewish conspiracy.
Just as African-Americans can use the “n” word when joshing among themselves and it sounds a lot different than when used by a white person, talk about the political influence of organized Jewry sounds different when it comes from Jewish organizations themselves. Nevertheless, you shouldn’t brag about how influential you are if you want to get hysterically indignant when someone suggests that government policy is affected by your influence.
UNAUTHORIZED ENTRY
THE BUSH DOCTRINE: WAR WITHOUT ANYONE’S PERMISSION.
Slate, March 20, 2003
Until this week, the president’s personal authority to use America’s military might was subject to two opposite historical trends. On the one hand, there is the biggest scandal in constitutional law: the gradual disappearance of the congressional Declaration of War. Has there ever been a war more suited to a formal declaration—started more deliberately, more publicly, with less urgency, and at more leisure—than the U.S. war on Iraq? Right or wrong, Gulf War II resembles the imperial forays of earlier centuries more than the nuclear standoffs and furtive terrorist hunts of the twentieth and twenty-first. Yet Bush, like all recent presidents, claims for his person the sovereign right to launch such a war. Like his predecessors, he condescends only to accept blank-check resolutions from legislators cowed by fear of appearing disloyal to troops already dispatched.
On the other hand, since the end of World War II, the United States has at least formally agreed to international constraints on the right of any nation, including itself, to start a war. These constraints were often evaded, but rarely just ignored. And evasion has its limits, enforced by the sanction of embarrassment. This gave these international rules at least some real bite.
But George W. Bush defied embarrassment and slew it with a series of Orwellian flourishes. If the United Nations wants to be “relevant,” he said, it must do exactly as I say. In other words, in order to be relevant, it must become irrelevant. When that didn’t work, he said: I am ignoring the wishes of the Security Council and violating the U.N. Charter in order to enforce a U.N. Security Council resolution. No, no, don’t thank me! My pleasure!!
By Monday night, though, in his forty-eight-hour-warning speech, the references to international law and the United Nations had become vestigial. Bush’s defense of his decision to make war on Iraq was basic: “The United States of America has the sovereign authority to use force in assuring its own national security.” He did not claim that Iraq is a present threat to America’s own national security but suggested that “in one year or five years” it could be such a threat. In the twentieth century, threats from murderous dictators were foolishly ignored until it was too late. In this century, “terrorists and terrorist states” do not play the game of war by the traditional rules. They “do not reveal these threats with fair notice in formal declarations.” Therefore, “Responding to such enemies only after they have struck first is not self-defense. It is suicide.”
What is wrong with Bush’s case? Sovereign nations do have the right to act in their own self-defense, and they will use that right no matter what the U.N. Charter says or how the Security Council votes. Waiting for an enemy to strike first can indeed be suicidal. So?
So first of all, the right Bush is asserting really has no limits because the special circumstances he claims aren’t really special. Striking first in order to pre-empt an enemy that has troops massing along your border is one thing. Striking first against a nation that has never even explicitly threatened your sovereign territory, except in response to your own threats, because you believe that this nation may have weapons that could threaten you in five years, is something very different.
Bush’s suggestion that the furtive nature of war in this new century somehow changes the equation is also dubious, and it contradicts his assertion that the threat from Iraq is “clear.” Even in traditional warfare, striking first has often been considered an advantage. And even before this century, nations rarely counted on receiving an enemy’s official notice of intention to attack five years in advance. Bush may be right that the threat from Iraq is real, but he is obviously wrong that it is “clear,” or other nations as interested in self-preservation as we are (and almost as interested in the preservation of the United States as we are) would see it as we do, which most do not.
Putting all this together, Bush is asserting the right of the United States to attack any country that may be a threat to it in five years. And the right of the United States to evaluate that risk and respond in its sole discretion. And the right of the president to make that decision on behalf of the United States in his sole discretion. In short, the president can start a war against anyone at any time, and no one has the right to stop him. And presumably other nations and future presidents have that same right. All formal constraints on war-making are officially defunct.
Well, so what? Isn’t this the way the world works anyway? Isn’t it naive and ultimately dangerous to deny that might makes right? Actually, no. Might is important, probably most important, but there are good, practical reasons for even might and right together to defer sometimes to procedure, law, and the judgment of others. Uncertainty is one. If we knew which babies would turn out to be murderous dictators, we could smother them in their cribs. If we knew which babies would turn out to be wise and judicious leaders, we could crown them dictator. In terms of the power he now claims, without significant challenge, George W. Bush is now the closest thing in a long time to dictator of the world. He claims to see the future as clearly as the past. Let’s hope he’s right.
UNSETTLED
Slate, April 10, 2003
NOTE: Hard to believe now, but there really was a moment when the Iraq war appeared to be “won.” This piece mistakenly allowed Bush to define “victory” in Iraq as victory over Saddam. But it got the basic point—that dislodging Saddam was just the beginning of our troubles—right.
So, we’ve won, or just about. There is no quagmire. Saddam is dead, or as good as, along with his sons. It was all fairly painless—at least for most Americans sitting at home watching it on television. Those who opposed the war look like fools. They are thoroughly discredited and, if they happen to be Democratic presidential candidates (and who isn’t these days?), they might as well withdraw and nurse their shame somewhere off the public stage. The debate over Gulf War II is as over as the war itself soon will be, and the anti’s were defeated as thoroughly as Saddam Hussein.
Right? No, not at all.
To start with an obvious point that may get buried in the confetti of the victory parade, the debate was not about whether America could topple the government of Iraq if we chose to make the attempt. No sane person doubted that the mighty United States military machine could conquer and occupy a country with a tiny fraction of our population and an even tinier fraction of our wealth—a country suffering from over a decade of economic strangulation by the rest of the world.
Oh, sure, there was a tepid public discussion of how long victory might take to achieve, in which pro’s and anti’s were represented across the spectrum of opinion. And the first law of journalistic dynamics—The Story Has to Change—inevitably produced a couple of comic days last week when the media and their rent-a-generals were peddling the Q-word. No doubt there are some unreflective peaceniks still mentally trapped in Vietnam, or grasping at any available argument, who are still talking quagmire. But the serious case against this war was never that we might actually lose it militarily.
The serious case involved questions that are still unresolved. Factual questions: Is there a connection between Iraq and the perpetrators of 9/11? Is that connection really bigger than that of all the countries we’re not invading? Does Iraq really have or almost have weapons of mass destruction that threaten the United States? Predictive questions: What will toppling Saddam ultimately cost in dollars and in lives (American, Iraqi, others)? Will the result be a stable Iraq and a blossoming of democracy in the Middle East or something less attractive? How many young Muslims and others will be turned against the United States, and what will they do about it?
Political questions: Should we be doing this despite the opposition of most of our traditional allies? Without the approval of the United Nations? Moral questions: Is it justified to make “pre-emptive” war on nations that may threaten us in the future? When do internal human rights, or the lack of them, justify a war? Is there a policy about pre-emption and human rights that we are prepared to apply consistently? Does consistency matter? Even etiquette questions: Before Bush begins trying to create a civil society in Iraq, wouldn’t it be nice if he apologized to Bill Clinton and Al Gore for all the nasty, dismissive things he said about “nation-building” in the 2000 campaign?
Some of these questions will be answered shortly, and some will be debated forever. This doesn’t mean history will never render a judgment. History’s judgment doesn’t require unanimity or total certainty. But that judgment is not in yet. Supporters of this war who are in the mood for an ideological pogrom should chill out for a while, and opponents need not fold into permanent cringe position.
Of course opponents have been on the defensive since the day the fighting started, forced to repeat the mantra that we “oppose the war but support the troops.” Critics mock this formula as psychologically implausible if not outright dishonest, but it’s not even difficult or complicated. Most of the common reasons for opposing this war get more severe as the war grows longer. Above all is the cost in human lives, especially the lives of American soldiers. (And most American war opponents share with American war supporters—with most human beings, for that matter—an instinctively greater concern for the lives of fellow nationals, however illogical or deplorable that might be.) Unlike Vietnam, where opposition barely existed until the war had been going on for several years, this is a war in which calling for a pullout short of victory would be silly. So, once the war has started, no disingenuousness is required for opponents to hope for victory, the quicker the better.
What is an honest opponent of a war supposed to do? Since even the end of this war won’t settle most of the important arguments about it, dropping all opposition at the beginning of the war would surely be more intellectually suspect than maintaining your doubts while sincerely hoping for victory. Inevitably, more than one supporter of this war has taunted its opponents with Orwell’s famous observation in 1942 that pacifists—the few who opposed a military response to Hitler—were “objectively pro-fascist.” The suggestion is that opposing this war makes you objectively pro-Saddam. In an oddly less famous passage two years later, Orwell recanted that “objectively” formula and called it “dishonest.” Which it is.
The psychological challenge of opposing a war like this after it has started isn’t supporting the American troops, but hoping to be proven wrong. That, though, is the burden of pessimism on all subjects. As a skeptic, at the least, about Gulf War II, I do hope to be proven wrong. But it hasn’t happened yet.
BUSH’S WAR
Time, April 21, 2003
The “great man” theory of history has been out of fashion for decades. Historians trying to explain the course of human events point to geography or climate or technology. They explore the everyday life of ordinary people and the tides of change that sweep through whole populations. When they write about individual historical actors, the emphasis tends to be on psychology. Kings and Queens, Presidents and Prime Ministers may affect events at the margins, but the notion that history happens because someone decided it should happen is regarded as unenlightening if not simply wrong.
About Gulf War II and its consequences (whatever they may be), though, the “great man” theory is correct, and the great man is President George W. Bush. Great in this context does not necessarily mean good or wise. It does usually suggest a certain largeness of character or presence on the stage, which Bush does not possess. Whatever gods gave him this role were casting against type. But the role is his. This was George W. Bush’s war. It was the result of one man’s deliberate, sudden and unforced decision. Yes, Saddam Hussein deserves the ultimate moral blame, but Bush pushed the button.
Bush’s decision to make war on Iraq may have been visionary and courageous or reckless and tragic or anything in between, but one thing it wasn’t was urgently necessary. For Bush, this war was optional. Events did not impose it on him. Few public voices were egging him on. He hadn’t made an issue of the need for “regime change” during the presidential campaign or made it a priority in the early months of his Administration. If he had completely ignored Iraq through the 2004 election, the price would have been a few disappointed Administration hawks and one or two grumpy op-eds. But something or someone put this bee in his bonnet, and from a standing start, history took off. Thousands died, millions were freed from tyranny (we hope), billions were spent, a region was shaken to its core, alliances ruptured, and the entire world watched it all on TV.
Compare America’s other wars of the past sixty years. All of them had, if not inevitability, at least a bit of propulsion from forces larger than one man’s desire. Gulf War I was provoked by an actual event: Iraq’s occupation of Kuwait. George the Elder didn’t have to make war, but he had to do something. Vietnam, famously, was never an explicit decision. Even the parody war in Grenada had a few captive American medical students to force its way onto the agenda. Some people believe that Franklin Roosevelt personally, deliberately and even dishonestly maneuvered a reluctant America into World War II. But World War II was history boiling over and impossible to avoid one way or another.
Why did Bush want this war? His ostensible reasons were unconvincing. Whatever we may find now in the rubble of Baghdad, he never offered any good evidence of a close link between Iraq and al-Qaeda or of weapons of mass destruction that could threaten the U.S. His desire to liberate a nation from tyranny undoubtedly was sincere, but there are other tyrants in the world. Why this one? On the other hand, the ulterior motives attributed to Bush by critics are even more implausible. He didn’t start a war to serve his re-election campaign or avenge his father or enrich his oil buddies or help Israel. The mystery of Bush’s true motives adds to the impression of a wizard arbitrarily waving his wand over history.
War on Iraq was optional for George W. Bush in another sense too. He could have easily chosen not to have it, in which case it wouldn’t have happened, but when he decided to have it, that was it: we had it. The President’s ability to decide when and where to use America’s military power is now absolute. Congress cannot stop him. That’s not what the Constitution says, and it’s not what the War Powers Act says, but that’s how it works in practice. The U.N. cannot stop him. That’s not what the U.N. Charter says, but who cares? And who cares what America’s allies think either?
Even more amazing than the President’s practical power over military resources is his apparent spiritual power over so many minds. Bush is not the only one who decided rather suddenly that disempowering Saddam had to be the world’s top priority. When Bush decided this, so did almost every congressional Republican, conservative TV pundit and British Prime Minister. In polls, a large majority of Americans agreed with Bush that Saddam was a terrible threat and had to go, even though there had been no popular passion for this idea before Bush brought it up. You could call this many things, but one of them is leadership. If real leadership means leading people where they don’t want to go, George W. Bush has shown himself to be a real leader. And he now owns a bit of history to prove it.
BILL BENNETT’S BAD BET
Slate, May 4, 2003
Sinners have long cherished the fantasy that William Bennett, the virtue magnate, might be among our number. The news over the weekend—that Bennett’s fifty-thousand-dollar sermons and best-selling moral instruction manuals have financed a multimillion-dollar gambling habit—has lit a lamp of happiness in even the darkest hearts. As the joyous word spread, crack flowed like water through inner-city streets, family court judges began handing out free divorces, children lit bonfires of The Book of Virtues, More Virtuous Virtues, Who Cheesed My Virtue?, Moral Tails: Virtue for Dogs, etc. And cynics everywhere thought, for just a moment: Maybe there is a God after all.
If there were a Pulitzer Prize for schadenfreude (joy in the suffering of others), Newsweek’s Jonathan Alter and Joshua Green of the Washington Monthly would surely deserve it for bringing us this story. They are shoo-ins for the public service category in any event. Schadenfreude is an unvirtuous emotion of which we should be ashamed. Bill Bennett himself was always full of sorrow when forced to point out the moral failings of other public figures. But the flaws of his critics don’t absolve Bennett of his own.
Let’s also be honest that gambling would not be our first-choice vice if we were designing this fantasy-come-true from scratch. But gambling will do. It will definitely do. Bill Bennett has been exposed as a humbug artist who ought to be pelted off the public stage if he lacks the decency to slink quietly away, as he is constantly calling on others to do. Although it may be impossible for anyone famous to become permanently discredited in American culture (a Bennett-like point I agree with), Bennett clearly deserves that distinction. There are those who will try to deny it to him. They will say:
1. HE NEVER SPECIFICALLY CRITICIZED GAMBLING. This, if true, doesn’t show that Bennett is not a hypocrite. It just shows that he’s not a complete idiot. Working his way down the list of other people’s pleasures, weaknesses, and uses of American freedom, he just happened to skip over his own. How convenient. Is there some reason why his general intolerance of the standard vices does not apply to this one? None that he’s ever mentioned.
Open, say, Bennett’s The Broken Hearth: Reversing the Moral Collapse of the American Family, and read about how Americans overvalue “unrestricted personal liberty.” How we must relearn to “enter judgments on a whole range of behaviors and attitudes.” About how “wealth and luxury…often make it harder to deny the quest for instant gratification” because “the more we attain, the more we want.” How, you might have guessed last week, would Bennett regard a man who routinely “cycle[s] several hundred thousand dollars in an evening” (his own description) sitting in an airless Las Vegas casino pumping coins into a slot machine or video game? Well, you would have guessed wrong! He thinks it’s perfectly OK as long as you don’t spend the family milk money.
2. HIS GAMBLING NEVER HURT ANYONE ELSE. This is, of course, the classic libertarian standard of permissible behavior, and I think it’s a good one. If a hypocrite is a person who says one thing and does another, the problem with Bennett is what he says—not (as far as we know) what he does. Bennett can’t plead liberty now because opposing libertarianism is what his sundry crusades are all about. He wants to put marijuana smokers in jail. He wants to make it harder to get divorced. He wants more “moral criticism of homosexuality” and “declining to accept that what they do is right.”
In all these cases, Bennett wants laws against or heightened social disapproval of activities that have no direct harmful effects on anyone except the participants. He argues that the activities in question are encouraging other, more harmful activities or are eroding general social norms in some vague way. Empower America, one of Bennett’s several shirt-pocket mass movements, officially opposes the spread of legalized gambling, and the Index of Leading Cultural Indicators, one of Bennett’s cleverer PR conceits, includes “problem” gambling as a negative indicator of cultural health. So, Bennett doesn’t believe that gambling is harmless. He just believes that his own gambling is harmless. But by the standards he applies to everything else, it is not harmless.
Bennett has been especially critical of libertarian sentiments coming from intellectuals and the media elite. Smoking a bit of pot may not ruin their middle-class lives, but by smoking pot, they create an atmosphere of toleration that can be disastrous for others who are not so well-grounded. The Bill Bennett who can ooze disdain over this is the same Bill Bennett who apparently thinks he has no connection to all those “problem” gamblers because he makes millions preaching virtue and they don’t.
3. HE’S DOING NO HARM TO HIMSELF. From the information in Alter’s and Green’s articles, Bennett seems to be in deep denial about this. If it’s true that he’s lost eight million dollars in gambling casinos over ten years, that surely is addictive or compulsive behavior no matter how good virtue has been to him financially. He claims to have won more than he has lost, which is virtually (that word again!) impossible playing the machines as Bennett apparently does. If he’s not in denial, then he’s simply lying, which is a definite non-virtue. And he’s spraying smarm like the worst kind of cornered politician—telling the Washington Post, for example, that his gambling habit started with “church bingo.”
Even as an innocent hobby, playing the slots is about as far as you can get from the image Bennett paints of his notion of the Good Life. Surely even a high-roller can’t “cycle through” eight million dollars so quickly that family, church, and community don’t suffer. There are preachers who can preach an ideal they don’t themselves meet and even use their own weaknesses as part of the lesson. Bill Bennett has not been such a preacher. He is smug, disdainful, intolerant. He gambled on bluster, and lost.
THE FABULIST
AN AMERICAN SUCCESS STORY.
Slate, May 15, 2003
President Bush, of course, is not a junior reporter for the New York Times. So maybe it doesn’t matter if he makes up stories and puts them in the newspaper. After Ronald Reagan, it’s almost a presidential tradition.
Bush was in New Mexico on Monday with a new answer to critics who complain that his tax cut proposal favors the rich. In two words: small business. “Most new jobs in America are created by small businesses.” Therefore tax cuts “must focus on the entrepreneur.” And thence to more familiar bromides: It’s not “the government’s money,” it’s “your money”; “our greatest strength” is “our individual citizens”; criticism is “just typical Washington, D.C., political rhetoric, is what it is.”
The myth of small business is one of the more ridiculous bipartisan superstitions that influence government policy. Small businesses, by their nature, come and go. They create more jobs than big businesses and wipe out more jobs, too. Any small-business owner burdened by high taxes is, by definition, more affluent than the typical owner of a big business: an ordinary working American whose ownership amounts to shares in a retirement fund. Small businesses are swell. But special favors for small business make no sense in terms of either fairness or prosperity.
Bush gave his speech Monday at a company in Albuquerque called MCT Industries. “We’re standing in the midst of what we call the American dream,” he said. MCT is privately owned by the family of Ted Martinez, who founded it on a shoestring in 1973 and is now a wealthy VIP who hangs around with politicians. “The Martinez family is living that dream,” Bush said.
Before we even get to the fantasy element, there is a logical problem here, isn’t there? A successful “small” business makes an odd poster child for the proposition that the government is getting in the way of small business success. How did the Martinez family manage to achieve the American dream during a period when high taxes were supposedly thwarting that dream? If MCT Industries is so successful under current arrangements, why does it need a tax cut?
You don’t need overdeveloped smell detectors to suspect that this story may be a bit more complicated. And the most casual stroll through the Internet and media databases enriches the narrative a lot. MCT Industries seems to be a weird collection of unrelated businesses whose only unifying theme is selling to government agencies or needing the approval of politicians. The Martinez family is wealthy because of tax revenues, not despite them.
No surprise, MCT is a member of the Rio Grande Minority Purchasing Council, a trade association for businesses looking to benefit from reverse discrimination. Racial favoritism for “disadvantaged” wealthy business owners is the most ridiculous and unjustifiable form of affirmative action and generally the only kind Republicans are enthusiastic about. Martinez is a GOP activist, but his company does not discriminate. At a 1997 conference of Hispanic CEOs, Clinton Energy Secretary Federico Pena boasted about how “MCT was able to secure a diesel-powered aircraft maintenance contract with the U.S. Air Force” thanks to the “assistance” of a federal agency.
Earlier this year, the Albuquerque City Council declined to authorize about $5 million of industrial revenue bonds for MCT. IRBs are a racket—legal, unfortunately—in which local governments use their right to issue federal-tax-exempt bonds in order to raise money for private companies. The company gets to borrow at a below-market interest rate, subsidized by the loss to the federal Treasury. In Albuquerque, the lucky companies get exempted from local property taxes and some state taxes to boot. MCT did not want the money for job-creating expansion but to refinance IRBs it already enjoys at an even lower interest rate. Those IRBs helped to finance a factory to build maintenance equipment and do R & D, both for the Defense Department.
October 2002. MCT is one of the contributors to a PAC that paid for the mayor’s family to visit China.
July 2002. The Bureau of Indian Affairs approves an MCT municipal garbage landfill on an Indian reservation. Also, the New Mexico Rural Development Response Council and several state agencies help MCT to acquire land for a factory to build platforms for aircraft repairs.
August 1999. Waste News reports that Albuquerque has a bizarre regulation requiring all city garbage trucks to be made out of a particular brand of steel. Only one company sells trucks made out of this material. Guess.
December 1998. The Energy Department (secretary: Bill Richardson, now governor of New Mexico) hires MCT to build magnets to be used in making tritium for nuclear warheads.
June 1997. MCT, as a local company, competes against a national waste-management firm for a local garbage-collection contract. It wins the contract and sells the business to the national firm the next day.
October 1996. Republican vice presidential candidate Jack Kemp holds a rally at MCT. Ted Martinez hands him a document asserting that almost a third of MCT’s payroll goes to paying federal, state, and local taxes. In his speech, Kemp makes it “half.”
October 1995. Giant defense contractor TRW announces that it has won a $185 million contract from the Air Force, which it will share with two “small disadvantaged businesses” including MCT.
December 1994. In congressional testimony about export assistance for small businesses, a Commerce Department official talks about how the federal government sponsored an exhibit by MCT at the Paris Air Show and subsequent Commerce Department shows in China and Dubai.
So you get rich with a dozen different types of tax-funded help, you become a Republican, and you live happily ever after complaining about how much you pay in taxes. Maybe President Bush was right after all, that is the American dream.
SYMPATHY FOR THE NEW YORK TIMES
Slate, May 21, 2003
Although rarely reluctant to join in a schadenfreude festival, I nevertheless feel sorry for the New York Times. Duped by one of its own reporters, hemorrhaging rumors and leaks like the institutions it is used to covering, its extravagant public self-flagellation merely inviting flagellation by everyone else, the paper is at a low ebb. Much of the criticism and self-criticism is deserved. But after two weeks of Times-bashing, it’s time for a bit of therapeutic outreach.
One reason the Times has my sympathy over being duped by a writer is that I’ve been there. And let me tell you: The clarity of hindsight is remarkable. A couple of years ago, Slate published a vivid, rollicking yarn about an alleged sport called “monkeyfishing.” The author claimed to have used a rod and reel, with rotten fruit as bait, to catch monkeys living on an island in the Florida Keys.
As editor of Slate at the time, I read the piece before it was published and didn’t like it—for a variety of wrong reasons. So I cannot even claim to have been blinded by enthusiasm. Others at Slate did like it and so we published it. When outsiders challenged it, I read it again.
It was like reading an entirely different article. Red flags waved from every line. At first the author stood by his story and we stood by him. But within days, poking around by ourselves and others made this position untenable, and so we both caved. The question remains, though, why my baloney-detectors didn’t function beforehand, when they could have saved us considerable embarrassment. All I can say is: Congress is about to exempt dividends from the income tax—i.e., stranger things than monkeyfishing actually do happen.
Whatever the reason, reading an article with doubts raised is a different experience from reading it in its virginal pre-publication freshness. As Slate’s Jack Shafer has pointed out, most readers of Jayson Blair’s Times articles did not spot the hints of fabrication or plagiarism either. This includes many of the critics who now say that the Times missed important clues because of institutional arrogance or political bias or an affirmative action mentality.
Of course readers are entitled to assume that published articles have been pre-skepticized. And Jayson Blair duped the Times again and again. But holding foresight to the standards of hindsight is a bit unfair.
My second reason for feeling sympathy for the New York Times is that it now wears the Scarlet P, for plagiarist, when in a way we are all plagiarizers of the New York Times. Plagiarism technically applies only to an article’s words, not to the ideas and information contained in them. But the value of a newspaper article lies more in the ideas and information than in the precise words. And much or even most American news reporting and commentary on national issues derives—uncredited—from the New York Times.
Even if you don’t read the Times yourself, you get your news from journalists at other media who do. The Times sets the news agenda that everyone else follows. The Washington Post and maybe one or two other papers also play this role, but even as a writer who appears in the Washington Post—a damned fine newspaper run by superb editors who are graced with every kind of brilliance, charm, and physical beauty—I would have to concede that the Times is more influential.
It’s not just the agenda setting. Our basic awareness of what is going on in the world derives in large part from the Times. How do you even know that Baghdad exists? Have you been there? Touched it? How do I, sitting in Seattle, know the current status of the Bush administration’s Mideast road map, about which I may choose to opine with seeming authority? Column-writing is an especially derivative form of journalism. But even the hardest of hard-news reporters starts with basic knowledge that probably comes more from the Times than from her own two eyes.
It’s true that the journalistic food chain runs both ways: Big media like the Times often pick up stories and information from smaller fish, often with insufficient credit or none at all. But it is the imprimatur of the Times or the Post that stamps the story as important before sending it back down to other papers—as well as up to the media gods of television.
This near-universal dependence on the Times helps to explain the schadenfreude (dependence causes resentment) as well as the more serious alarm about the Times’ reliability. It also puts Jayson Blair’s rip-offs of others, if not his fabrications, in perspective. No one gets ripped off more than the New York Times.
The social critic Dwight Macdonald, reminiscing about the left-wing Partisan Review crowd of the 1930s, wrote: “The N.Y. Times was to us what Aristotle was to the medieval scholastics—a revered authority, even though pagan, and a mine of useful information about the actual world.” Today’s equivalent of that sect-ridden, conspiracy-minded, alienation-proud political world is on the right. I was listening to a right-wing broadcast crank the other day as he carried on about how the Times can’t be trusted about this and that. I don’t know where he got his information, but I have a guess.
SUPREME COURT FUDGE
Slate, June 24, 2003
Admission to a prestige institution like the University of Michigan or its law school is what computer types call a “binary” decision. It’s yes or no. You’re in, or you’re out. There is no partial or halfway admission. The effect of any factor in that decision is also binary. It either changes the result or it doesn’t. It makes all the difference, or it makes none at all. Those are the only possibilities.
For any individual, the process of turning factors into that yes-or-no decision doesn’t matter. Any factor that changes the result has the same impact as if it were an absolute quota of one. It gets you in, or it keeps you out. And this is either right or it is wrong. The process of turning factors into a result doesn’t matter here, either. In this sense, the moral question is binary, too.
For twenty-five years, since Justice Powell’s opinion in the Bakke case, moderates on the Supreme Court and well-meaning people throughout the land have been pretending that it is possible to split a difference that cannot be split. This week’s court ruling, in which Justice O’Connor contrasts the college and law-school admissions systems at Michigan and essentially reaffirms Bakke, shows how laughable that pretense has become.
Michigan’s college admissions policy at the time this suit began was strictly numerical: You needed 100 points to get in, and you got 20 points for being an officially recognized minority. Flatly unconstitutional, the court declared. Michigan’s law school, by contrast, “engages in a highly individualized, holistic review of each applicant’s file.” It “awards no mechanical, predetermined diversity ‘bonuses’ based on race or ethnicity.” Instead, it makes “a flexible assessment of applicants’ talents, experiences, and potential…” blah blah blah. This is how it should be done, the court said.
Yes, but does the law school give an advantage in admissions to blacks and other minorities? Well, says the court, quoting the law school’s brief, it “aspires to ‘achieve that diversity which has the potential to enrich everyone’s education.’” The law school “does not restrict the types of diverse contributions eligible” for special treatment. In fact, it “recognizes ‘many possible bases for diversity admissions.’”
Yes, yes, yes, but does the law-school admissions policy favor minorities? Well, since you insist, yes: “The policy does…reaffirm the Law School’s longstanding commitment to ‘one particular type of diversity,’” i.e., “racial and ethnic diversity.” But O’Connor’s opinion immediately sinks back into a vat of fudge, trying not to acknowledge that “racial and ethnic diversity” means that some people will be admitted because of their race and others will be rejected for the same reason—exactly as in the undergraduate admissions system the court finds unconstitutional. By ignoring the similarities, the court avoids having to explain coherently why it sees such profound differences.
The court actually seems to be in denial on this point. Although it forbids explicit racial quotas or mathematical formulas to achieve racial balance, it is happy enough to measure the success of its preferred fuzzier approaches in statistical terms. If a selection system is going to be judged by its success in approximating the results of a mathematical formula, how is it any different from using that formula explicitly? Elsewhere, arguing for the social value of affirmative action, O’Connor’s opinion cites dramatic statistics about how few minority students there would be if it were ended. But don’t those statistics imply that affirmative action is having an equal-and-opposite effect now? And isn’t that good to exactly the extent that ending affirmative action would be bad? And if that extent can be measured and judged using statistics, why is it wrong to achieve the statistical goal through statistical means?
The majority opinion says that its preferred flexible-flier style of affirmative action does “not unduly harm members of any racial group.” Well, this depends on what you mean by “unduly,” doesn’t it? As noted, we’re dealing with an all-or-nothing-at-all decision here. Every time affirmative action changes the result, a minority beneficiary benefits by 100 percent and a white person is burdened 100 percent, in the only currency at issue, which is admission to the University of Michigan. This burden may be reasonable or unreasonable, but it is precisely the same size as the burden imposed by the mathematical-formula-style affirmative action that the court finds objectionable.
The Supreme Court took these Michigan cases to end a quarter century of uncertainty about affirmative action. What it has produced is utter logical confusion. The law-school dean testified that “the extent to which race is considered in admissions…varies from one applicant to another.” It “may play no role” or it “may be a determinative factor.” O’Connor cites this approvingly, but it is nonsense on several levels. First, “no role” and “determinative factor” are in fact the only possible options: There cannot be an infinite variety of effects on a yes-or-no question. Second, when race is determinative for one applicant, it is determinative for one other applicant, who may or may not be identifiable. Third, the same two possibilities—no factor and determinative factor—apply to any admissions system that takes race into account in any way, including by mathematical formula and even including an outright quota system. So, it says nothing special about the law school’s admissions policy compared with any other.
Finally, the court is confused if it thinks that a subjective judgment full of unquantifiable factors is obviously fairer than a straightforward formula. But confusion seems to be a purposeful strategy. The court’s message to universities and other selective, government-financed institutions is: We have fudged this dangerous issue. You should do the same.
ABOLISH MARRIAGE
Slate, July 2, 2003
Critics and enthusiasts of Lawrence v. Texas, last week’s Supreme Court decision invalidating state anti-sodomy laws, agree on one thing: The next argument is going to be about gay marriage. As Justice Scalia noted in his tart dissent, it follows from the logic of Lawrence. Mutually consenting sex with the person of your choice in the privacy of your own home is now a basic right of American citizenship under the Constitution. This does not mean that the government must supply it or guarantee it. But the government cannot forbid it, and the government also should not discriminate against you for choosing to exercise a basic right of citizenship. Offering an institution as important as marriage to male-female couples only is exactly this kind of discrimination. Or so the gay rights movement will now argue. Persuasively, I think.
Opponents of gay rights will resist mightily, although they have been in retreat for a couple of decades. General anti-gay sentiments are now considered a serious breach of civic etiquette, even in anti-gay circles. The current line of defense, which probably won’t hold either, is between social toleration of homosexuals and social approval of homosexuality. Or between accepting the reality that people are gay, even accepting that gays are people, and endorsing something called “the gay agenda.” Gay marriage, the opponents will argue, would cross this line. It would make homosexuality respectable and, worse, normal. Gays are welcome to exist all they want, and to do their inexplicable thing if they must, but they shouldn’t expect a government stamp of approval.
It’s going to get ugly. And then it’s going to get boring. So, we have two options here. We can add gay marriage to the short list of controversies—abortion, affirmative action, the death penalty—that are so frozen and ritualistic that debates about them are more like Kabuki performances than intellectual exercises. Or we can think outside the box. There is a solution that ought to satisfy both camps and may not be a bad idea even apart from the gay-marriage controversy.
That solution is to end the institution of marriage. Or rather (he hastens to clarify, Dear) the solution is to end the institution of government-sanctioned marriage. Or, framed to appeal to conservatives: End the government monopoly on marriage. Wait, I’ve got it: Privatize marriage. These slogans all mean the same thing. Let churches and other religious institutions continue to offer marriage ceremonies. Let department stores and casinos get into the act if they want. Let each organization decide for itself what kinds of couples it wants to offer marriage to. Let couples celebrate their union in any way they choose and consider themselves married whenever they want. Let others be free to consider them not married, under rules these others may prefer. And, yes, if three people want to get married, or one person wants to marry herself, and someone else wants to conduct a ceremony and declare them married, let ’em. If you and your government aren’t implicated, what do you care?
In fact, there is nothing to stop any of this from happening now. And a lot of it does happen. But only certain marriages get certified by the government. So, in the United States we are about to find ourselves in a strange situation where the principal demand of a liberation movement is to be included in the red tape of a government bureaucracy. Having just gotten state governments out of their bedrooms, gays now want these governments back in. Meanwhile, social-conservative anti-gays, many of them Southerners, are calling on the government in Washington to trample states’ rights and nationalize the rules of marriage, if necessary, to prevent gays from getting what they want. The Senate Majority Leader, Bill Frist of Tennessee, responded to the Supreme Court’s Lawrence decision by endorsing a constitutional amendment, no less, against gay marriage.
If marriage were an entirely private affair, all the disputes over gay marriage would become irrelevant. Gay marriage would not have the official sanction of government, but neither would straight marriage. There would be official equality between the two, which is the essence of what gays want and are entitled to. And if the other side is sincere in saying that its concern is not what people do in private, but government endorsement of a gay “lifestyle” or “agenda,” that problem goes away, too.
Yes, yes, marriage is about more than sleeping arrangements. There are children, there are finances, there are spousal job benefits like health insurance and pensions. In all these areas the law uses marriage as a substitute for other factors that are harder to measure, such as financial dependence or devotion to offspring. It would be possible to write rules that measure the real factors at stake and leave marriage out of the matter. Regarding children and finances, people can set their own rules, as many already do. None of this would be easy. Marriage functions as what lawyers call a “bright line,” which saves the trouble of trying to measure a lot of amorphous factors. You’re either married or you’re not. Once marriage itself becomes amorphous, who-gets-the-kids and who-gets-health-care become trickier questions.
So, sure, there are some legitimate objections to the idea of privatizing marriage. But they don’t add up to a fatal objection. Especially when you consider that the alternative is arguing about gay marriage until death do us part.
WHO IS BURIED IN BUSH’S SPEECH?
Slate, July 14, 2003
Once again a mysterious criminal stalks the nation’s capital. First there was the mystery sniper. Then there was the mystery arsonist. Now there is the mystery ventriloquist. The media are in a frenzy of speculation and leakage. Senators are calling for hearings. All of Washington demands an answer: Who was the arch-fiend who told a lie in President Bush’s State of the Union speech? No investigation has plumbed such depths of the unknown since O.J. Simpson’s hunt for the real killer of his ex-wife. (Whatever happened to that, by the way?)
Whodunit? Was it Col. Mustard in the kitchen with a candlestick? Condoleezza Rice in the Situation Room with a bottle of Wite-Out and a felt-tipped pen?
Linguists note that the question, “Who lied in George Bush’s State of the Union speech?” bears a certain resemblance to the famous conundrum, “Who is buried in Grant’s Tomb?” They speculate that the two questions may have parallel answers. But philosophers are still struggling to properly analyze the Grant’s Tomb issue—let alone answer it. And experts say that even when this famous nineteenth-century presidential puzzle is solved, it could be many years before the findings can be applied with any confidence to presidents of more recent vintage.
Lacking a real-life analogy that sufficiently captures the complexity of the speech-gate puzzle and the challenge facing investigators dedicated to solving it, political scientists say the best comparison may be to the assassination of Maj. Strasser in the film Casablanca. If you recall, Humphrey Bogart is standing over the body, holding a smoking gun. Claude Rains says: “Maj. Strasser has been shot! Round up the usual suspects.” And yet the mystery of who killed the major is never solved.
Ever since Watergate, a “smoking gun” has been the standard for judging any Washington scandal. Many a miscreant has escaped with his reputation undamaged—or even enhanced by the publicity and pseudovindication—because there was no “smoking gun” like the Watergate tapes. But now it seems that the standard has been lifted. You would think that on the question of who told a lie in a speech, evidence seen on television by millions of people around the world might count for something. Apparently not. The Bush administration borrows from Chico Marx: “Who are you going to believe—us or your own two eyes?”
The case for the defense is a classic illustration of what lawyers call “arguing in the alternative.” The Bushies say: a) It wasn’t really a lie; b) someone else told the lie; and c) the lie doesn’t matter. All these defenses are invalid.
1. Bushies fanned out to the weekend talk shows to note, as if with one voice, that what Bush said was technically accurate. But it was not accurate, even technically. The words in question were: “The British government has learned that Saddam Hussein recently sought significant quantities of uranium from Africa.” Bush didn’t say it was true, you see—he just said the Brits said it. This is a contemptible argument in any event. But to descend to the administration’s level of nitpickery, the argument simply doesn’t work. Bush didn’t say that the Brits “said” this Africa business—he said they “learned” it. The difference between “said” and “learned” is that “learned” clearly means there is some pre-existing basis for believing whatever it is, apart from the fact that someone said it. Is it theoretically possible to “learn” something that is not true? I’m not sure (as Donald Rumsfeld would say). However, it certainly is not possible to say that someone has “learned” a piece of information without clearly intending to imply that you, the speaker, wish the listener to accept it as true. Bush expressed no skepticism or doubt, even though the Brits’ qualification was only added as protection because doubts had been expressed internally.
2. The Bush argument blaming the CIA for failing to remove this falsehood from the president’s speech is based on the logic of “stop me before I lie again.” Bush spoke the words, his staff wrote them, those involved carefully overlooked reasons for skepticism. It would have been nice if the CIA had caught this falsehood, but its failure to do so hardly exonerates others. Furthermore, the CIA is part of the executive branch, as is the White House staff. If the president—especially this president—can disown anything he says that he didn’t actually find out or think up and write down all by himself, he is more or less beyond criticism. Which seems to be the idea here. The president says he has not lost his confidence in CIA Director George Tenet. How sweet. If someone backed me up in a lie and then took the fall for me when it was exposed, I’d have confidence in him too.
3. The final argument: It was only 16 words! What’s the big deal? The bulk of the case for war remains intact. Logically, of course, this argument will work for any single thread of the pro-war argument. Perhaps the president will tell us which particular points among those he and his administration have made are the ones we are supposed to take seriously. Or how many gimmes he feels entitled to take in the course of this game. Is it a matter of word count? When he hits 100 words, say, are we entitled to assume that he cares whether the words are true?
AT LEAST SAY YOU’RE SORRY
Slate, Sept. 11, 2003
President Bush will get his $87 billion for a year’s worth of victory in Iraq and Afghanistan, but he will have to endure a lot of nyah-nyah-nyah and I-told-you-so along the way. He could have avoided all this irritation—and he is just the kind of man to find it incredibly irritating—with two little words in his TV address last Sunday evening: “I’m sorry.” If he had acknowledged with a bit of grace what everyone assumes to be true—that the administration was blindsided by the postwar challenge in both these countries—this would have cut off a politically damaging debate that will now go on through the election campaign. And he would have won all sorts of brownie points for high-mindedness. Instead, he and his spokesfolk will be defending a fairly obvious untruth day after day.
Why do politicians so rarely apologize? Why in particular won’t they admit to being surprised by some development? Lack of scruples can’t explain it: Denying the obvious isn’t even good unscrupulous politics. For that reason, it is beyond spin. If spinning involves an indifference to truth, what’s going on here looks more like an actual preference for falsehood. The truth would be better politics, and the administration is spreading lies anyway.
This is not meant to be a partisan observation. Bush’s predecessor was, if anything, a more flamboyant liar. What’s going on here is something like lying-by-reflex. If the opposition accuses you of saying the world is round, you lunge for the microphone to declare your passionate belief that it is flat. Or maybe it has something to do with the bureaucracies that political campaigns have become. The truth, whatever its advantages, is messy and out-of-control. A lie can be designed by committee, vetted by consultants, tested with focus groups, shaped to perfection. Anyone can tell the truth. Crafting a good lie is a job for professionals.
This $87 billion request is a minefield of embarrassments, through which a simple “We got it wrong” would have been the safest route. After all, Bush either knew we’d be spending this kind of money for two or more years after declaring victory—and didn’t tell us—or he didn’t realize it himself. Those are the only two options. He deceived us, or he wasn’t clairvoyant in the fog of war. Apparently, Bush would rather be thought omniscient than honest, which is a pity, since appearing honest is a more realistic ambition. Especially for him.
What’s more, this would have been a truth without a tail. Telling one hard truth can lead you down, down, down into a vicious circle of more truth, revelation, embarrassment, and chagrin. That’s one reason for the truth’s dangerous reputation. But the Bush administration’s failure to realize how much its postwar festivities would cost is a truth that doesn’t lead anywhere in particular. Clearly knowing about the $87 billion bill for Year 2 would not have stopped Bush from conducting the war to begin with. Nor would this knowledge have stopped opponents from opposing it. Among supporters, there may be a few people who bought Bush’s initial war-on-terrorism rationale, didn’t mind the bait-and-switch to his revised freedom-and-democracy rationale, reveled in the military victory, and yet would have opposed it all if they’d known about the $87 billion. But it is an odd camel whose back is broken by this particular straw.
Bush needs some truth-telling points, because another aspect of this $87 billion request is driving him to dishonesty that he can’t abandon so blithely. That issue is: If he gets the $87 billion, where will it have come from? Bush is sending Colin Powell around the world with a begging cup. But whatever can’t be raised from foreigners apparently can be conjured out of thin air.
Raising taxes to pay the $87 billion would be a bad mistake, Bush says: Economic growth—fed by tax cuts—will cover the $87 billion and then some. But however miraculous Bush’s tax cuts turn out to be, economic growth will not be $87 billion more miraculous just because that much more is suddenly needed in Iraq and Afghanistan. Nor does Bush plan, or even concede the necessity, to harvest this $87 billion at some point by raising taxes (or not cutting them) by that amount. And although he talks vaguely about spending restraint, he and the Congress controlled by his party have shown very little of it. He certainly has not pinpointed $87 billion in other spending that the new $87 billion can replace.
So, spending $87 billion costs nothing, apparently. This makes it even sillier to deny being blindsided. What difference does it make?
While apologizing to the citizenry, Bush could win even more brownie points, at almost no cost, by apologizing specifically to his predecessor. Bush ridiculed Bill Clinton’s efforts to follow up military interventions with “nation-building.” Believe it or not, this was a pejorative term, implying unrealistic ambitions. Now Bush talks about turning Iraq into a Jeffersonian democracy.
And if Bush wants credit for a Gold-Star Triple-Whammy Zirconium-Studded apology, he should apologize to his father, who stopped Gulf War I at the Iraqi border. Armchair Freudians believe that in going to Baghdad and toppling Saddam, George II was playing Oedipal tennis with George I. If so, junior has lost. The elder Bush’s most notorious decision as president looks better every day. And not just because of the $87 billion.
JUST SUPPOSIN’
IN DEFENSE OF HYPOTHETICAL QUESTIONS.
Slate, Oct. 2, 2003
One of the absurd conventions of American politics is the notion that there is something suspect or illegitimate about a hypothetical question. By labeling a question as “hypothetical,” politicians and government officials feel they are entitled to duck it without looking like they have something to hide. They even seem to want credit for maintaining high standards by keeping this virus from corrupting the political discussion.
“If I’ve learned one thing in my nine days in politics, it’s you better be careful with hypothetical questions,” declared Gen. Wesley Clark in a recent Democratic presidential candidates’ debate. He might have learned it on television, where “Never answer a hypothetical question” is one of the rules a real-life political strategist offered to real-life presidential candidate Howard Dean in HBO’s fictional Washington drama K Street.
The question Clark was trying not to answer was “your vote, up or down, yes or no” on President Bush’s request for $87 billion to finance the wars in Iraq and Afghanistan for another year. This question is only hypothetical in the sense that Clark doesn’t literally get to vote on the matter. That kind of literalness could make almost any question hypothetical. The obvious purpose of the question was to elicit Clark’s opinion on the $87 billion. And surely it is not unreasonable or “hypothetical” to expect candidates for president to express an opinion on whatever controversy surrounds the presidency at the moment.
Secretary of State Colin Powell was asked this week whether Americans would have supported the Iraq war if they’d known we weren’t going to find those weapons of mass destruction the administration used to justify it. This really is a hypothetical question, as Powell labeled it in declining to answer, but it’s a darned interesting one and one an honest leader in a democracy ought to be pondering about now, even if he doesn’t care to share his thoughts.
Neither of these examples is the kind of hypothetical question that calls on the answerer to imagine a situation that is unlikely to occur, and one there would have been no good reason to think about. What if a man from Mars were running in the California recall election? What if President Bush were secretly writing a treatise on moral philosophy? And so on.
Avoiding questions (from reporters, from opponents, from citizens) is the basic activity of the American politician. Or rather, avoiding the supply of answers. Skill and ingenuity in question-avoidance techniques are a big factor in political success. Usually, avoiding the question involves pretending to answer it, or at least supplying some words to fill the dead space after the question has been asked. But if you can squeeze a question into one of a few choice categories, the unwritten rules allow the politician to not answer at all. There’s national security. (“I’m sorry, but revealing the size of my gun collection might imperil our war on terrorism.”) There’s privacy. (“I must protect my family from the pain of learning about my other family.”) There are legal proceedings. (“That arson allegation has been referred to the Justice Department, and I cannot comment further.”) But only an allegedly hypothetical question may be dismissed because of its very nature, irrespective of subject matter.
This is silly. Hypothetical questions are at the heart of every election in a democracy. These are questions the voters must answer. Voters are expected to imagine each of the candidates holding the office he or she is seeking and to decide which one’s performance would be most to their liking. Every promise made by a candidate imposes two hypothetical questions on the voter: If elected, will this person do as promised? And if this promise is kept, will I like the result? The voter cannot say, “I don’t answer hypothetical questions.” And voters cannot sensibly answer the hypothetical questions they’ve been assigned without learning the answers to some hypothetical questions from the candidates.
Hypothetical questions are essential to thinking through almost any social or political issue. In law school they’re called “hypos,” and the process is called “salami slicing.” Imagine this situation, and tell me the result. Now change the situation slightly—does the result change? Now change it in a different way—same result, or different one? It’s just like an eye exam, where you peer through a series of alternative lenses until you zero in on the correct prescription.
Yet even lawyers turn against the cherished hypo when nominated for prestigious judgeships. Then they say self-righteously that they cannot answer hypothetical questions about how they might rule. Once they are safely on the bench, of course, they issue public opinions every day that are, among other things, statements about how they analyze the issue at hand and strong indications, if not more, of how they will rule in the future.
A refusal or inability to answer hypothetical questions is nothing to be proud of. In fact, it ought to be a disqualification for public office. Anyone who doesn’t ponder hypothetical questions all the time is unfit for the task of governing. Indeed, it’s hard to see how any halfway intelligent person can manage to avoid taking up hypothetical questions a dozen times a day.
But we can all name a few politicians we suspect are up to this challenge.
FILTER TIPS
Slate, Oct. 16, 2003
To President Bush, the news is like a cigarette. You can get it filtered or unfiltered. And which way does he prefer it? Well, that depends on the circumstances. When he is trying to send a message to the public, Bush prefers to have it go out unfiltered. He feels, for example, that the “good news about Iraq” is getting filtered out by the national media. “Somehow you just got to go over the heads of the filter and speak directly to the American people,” he said the other day. So, lately he has been talking to local and regional media, whom he trusts to filter less.
But when he is on the receiving end, Bush prefers his news heavily filtered. “I glance at the headlines, just to get kind of a flavor,” he told Brit Hume of Fox News last month. But, “I rarely read the stories” because “a lot of times there’s opinions mixed in with news.” Instead, “I get briefed by [White House Chief of Staff] Andy Card and Condi [Rice, the national security adviser] in the morning.”
The president concluded, “The best way to get the news is from objective sources. And the most objective sources I have are people on my staff who tell me what’s happening in the world.”
Bush’s beef about news from Iraq is a variation on the famous complaint that the media never report about all the planes that land safely. And it’s true: Many American soldiers have not been killed since the war officially ended. You rarely read stories about all the electricity that works, or the many Iraqis who aren’t shouting anti-American slogans. For that matter, what about all the countries we haven’t invaded and occupied in the past year? And what about the unreported fact that Saddam Hussein has been removed from power? Well, maybe that isn’t actually unreported. But an unfilterish media would surely report it again and again in every story every day, in case people forgot.
Every president complains that the media are blocking his message, and the media complain that every administration wants to manage the news. It’s not only presidents. Everyone who has something to say in our media-saturated culture (and who doesn’t?) longs for ways to get that message out unmediated by someone else. In this media cacophony, the president probably has more ability to deliver his message without a filter than anyone else on earth. Anything the president says is automatically news. If he wants to commandeer all the TV networks for a speech in prime time, he can usually do it. The president can even hold a press conference, although this president rarely bothers.
Bush also will have a campaign war chest of $170 million that he can spend in the next year delivering any message he wants, completely unfiltered. Who can top that? Well, until recently there was Saddam Hussein. He could talk as long as he wanted and Iraqi television never cut away for a commercial, let alone brought on annoying pundits to pick and pick and pick. And the next day’s Baghdad Gazette would publish every single word, also without any tedious analysis. A few others, such as Fidel Castro, still have this privilege. I was under the impression that George Bush found this distasteful—the sort of thing one might even tighten a boycott or start a war over.
George W. Bush doesn’t really want people to get the news unfiltered. He wants people to get the news filtered by George W. Bush. Or rather, he wants everyone to get the news filtered by the same people who apparently filter it for him. It’s an interesting epistemological question how our president knows what he thinks he knows and why he thinks it is less distorted than what the rest of us know or think we know. Every president lives in a cocoon of advisers who filter reality for him, but it’s stunning that this president actually seems to prefer getting his take on reality that way.
Bush apparently thinks (if that is the word) that the publicly available media contaminate the news with opinion but Condi Rice and Andy Card are objective reporters. Anyone who has either been a boss or had a boss will find it easier, knowing that Bush believes this, to understand how he can also believe that things are going swimmingly in Iraq. And where does the Rice-Card News Service obtain its uncontaminated information? Bush conceded his shocking suspicion that Rice and Card “probably read the news themselves.” They do? Whatever next? The president apparently is willing to tolerate the reading of newspapers by his staff members in the privacy of their own homes, as long as they don’t flaunt this unseemly habit by bringing the wretched things into the White House or referring to them at staff meetings.
The president noted, though, that Rice and Card also get “news directly from participants on the world stage.” (“Hi, Achmed—it’s Condi. What’s going on there in Baghdad? What’s the weather like? And how’s traffic? Thanks, I’ll go tell the president and call you again in fifteen minutes.”) The notion that these world-stagers are sources of objective opinion while newspaper reporters are burdened by insuppressible opinions and hidden agendas is another odd one.
When it comes to unfiltered news, the president says he can dish it out and actually brags that he can’t take it. In fact, he can’t do either one.
TAKING BUSH PERSONALLY
Slate, Oct. 23, 2003
Conservatives wonder why so many liberals don’t just disagree with President Bush’s policies but seem to dislike him personally. The story of stem-cell research may help to explain. Two years ago, Bush announced an unexpectedly restrictive policy on the use of stem cells from human embryos in federally funded medical research. Because federal funding plays such a large role, the government more or less sets the rules for major medical research in this country.
Bush’s policy was that research could continue on stem-cell “lines” that existed at the moment of his speech, in August 2001, but that otherwise, embryo research was banned. Even surplus embryos already in the freezer at fertility clinics—where embryos are routinely created and destroyed by the thousands every year—could not be used for medical research and would have to be thrown out instead.
Bush’s professed moral concern was bolstered by two factual assumptions. One was that there were more than sixty stem-cell lines available for research. Stem cells are “wild card” cells. They multiply and evolve into cells for specific purposes in the human body. A “line” is the result of a particular cell that has been “tweaked” and is multiplying in the laboratory. The hope is to develop lines of cells that can be put back into human beings and be counted on to evolve into replacements for missing or defective parts. The likeliest example is dopamine-producing brain cells for people with Parkinson’s disease. The dream is replacements for whole organs or even limbs. But each line is a crapshoot. So the more lines, the better. And it turns out that the number of useful lines is more like ten than sixty.
Bush also touted the possibility of harmlessly harvesting stem cells from adults. He said, “Therapies developed from adult stem cells are already helping suffering people.” This apparently referred to decades-old techniques such as removing some of a leukemia patient’s bone marrow and then reinjecting it after the patient has undergone radiation.
As for finding adult stem cells that could turn into unrelated body parts, that was just a dream two years ago, and now it is not even that. A new study, reported last week in Nature, concluded that when earlier studies thought they saw new specialized cells derived from adult stem cells, they were really seeing those adult cells bonding with pre-existing specialized cells. There’s hope in this bonding process, too—but not the hope researchers had for adult stem cells, and nothing like the hope they still have for embryonic stem cells. Since Bush’s speech, scientists have used embryonic stem cells to reverse the course of Parkinson’s in rats.
Put it all together, and the stem cells that can squeeze through Bush’s loopholes are far less promising than they seemed two years ago while the general promise of embryonic stem cells burns brighter than ever. If you claim to have made an anguished moral decision, and the factual basis for that decision turns out to be faulty, you ought to reconsider, or your claim to moral anguish looks phony. But Bush’s moral anguish was suspect from the beginning because the policy it produced makes no sense.
The week-old embryos used for stem-cell research are microscopic clumps of cells, unthinking and unknowing, with fewer physical human qualities than a mosquito. Fetal-tissue research has used brain cells from aborted fetuses, but this is not that. Week-old, lab-created embryos have no brain cells.
Furthermore, not a single embryo dies because of stem-cell research, which simply uses a tiny fraction of the embryos that live and die as a routine part of procedures at fertility clinics. And actual stem-cell therapy for real patients, if it is allowed to develop, will not even need these surplus embryos. Once a usable line is developed from an embryo, the cells for treatment can be developed in a laboratory.
None of this matters if you believe that a microscopic embryo is a human being with the same human rights as you and me. George W. Bush claims to believe that, and you have to believe something like that to justify your opposition to stem-cell research. But Bush cannot possibly believe that embryos are full human beings, or he would surely oppose modern fertility procedures that create and destroy many embryos for each baby they bring into the world. Bush does not oppose modern fertility treatments. He even praised them in his anti-stem-cell speech.
It’s not a complicated point. If stem-cell research is morally questionable, the procedures used in fertility clinics are worse. You cannot logically outlaw the one and praise the other. And surely logical coherence is a measure of moral sincerity.
If he’s got both his facts and his logic wrong—and he has—Bush’s alleged moral anguish on this subject is unimpressive. In fact, it is insulting to the people (including me) whose lives could be saved or redeemed by the medical breakthroughs Bush’s stem-cell policy is preventing.
This is not a policy disagreement. Or rather, it is not only a policy disagreement. If the president is not a complete moron—and he probably is not—he is a hardened cynic, staging moral anguish he does not feel, pandering to people he cannot possibly agree with, and sacrificing the future of many American citizens for short-term political advantage.
Is that a good enough reason to dislike him personally?
THE RELIGIOUS SUPERIORITY COMPLEX
Time, Nov. 3, 2003
“I knew that my god was bigger than his,” said Lieut. General William Boykin, Deputy Under Secretary of Defense for Intelligence, referring to a Somali warlord he once crossed swords with. The echo of a famous dog-food commercial was unintentional, we must hope. Presumably, Boykin’s God does not eat Ken-L Ration. But maybe Boykin does so himself, because he’s a mighty frisky fella.
Boykin was caught on videotape speaking to church groups and saying things like, “The enemy is a guy called Satan.” And, “They’re after us because we’re a Christian nation.” Now—like so many Christian soldiers before him, sent to distant lands to bring the pagans around to our point of view—he’s in hot water. Only this time it’s the forces of Western civilization, not the natives, who want him for lunch.
Among other problems, Boykin’s theo-babble muddies the waters of moral outrage over the latest rantings by the Prime Minister of Malaysia, Mahathir Mohamad, whose mouth is like a radio station where the anti-Semitic golden oldies never stop. Jews “rule the world by proxy.” They “invented socialism, communism, human rights and democracy so that persecuting them would appear to be wrong.” (Figure that one out!) The media don’t report his criticism of Muslims, he explained, because Jews control the media. And so on. In fact, until the last outburst, Mahathir got great press as a supposedly moderate Muslim leader, although his views on Jews are not new. He was President Bush’s poster boy for Muslim moderation.
Bush’s moral outrage at both Mahathir and Boykin has been oddly muted. He claims to have told Mahathir that saying Jews succeed at the expense of Muslims is “wrong and divisive.” Mahathir claims that Bush only apologized in private for having to criticize him in public, “unless my hearing is very bad.” Which, he tartly added, it isn’t. About Boykin, Bush said that the general’s remarks “didn’t reflect my opinion.” The Pentagon has begun an official investigation into Boykin’s remarks. What there is to investigate is a puzzle.
Everyone who gets caught in one of these ethno-controversies privately believes that he or she is being punished for having had the guts to tell a harsh truth. Any apology he or she coughs up, as Boykin did, only reinforces this feeling. No doubt even Mahathir has friends and sycophants who are telling him, “Mo, you’re just a victim of political correctness. What is this world coming to when a simple Prime Minister can’t say the Jews control everything without people making a ridiculous fuss?”
Bush ought to be furious at Boykin, because, until now, greater understanding and embrace of Islam have been real achievements of the Bush Administration. Even as America’s victory in the Iraq war turns to ash, Bush can take pride that Americans have a greater appreciation that Muslims and their religion add to the richness of our great ethnic stew. And without Bush’s special emphasis, the opposite might easily have happened.
At the same time, Bush has described the war on terror from the beginning in Manichaean terms not all that different from Boykin’s. “Today, our nation saw evil,” he said on Sept. 11, 2001. “Freedom and fear, justice and cruelty, have always been at war, and we know that God is not neutral between them,” he told the nation nine days later. Boykin may be understandably perplexed about what line he crossed by referring to evil as “a guy called Satan.”
As a devout believer, Boykin may also wonder why it is impermissible to say that the God you believe in is superior to the God you don’t believe in. I wonder this same thing as a nonbeliever: Doesn’t one religion’s gospel logically preclude the others’? (Except, of course, where they overlap with universal precepts, such as not murdering people, that even we nonbelievers can wrap our heads around.) Although Boykin’s version of Christianity seems less like monotheism than the star of a high school polytheism tournament, his basic point is that Christianity is right and Islam is wrong. Doesn’t the one imply the other? Pretending that my religion is no better than your religion may make for fewer religious wars, but it seems contrary to the very idea of religion. For this, you take a leap of faith?
Boykin’s mistake was to put all these pieces together, implying that Islam itself is not merely mistaken but evil. Talking like this while in a U.S. military uniform was also pretty tactless. Mahathir’s mistake, by contrast, was to open his trap at all.
ATTACK GEOGRAPHY
HEY, BUDDY, WHO DO YOU THINK YOU’RE CALLING “BUCOLIC”?
Slate, Nov. 20, 2003
Republicans have had a talent for geographical chauvinism since Nixon’s southern strategy. Wherever a Democratic candidate happens to be from, that place turns out to be isolated and unrepresentative and not part of the real America. They are having a good time at the moment dissing Vermont, home of former Gov. Howard Dean. It’s way up there in the Northeast somewhere. (Yeah, not too far south of the Bush family hangout in Maine.) It doesn’t have any black people. Its best-known product is some hippie ice cream. Worst of all, it’s (gasp!) “bucolic.”
Odd, but I don’t recall these points being made by any politician, Republican or Democrat, about New Hampshire, which is adjacent to Vermont. In the next few months, as always in election years, we will be hearing repeatedly about what a wonderful place New Hampshire is. Second only to Iowa. But, Vermont—now, that’s a different story.
In 1988, Republicans painted Massachusetts as a foreign country and Democratic candidate Michael Dukakis as an elitist, compared with that po’ boy from Texas, the elder George Bush. Massachusetts, to its credit, is a bit south of Vermont. On the other hand, it is full of universities. Need we say more?
When Bill Clinton emerged as Democratic front-runner in 1992, Republicans went to work denigrating Arkansas. “A failed governor of a small state,” was the sound bite summary. “Failed” was disputable, and disputed. “Small” was beyond dispute. But so what? “Mine is bigger than yours” is the subtext of a lot that goes on in politics, but getting all puffed up about the size of your state seems especially ridiculous. Should mothers in small states go back to their children and say, “Sorry, I was wrong. You can’t grow up to be president. Our state’s too small”?
The semiserious notion here is that the governor of a small state hasn’t done as much raw governing as the governor of a large state. If we’re measuring governorship by the inch, though, we had better note that George W. Bush was governor for six years, whereas Dean was governor for eleven years—almost twice as long. So, measuring a governorship in population years, Bush of Texas (population at the last census: 20.85 million) scores 125 million, whereas Dean of Vermont (population: 609,000) scores, um, 6.7 million. Well, OK, but measuring in square-mile years of governorship, Bush of Texas (268,000 square miles) scores 1.6 million, whereas Dean of Vermont (9,600 square miles) scores…gosh, it really is a small state, isn’t it?
Bush ought to sympathize with Dean a bit, though, because Dean is now getting the same grief that Bush got four years ago for the effrontery of being a governor at all. Governors have no foreign policy experience, it is said. How can they run for president when they’ve never been to Botswana? Senators, by contrast, know Washington. They know NATO. They may even know Botswana. But do they know how to run anything larger than their own offices? Even the state of Vermont is bigger than most congressional staffs, probably.
The experience of being president surely is more like the experience of running Vermont than it is like being, say, a member of the U.S. Senate. Senators, like journalists, enjoy jobs with a wonderful ratio of respectability to responsibility. There’s a lot of the first, not much of the second. You can huff and puff all day, people are inclined to take you seriously, but if you’re talking nonsense, no one gets hurt. Usually. A governor or a president, by contrast, makes decisions, and things tend to happen as a result. Usually. This can be disorienting and dangerous to a novice.
When they were going after Clinton, they portrayed Arkansas as the last place you would want your president from. Why? Well, it’s in the South—out of the American mainstream. It’s full of poor people. Everyone’s married to his cousin. They eat horrible, fatty lower-class foods. My dear, it’s Hicksville, plain and simple. Read your Faulkner—these people are sicko.
But now Vermont is in last place. Why? It’s in New England—out of the American mainstream. There aren’t enough poor people there. Everyone’s married to her girlfriend—or will be soon. They eat horrible, fatty upper-class cheese. And, of course, that hoity-toity ice cream. Man, it’s Snotsville. Read your Cheever or your John O’Hara—the guy really comes from Park Avenue after all. These people are wacko.
Dean does hail from New York City and state, which were still fairly large at the most recent census. But in the attack geography game, multiple locations subtract rather than add: having lived in two places makes Dean doubly isolated from his country. The GOP will be making hay out of Dean’s New York background, too. They will have a harder time of it, since they have chosen to hold their convention in New York next summer. This was a cynical decision, intended to provide a backdrop for yet one more presidential victory lap in the war on terrorism. The cynicism may have been premature. But does anyone remember 1992? (Answer: No, of course not. I proceed anyway.) In that year there was a lot of Republican sneeriness over the Democrats’ decision to hold their convention in New York. New York, it seemed, was not the real America. Urban. Ethnic. Noisy, crowded, dirty. The real America was…was…perhaps the word they were searching for back then was “bucolic.” Like Vermont. The Republicans went to Houston that year. Now, I ask you.
The appropriate sentiment at this point is that we all live in the real America and share…something. What a generation of attack geography actually demonstrates is that none of us lives in the real America. There is no part of the country that cannot be portrayed, with some accuracy, as a sealed society out of touch with the rest of the country. In fact, attack geography depends on exactly this mutual ignorance and disdain, wherever the Real America is decreed to be in a given election cycle.
But whoever thought “bucolic” would be a fighting word?
WHEN GOOD NEWS IS BAD NEWS
THE POLITICS OF MIXED EMOTIONS.
Slate, Dec. 18, 2003
It’s a familiar human predicament. Dear old Aunt Maude—dear, rich old Aunt Maude—has staged a remarkable recovery. The doctors say she could live another thirty years. You are delighted, of course. And yet you can’t help thinking about the money.
The Democratic presidential candidates woke up Sunday morning to learn that U.S. forces had captured Saddam Hussein. O joy! O joy! O ****! You cannot blame them for having mixed emotions. How long do you suppose each one spent relishing the good news for the world before dwelling on the bad news for their own political futures? And how long did President Bush spend savoring the boon for Iraqis before he started to savor the boon for his re-election campaign? It’s obviously less of a challenge, though, to be and/or appear sincere when good news for the world is good news for you personally.
How to deal with mixed emotions is a bigger challenge to politicians than to the rest of us for a couple of reasons. One is that being and/or appearing concerned for the greater good is a basic qualification for their jobs. You cannot win an election if the voters suspect that you are wishing them ill, gleeful when bad things happen, and disappointed when things turn out well. Even admitting to mixed feelings—which are only human—would kill your ambitions. And yet almost any development in the news during an election campaign creates a similar dilemma for everyone except the incumbent.
It’s the Law of Mixed Emotions: If you’re the challenger, what’s good for the voters is bad for you, and what’s bad for the voters is good for you. When the stock market goes down, it buoys you up. When the market goes up, it brings you down. Or at least part of you must feel this way. There may be politicians so pure of heart that such cynical thoughts never cross their minds. But I certainly would not want anyone so wonderful representing me. Nor would I trust such an angel to protect my interests in this cold world.
So if you’re a politician, how should you deal with good news that’s bad news? Howard Dean’s comments this week offer both a negative and a positive case study. He broke the most obvious rule: Pretend, at least, that you’re enjoying the party. Don’t stint or quibble. It may well be true, as Dean said, that capturing Saddam doesn’t make America any safer than if he’d been left cowering in his spider hole for the next fifty years. The Bush administration certainly had given us the impression until this week that it believed the important thing was toppling him from power, not running him physically into the ground. But that’s just cheap irony. Save it for the pundits. To be presidential, look gracious.
On the other hand, Dean won points in my book for another bit of straight talk. After calling Saddam’s capture “a great day” for the military, for Iraqis, and for Americans generally, he added that it was “frankly, a great day for the administration.” This is a rare example of a politician saying “frankly” and then saying something actually frank. It comes close to admitting the obvious: that this development helps Bush’s chance of winning next year’s election and therefore hurts Dean’s.
It’s a real mystery why politicians find it so hard to admit the obvious about the horse-race aspects of politics. No doubt it requires a dose of blind optimism to be a politician in the first place. Even Dennis Kucinich must think he has a 1-in-10,000 chance of becoming president, when his chance is actually much smaller. But there is also an annoying convention that you must pretend to a confidence you don’t feel. Anyone who doesn’t realize that this week’s news has been a big boost for Bush’s re-election is too stupid or self-deluded to be elected president. Yet the press will punish any candidate who says so, possibly because if the candidates take up stating the obvious, they’re stealing our material. The pols need to be coy and evasive so that we can tell it to you straight.
Virtually every Democratic candidate, including Dean, followed another puzzling convention of American politics by saying that the capture of Saddam was a reason, or at least an occasion, to draw in other nations. Their most common complaint about the war has been that it isn’t “multilateral.” It’s hard to see how this argument is affected one way or another by finding Saddam in a hole. (Well, this macho triumph by George W. seems to have cowed the Europeans and made them more cooperative—but that presumably was not the point his opponents were trying to make.)
Politicians reacting to a surprise in the news are always declaring that the unexpected development makes them believe even more deeply in the wisdom and urgency of whatever they have been saying all along. Bush inherited huge surpluses, and he called for a tax cut. Now there are huge deficits, and he calls for a tax cut.
Even when the cause-and-effect connection isn’t totally implausible, a politician taking this line usually looks silly. If you had asked John Kerry a week ago whether it was even possible for him to feel more strongly than he already did about the need for a more “globalized” approach to the situation in Iraq, he probably would have said it wasn’t possible. He felt very, very, very strongly about it last week. Now we are supposed to believe that he feels very, very, very, very strongly. Is there any development that could make him feel only very, very strongly—or even make him change his mind?