One of the reasons for the successes we’ve enjoyed in Afghanistan is that our ambassador there, who saw the country through the founding of a democratic government, was not just a serious thinker and a skilled diplomat but also spoke the language and understood the culture. Why? Because Zalmay Khalilzad is an Afghan-born Afghan American.
It is not every country that can send to obscure faraway places envoys who are themselves children of that culture. Indeed, Americans are the only people who can do that for practically every country.
Being mankind’s first-ever universal nation, to use Ben Wattenberg’s felicitous phrase for our highly integrated polyglot country, carries enormous advantage. In the shrunken world of the information age, we have significant populations of every ethnicity capable of making instant and deep connections—economic as well as diplomatic—with just about every foreign trouble spot, hothouse and economic dynamo on the planet.
That is a priceless and unique asset. It is true that other countries, particularly in Europe, have in the past several decades opened themselves up to immigration. But the real problem is not immigration but assimilation. Anyone can do immigration. But if you don’t assimilate the immigrants—France, for example, has vast, isolated exurban immigrant slums with populations totally alienated from the polity and the general culture—then immigration becomes not an asset but a liability.
America’s genius has always been assimilation, taking immigrants and turning them into Americans. Yet our current debates on immigration focus on only one side of the issue—the massive waves of illegal immigrants that we seem unable to stop.
The various plans, all well-intentioned, have an air of hopelessness about them. Amnesty of some sort seems reasonable because there is no way we’re going to expel 10 million–plus illegal immigrants, and we might as well make their lives more normal. But that will not stop further illegal immigration. In fact, it will encourage it, because every amnesty—and we have them periodically—tells potential illegals still in Mexico and elsewhere that if they persist long enough, they will get in, and if they stay here long enough, they can cut to the head of the line.
In the end, increased law enforcement, guest-worker programs and other incentives that encourage some of the illegals to go back home can go only so far. Which is why we should be devoting far more attention to the other half of the problem: not just how many come in but what happens to them once they’re here.
The anti-immigrant types argue that there is something unique about our mostly Latin immigration that makes it unassimilable. First, that there's simply too much of it to be digested. Actually, the percentage of foreign-born people living in the United States today is significantly below what it was in 1890 and 1910—and those were spectacularly successful immigrations. And there is nothing about their culture that makes it any more difficult for Catholic Hispanics to assimilate than it was for the Czechs and Hungarians, Chinese and Koreans who came decades ago.
The key to assimilation, of course, is language. The real threat to the United States is not immigration per se but bilingualism and, ultimately, biculturalism. Having grown up in Canada, where a language divide is a recurring source of friction and fracture, I can only wonder at those who want to duplicate that plague in the United States.
The good news, and the reason I am less panicked about illegal immigration than most, is that the vogue for bilingual education is waning. It has been abolished by referendum in California, Arizona and even Massachusetts.
As the results in California have shown, it was a disaster for Hispanic children. It delays assimilation by perhaps a full generation. Those in "English immersion" achieve English proficiency at more than twice the rate of those in the old bilingual system (being taught other subjects in Spanish while being gradually taught English).
By all means we should try to control immigration. Nonetheless, given our geography, our tolerant culture and the magnetic attraction of our economy, illegals will always be with us. Our first task, therefore, should be abolishing bilingual education everywhere and requiring that our citizenship tests have strict standards for English language and American civics.
The cure for excessive immigration is successful assimilation. The way to prevent European-like immigration catastrophes is to turn every immigrant—and most surely his children—into an American. Who might one day grow up to be our next Zalmay Khalilzad.
The Washington Post, June 17, 2005
One day’s news:
— A prominent Tory MP is killed by an IRA car bomb.
— Government soldiers in Liberia murder hundreds of refugees from a rival rebel tribe.
— A radical Islamic sect in Trinidad kidnaps the prime minister and his cabinet and holds them at gunpoint.
— American Indians demonstrate outside the Canadian Embassy in solidarity with Mohawks caught in a violent land dispute with Quebec (which itself is in an autonomy dispute with Canada).
What connects these unconnected events? The powerful, often ignored, global reality of tribalism. The Irish, Liberian and Trinidadian varieties are more violent, but Canada best illustrates its bewilderingly regressive nature. Canadian nationalism has long sought to distinguish itself from the United States; Quebec nationalism to distinguish itself from Canada; and now here come the Mohawks with their own claim of apartness from Quebec.
Tribes within nations within empires. The world is littered with such Chinese boxes, and they make perfect tinder for conflict. Nowhere more so than in the Soviet bloc, where the decline of communism has brought a revival of tribalism (most notably in Azerbaijan and Transylvania) as savage and primitive as seen anywhere.
What is all this to Americans? A lesson and a warning. America, alone among the multi-ethnic countries of the world, has managed to assimilate its citizenry into a common nationality. We are now doing our best to squander this great achievement.
Spain still has its Basque secessionists, France its Corsicans. Even Britain has the pull of Scottish and Welsh, to say nothing of Irish, nationalists. But America has, through painful experience, found a way to overcome its centrifugal forces.
American unity has been built on a tightly federalist politics and a powerful melting pot culture. Most important, America chose to deal with the problem of differentness (ethnicity) by embracing a radical individualism and rejecting the notion of group rights. Of course, there was one great, shameful historical exception: the denial of rights to blacks. When that was finally outlawed in the '60s, America appeared ready to resume its destiny, a destiny celebrated by Martin Luther King Jr. as the home of a true and now universal individualism.
Why is this a destiny to be celebrated? Because it works. Because while Spain and Canada, to say nothing of Liberia and Ireland, are wracked by separatism and tribal conflict, America has been largely spared. Its union is more secure than that of any multi-ethnic nation on earth.
We are now, however, in the process of throwing away this patrimony. Our great national achievement—fashioning a common citizenship and identity for a multi-ethnic, multi-lingual, multi-racial people—is now threatened by a process of relentless, deliberate Balkanization. The great engines of social life—the law, the schools, the arts—are systematically encouraging the division of America into racial, ethnic and gender separateness.
It began with the courts, which legitimized the allocation of jobs, government contracts, admission to medical school and now TV licenses by race, gender and ethnic group.
Then education. First Stanford capitulated to separatist know-nothings and abandoned its “Western Civilization” course because of its bias toward white males. (You know: narrow-minded ethnics like Socrates, Jesus and Jefferson.) Now the push is to start kids much earlier on the road to intellectual separatism. Grade school, for example.
A proposed revision of New York State’s school curriculum to rid it of “Eurocentric” bias is so clearly an attempt at “ethnic cheerleading on the demand of pressure groups” that historians Diane Ravitch, Arthur Schlesinger Jr., C. Vann Woodward, Robert Caro and 20 others were moved to issue a joint protest. Despite their considerable ideological differences, they joined to oppose the “use of the school system to promote the division of our people into antagonistic racial groups.”
Even the arts have been conscripted into the separatist crusade. “Both the Rockefeller and Ford foundations,” writes Samuel Lipman in Commentary, “intend to downgrade and even eliminate support for art based on traditional European sources and instead will encourage activity by certain approved minorities.”
Countries struggling to transcend their tribal separateness have long looked to America as their model. Now, however, America is going backward. While the great multi-ethnic states try desperately to imbue their people with a sense of shared national identity, the great American institutions, from the courts to the foundations, are promoting group identity instead.
Without ever having thought it through, we are engaged in unmaking the American union and encouraging the very tribalism that is the bane of the modern world.
The Washington Post, August 6, 1990
Last week the Washington Post reported on two Asian American children who were denied transfer from one local public school to another because of their race. The school board denied them entry to the second school, with its unique French immersion program, because the first school had only 11 Asian students. To keep their numbers up, no Asians were permitted to transfer out.
This is by no means a unique case. A white parent, writing just a few days later in the Washington Post, described the agonies of trying to transfer his adopted Korean-born child, a transfer denied because of its “impact on diversity,” as the school board’s rejection letter memorably explained.
Diversity is now the great successor to affirmative action as the justification for counting and assigning by race in America. First, diversity sounds more benign. Affirmative action has acquired a bad reputation because it implies the unfair advancement of one group over another. Diversity cheerfully promises nothing more than making every corner of America “look like America.”
Moreover, diversity is a blunter instrument. Affirmative action requires an inquiry into history and justifies itself as redress for past injustice. Diversity is much simpler. It does not even try to justify itself by appeal to justice or some other value. It is an end in itself. It requires no demonstration of historical wrong, only of current racial imbalance. Too few Asians at Rock Creek Forest Elementary School? Fine. No Asian will be allowed to leave.
Because of its blithe disregard of anything—individual rights, common citizenship, past injustice—except racial numbers, the appeal to diversity represents the ultimate degeneration of the idea of counting by race. At its beginnings, affirmative action was deeply morally rooted as an attempt to redress centuries of discrimination against blacks. Yes, affirmative action did violate the principle of judging people as individuals and not by group. But it did so in the name of another high moral principle: the redress of grievous, gratuitous harm inflicted on one group because of its race.
Had affirmative action remained restricted to African Americans and to the redress of past discrimination, it would still command support in the country today. Instead it has been stretched, diluted and corrupted beyond recognition, transmuted from redress for blacks—a case of massive, official, unique injustice—to diversity for all, except, of course, white males.
By what principle should government preferentially award a contract to, say, the newly arrived son of an Argentinian businessman over a native-born American white? None. Diversity alone, in and of itself, is invoked to justify such a travesty.
Diversity, drawing on no moral argument, is morally bankrupt. It draws only on a new form of American utopianism, a multi-hued variant of an older Norman Rockwell utopianism, in which in every walk of American life, race and ethnicity are represented in exactly correct proportions.
Like all utopianisms, this one is divorced from reality. It is entirely cockeyed to expect different groups to gravitate with strict proportionality to every school, workplace and neighborhood in America.
And when they don't, this utopianism partakes of the brutality of all utopianisms and forces the fit. Individuals who obstruct the quest for the perfect post-Rockwell tableau, beware.
The kindergartners denied entry to the French program at the Maryvale Elementary School in Rockville constitute such an obstruction. Yet they are hardly the most deeply aggrieved parties. That honor belongs to California’s Asian American high school graduates who, alas, have excelled disproportionately in school and thus threaten to overwhelm California’s best colleges.
Everyone knows that there is an unspoken quota system in the California universities and in other schools around the country that keeps Asians out because of their race. How does this shameful practice differ from the exclusion of similarly gifted Jews during the '30s and '40s? Perhaps only in the hypocrisy of those defending the practice. In the old days, the justification for anti-Jewish quotas was simple antipathy toward "pushy" Jews. Today, the justification for excluding "nerdy" Asians is more highfalutin: They are an impediment to diversity.
Proponents of these appalling classifications by race prefer, of course, to pretend that they are about such grand notions as culture. Nonsense. As the white father of the untransferable Asian schoolkid notes, “I couldn’t help wonder what cultural contribution my son could make [as an Asian]—he was just five months old when he left Korea.” These quotas are not about culture. They are about skin color, eye shape and hair texture.
One stymied Asian American mother, desperate for a loophole, tried having her child reclassified as white because the father is white. More parents will seek such solutions. How shall we adjudicate these vexing questions of mixed blood?
Turn to the source—the modern state that produced the most exquisitely developed system of race classification. The unemployed justices who enforced the Group Areas Act of apartheid South Africa may finally find gainful work again.
The Washington Post, September 1, 1995
In regard to the (Washington) Redskins. Should the name be changed?
I don’t like the language police ensuring that no one anywhere gives offense to anyone about anything. And I fully credit the claim of Redskins owner Dan Snyder and many passionate fans that they intend no malice or prejudice and that “Redskins” has a proud 80-year history they wish to maintain.
The fact is, however, that words don’t stand still. They evolve.
Fifty years ago the preferred, most respectful term for African Americans was Negro. The word appears 15 times in Martin Luther King’s “I have a dream” speech. Negro replaced a long list of insulting words in common use during decades of public and legal discrimination.
And then, for complicated historical reasons (having to do with the black power and “black is beautiful” movements), usage changed. The preferred term is now black or African American. With a rare few legacy exceptions, such as the United Negro College Fund, Negro carries an unmistakably patronizing and demeaning tone.
If you were detailing the racial composition of Congress, you wouldn’t say: “Well, to start with, there are 44 Negroes.” If you’d been asleep for 50 years, you might. But upon being informed how the word had changed in nuance, you would stop using it and choose another.
And here’s the key point: You would stop not because of the language police. Not because you might incur a presidential rebuke. But simply because the word was tainted, freighted with negative connotations with which you would not want to be associated.
Proof? You wouldn’t even use the word in private, where being harassed for political incorrectness is not an issue.
Similarly, regarding the further racial breakdown of Congress, you wouldn’t say: “And by my count, there are two redskins.” It’s inconceivable, because no matter how the word was used 80 years ago, it carries invidious connotations today.
I know there are surveys that say that most Native Americans aren’t bothered by the word. But that’s not the point. My objection is not rooted in pressure from various minorities or fear of public polls or public scolds.
When I was growing up, I thought “gyp” was simply a synonym for “cheat,” and used it accordingly. It was only when I was an adult that I learned that gyp was short for gypsy. At which point, I stopped using it.
Not because I took a poll of Roma to find out if they were offended. If some mysterious disease had carried away every gypsy on the planet, and there were none left to offend, I still wouldn’t use it.
Why? Simple decency. I wouldn’t want to use a word that defines a people—living or dead, offended or not—in a most demeaning way. It’s a question not of who or how many had their feelings hurt, but of whether you want to associate yourself with a word that, for whatever historical reason having nothing to do with you, carries inherently derogatory connotations.
Years ago, the word “retarded” emerged as the enlightened substitute for such cruel terms as “feeble-minded” or “mongoloid.” Today, however, it is considered a form of denigration, having been replaced by the clumsy but now conventional “developmentally disabled.” There is no particular logic to this evolution. But it’s a social fact. Unless you’re looking to give gratuitous offense, you don’t call someone “retarded.”
Let’s recognize that there are many people of good will for whom “Washington Redskins” contains sentimental and historical attachment—and not an ounce of intended animus. So let’s turn down the temperature. What’s at issue is not high principle but adaptation to a change in linguistic nuance. A close call, though I personally would err on the side of not using the word if others are available.
How about Skins, a contraction already applied to the Washington football team? And that carries a sports connotation, as in skins vs. shirts in pickup basketball.
Choose whatever name you like. But let’s go easy on the other side. We’re not talking Brown v. Board of Education here. There’s no demand that Native Americans man the team’s offensive line. This is a matter of usage—and usage changes. If you shot a remake of 1934’s The Gay Divorcee, you’d have to change that title too.
Not because the lady changed but because the word did.
Hail Skins.
The Washington Post, October 18, 2013
Early in their training in cinematic conventions, kids learn the rule of thumb for sorting out good guys from bad guys: The good-looking guy is good, and the bad-looking guy is bad. Indeed, if the guy is positively ugly, he is the likely villain. And if he has something visibly wrong with him—a limp, a scar—he’ll be an especially cruel one.
Of course, Hollywood did not invent this cultural convention. It is a tradition that goes back at least as far as Richard III, whose “deformed, unfinish’d…half made up” body—a hunchback, a limp—prefigured the disfigurement of his soul.
Hollywood, manufacturer of both dreams and nightmares, has always been of two minds about how to portray those who, like Richard III, are "rudely stamp'd." It has settled on one of two stereotypical responses: sentimentalize or demonize.
The sentimentalizing you have seen often enough: those sickly sweet movies in which the hero’s physical impairment is a window onto a higher, purer, more spiritual plane of being. The perfect vehicle for this kind of schmaltz is the blind hero who invariably sees deeper and farther for having been loosed from the bonds of the physical world. The Elephant Man, Simon Birch and Regarding Henry (lawyer gets shot in head, becomes good person) are recent variations on the theme.
Alternating with this Hallmark beatification of disability is the more sinister convention of associating it with villainy. James Bond films are notable for adorning the villain with, say, hooks for hands or steel for teeth. In Batman, disfigurement in a vat of acid transforms Jack Nicholson’s small-time hood into evil incarnate. Even The Lion King cannot resist the convention: The bad lion is called Scar, and sports one.
A minor classic of the genre is The Sting. That otherwise delightful 1973 Redford-Newman vehicle felt it necessary to give Redford’s nemesis (a mobster played by Robert Shaw) a limp. This is no noble wound like Captain Ahab’s. It is an incidental piece of the landscape, never explained because one does not have to. Villains limp; Redford doesn’t.
But if The Sting offers a rather subliminal link between inner and outer defectiveness, the just-opened, wildly popular Wild, Wild West—a Will Smith vehicle otherwise too silly to merit notice—dispenses with the subtlety, offering up the most extreme and revolting example of this convention in recent memory.
West features two villains, both embittered, crazed Confederate officers now getting their post–Civil War revenge on the Union. They’re notable because the minor baddy is missing an ear and has a disfigured face, while the major baddy, played with creepy gusto by Kenneth Branagh, is missing the entire lower half of his body. With so much more missing, you know he is so much more psychotically evil than his merely earless friend.
Much fun is had with Branagh’s half body, none of it funny, much of it cruel. Yet with the P.C. police so outraged at the alleged racism of George Lucas’ new Star Wars—going so far as to locate, ridiculously, a Yiddish accent in Watto, the slave-owning merchant—it is rather odd that nothing has been said about the savage mockery of physical deformity in West, a blockbuster hit aimed squarely at kids.
What makes it odder still is that this is the same Hollywood that routinely gives teary standing ovations every time Christopher Reeve makes an appearance at some awards ceremony. It is the same culture that falls over itself in soppy sentimental tributes to the “inspiration” that emanates from the disabled.
Or maybe it is not so odd. The whole politically correct vogue for paying tribute to the “courage” and higher powers of the disabled—and of acquiescing to such comic linguistic conventions as calling the disabled “differently abled,” as if those of us in wheelchairs have chosen some alternative lifestyle—is, in the end, a form of condescension.
To be sure, patronizing the disabled is not as offensive as the in-your-face mockery of something like West. But its effect is similar: to distance oneself, to give expression to the reflexive mixture of fear and pity that misfortune in others evokes in all of us.
Disability—like exile, the human condition it most resembles—neither ennobles nor degrades. It frames experience. It does not define it.
But that undramatic reality is hardly grist for Hollywood, which specializes in, and swings wildly between, fawning idealization and primitive caricature of disability. It is saint or sicko—now (as ever) playing at a theater near you.
The Washington Post, July 9, 1999
After a massacre like the one at Emanuel AME Church in Charleston, our immediate reaction is to do something. Something, for politicians, means legislation. And for Democratic politicians, this means gun control.
It’s the all-purpose, go-to, knee-jerk solution. Within hours of the massacre, President Obama was lamenting the absence of progress on gun control. A particular Democratic (and media) lament was Congress’ failure to pass anything after Sandy Hook.
But the unfortunate fact is that the post–Sandy Hook legislation would have had zero effect on the events in Charleston. Its main provisions had to do with assault weapons; the (alleged) shooter Dylann Roof was using a semiautomatic pistol.
You can pass any gun law you want. The 1994 assault weapons ban was allowed to expire after 10 years because, as a Justice Department study showed, it had no effect. There’s only one gun law that would make a difference: confiscation. Everything else is for show.
And in this country, confiscation is impossible. Constitutionally, because of the Second Amendment. Politically, because doing so would cause something of an insurrection. And culturally, because Americans cherish—cling to, as Obama once had it—their guns as a symbol of freedom. You can largely ban guns in Canada where the founding document gives the purpose of confederation as the achievement of “peace, order and good government.” Harder to disarm a nation whose founding purpose is “life, liberty and the pursuit of happiness.”
With gun control going nowhere, the psychic national need post-Charleston to nonetheless do something took a remarkable direction: banishment of the Confederate battle flag, starting with the one flying on the grounds of the statehouse in Columbia, then spreading like wildfire to consume Confederate flags, symbols, statues and even memorabilia everywhere—from the Alabama state capitol to Walmart and Amazon.
Logically, the connection is tenuous. Yes, Roof does pose with the Confederate flag, among other symbols of racism, on his website. But does anyone imagine that if the South Carolina flag had been relegated to a museum, the massacre would not have occurred?
Politically, the killings created a unique moment. Governor Nikki Haley was surely sincere in calling for the Confederate flag’s removal. But she also understood that the massacre had created a moment when the usual pro–Confederate flag feeling—and, surely, expressions of it—would be largely suppressed, presenting the opportunity to achieve something otherwise politically unachievable.
But there’s a deeper reason for this rush to banish Confederate symbols, to move them from the public square to the museum. The trigger was not just the massacre itself, but even more tellingly the breathtaking display of nobility and spiritual generosity by the victims’ relatives. Within 48 hours of the murder of their loved ones, they spoke of redemption and reconciliation and even forgiveness of the killer himself. It was an astonishingly moving expression of Christian charity.
Such grace demands a response. In a fascinating dynamic, it created a feeling of moral obligation to reciprocate in some way. The flag was not material to the crime itself, but its connection to the underlying race history behind the crime suggested that its removal from the statehouse grounds—whatever the endlessly debated merits of the case—could serve as a reciprocal gesture of reconciliation.
The result was a microcosm of—and a historical lesson in—the moral force of the original civil rights movement, whose genius was to understand the effect that combating evil with good, violence with grace, would have on a fundamentally decent American nation.
America was indeed moved. The result was the civil rights acts. The issue today is no longer legal equality. It is more a matter of sorting through historical memory.
The Confederate flags would ultimately have come down. That is a good thing. They are now coming down in a rush. The haste may turn out to be problematic.
We will probably overshoot, as we are wont to do, in the stampede to eliminate every relic of the Confederacy. Not every statue has to be smashed, not every memory banished. Perhaps we can learn a lesson from Arlington National Cemetery, founded by the victorious Union to bury its dead. There you will find Section 16. It contains the remains of hundreds of Confederate soldiers grouped around a modest, moving monument to their devotion to “duty as they understood it”—a gesture by the Union of soldierly respect, without any concession regarding the taintedness of their cause.
Or shall we uproot them as well?
The Washington Post, June 26, 2015