CHAPTER 4
The Shrunken Citizen
Elmer Kelton, perhaps our greatest Western novelist, is best remembered for his tales of the early Texas frontier. In my opinion his most compelling work is The Time It Never Rained, a saga of the devastating West Texas drought of the 1950s.
The book’s protagonist, rancher Charlie Flagg, differs from and eventually angers his neighbors by his obstinate refusal to accept government assistance, even as the drought destroys everything he has worked a lifetime to build.
In one scene, his friend Page Mauldin expresses exasperation at Charlie’s unwillingness to accept a government handout even though everyone else has their hands wide open. Charlie replies, “I’ll spend my own money if I spend any at all.”
When Page tries to reason with him, explaining that he might as well try and get some of his tax money back rather than let someone else run off with it, Charlie is unmoved. "If you get to dependin' on the government, the day'll come when the damn federales will dictate everything you do. Some desk clerk in Washington will decide where you live and where you work and what color toilet paper you wipe yourself with. And you'll be scared to say anything because they might cut you off of the tit."
It would be hard to find a Charlie Flagg these days. Over the past several decades there has been a push in Washington to expand the scope of federal power and hand out government subsidies to as many Americans as possible. Some of these efforts have been curtailed over the years, most notably through welfare reform in the 1990s. But the push has continued and has expanded to include subsidies for working Americans through health care benefits, subsidies for buying a home, and refundable tax credits. The Obama administration even created, as part of its stimulus package, a tax credit called "Making Work Pay," which targeted lower-income Americans but benefited couples making as much as $190,000 a year. The result of this push has been a string of government deficits and, even though we may laugh at Charlie's suggestion that the government might dictate the color of toilet paper, an explosion in federal mandates that impact our daily lives. Indeed, recent years have produced exactly the kind of superfluous legislation Charlie essentially predicted. Just imagine, for example, drought-stricken Charlie's reaction to the 2010 EPA rules that attempt to control the amount of dust farmers are permitted to create on their own land. Or his reaction to mandates that dictate how many gallons of water a toilet can flush or the type of light bulb one can buy.
If Charlie foresaw the shape of things to come, he wasn't the first. Both the Skeptics and the Founders predicted that democracy's demise would most likely come not from runaway debt, dangerous as that would be, but from an erosion of the character traits (they always said "virtues") that give rise to liberty in the first place. The founding generation worried from the outset that what Lincoln later termed the "mystic chords of memory" would fade away, that a free people would come to forget what freedom requires, or what life without it would be like.
Madison said, “To suppose that any government will secure liberty or happiness without any virtue in the people, is a chimerical idea.” Franklin concurred, saying, “Only a virtuous people are capable of freedom. As nations become more corrupt and vicious, they have more need of masters.” Patrick Henry was the most concise: “Bad men cannot make good citizens.”
Many of today’s Americans are tone-deaf to the “chords of memory.” In fact, it may be more accurate to say that they’ve never even heard them. Stunning percentages of us know almost nothing of the nation’s history, its principles, or of what is sacrificed when security and subservience become prized above personal independence.
No one has echoed the Founders more eloquently than our greatest modern president, Ronald Reagan. In his farewell address, he said, “If we forget what we did, we won’t know who we are. I’m warning of an eradication of the American memory that could result, ultimately, in an erosion of the American spirit.” As usual, no one has said it better.
Launching a new organization aimed at civics education, former Supreme Court justice Sandra Day O’Connor said, “Barely one-third of today’s Americans can name the three branches of government, much less say what they do.... Less can say what the Declaration of Independence is, and it’s right there in the title. I’m worried.”
Judging from tests of our young people, the trend line is heading down, not up. On the National Assessment of Educational Progress (NAEP), the best national measure we have, less than a quarter of students are proficient in civics and American history. Sadly typical is the high school junior who in 1995 was surprised to learn that the school he attended, Grover Cleveland High School in Los Angeles, was named after a former U.S. president: “I always thought it was named after that city in Canada.”
And it's not as if things are getting better. In June 2011, the latest round of NAEP results revealed, yet again, that young Americans have an inadequate understanding of their own history.
Further reinforcing this distressing decline, a recent Newsweek survey revealed that the vast majority of Americans do not know why we fought the cold war, can’t name the incumbent vice president, and are unable to even define the Bill of Rights.
In recent years, Arizona has figured prominently in the national debate over illegal immigration. Among other concerns, many in that debate worry about the erosion of national culture and patriotism that a huge, unassimilated ethnic group might cause. Ironically, when Arizona's own high school students were given ten basic questions from the U.S. citizenship exam administered to legal immigrants, only 3.5 percent of them passed.
Findings like these are not new. In 1994 longtime Harper’s editor Lewis Lapham memorably described the NAEP scores of that year as “a coroner’s report [that] returned a finding of mortal ignorance.” Scores have declined since then.
The public education system bears most of the blame for this abysmal civic ignorance. This failure may well be its greatest disservice to our nation, outweighing even the more publicized disaster of students who cannot read, compute, or grasp rudimentary scientific concepts.
Entire volumes have been written on these shortcomings. SAT scores have dropped almost linearly since 1972. U.S. students trail badly in every international comparison, particularly in the critical subjects of math and science. Recent marginal improvements offer no real comfort. Indiana ranks generally in the middle among American states; as I pointed out to our General Assembly, “At the recent rate of improvement, it would take 21 years for us to catch Slovenia, and that’s if Slovenia stands still.” It is an iron law of the modern economy that a nation so poorly educated will fall behind in its standard of living. In 1983 the groundbreaking education study titled “A Nation at Risk” stunned the country with the stark assessments that “[T]he educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a Nation and as a people.... If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war.” Now, decades into the education reform movement, almost nothing has been accomplished, although at long last some structural change has begun.
It’s easy to blame the failures of our public education system on bad teachers, overcrowded classrooms, and inadequate resources, but the root of the problem lies within the system’s own bureaucracies, in particular, the government unions such as the National Education Association and the American Federation of Teachers. The adults who hijacked education for their own advantage did not intend for our young people to suffer as a consequence, but it was inevitable that they would suffer once the unions sought to protect incompetent teachers, principals, and superintendents against any accountability for producing well-prepared students.
The second disaster visited on our educational system was more intentional, and the danger it poses is just as great: the content of schooling now does nothing to foster a sense of national mission, pride, or personal responsibility. The tragic civic ignorance we see today represents a purposeful reversal of the entire basis on which public education came into being in the first place.
Public education was invented with the goal of producing citizens endowed with the knowledge and character traits needed to govern themselves responsibly and without the corrective supervision of despots. It was created explicitly to promote knowledge of the founding principles, the heroic figures who forged and preserved our republic, and an overt patriotism for the nation.
Jefferson was an outspoken education advocate and recommended the study of history as a means of improving “moral and civic virtues”—there’s that word again—all in the service of effective self-government. He warned, “If a nation expects to be ignorant and free . . . it expects what never was and never will be.” In today’s United States we have people who appear willing to bet that Jefferson was wrong.
Until the 1840s, education was privately provided, usually by religious denominations, and was therefore usually available only to the wealthy. Reformers such as Horace Mann argued for public provision of education specifically on the basis that it would produce better citizens. Public schools, as Pennsylvania’s Walter Johnson put it, were essential in order “to give every member of American society a portion of knowledge adequate to the discharge of his duties as a man and a citizen of the Republic.”
And yet, in 2010, the North Carolina State Department of Education recommended that eleventh-graders stop studying U.S. history before 1877.
Their idea of a high school history course would cover "international relations, the struggle for human rights, and understanding how changes in the physical environment have impacted American society," whatever any of that means. The recommendations went on to say, "Major emphasis will be given to political and social movements, and students will be asked to critique the outcomes of major political decisions." Mercifully, a public outcry is forcing those standards to be revised.
It would be bad enough if today’s students were merely left unaware of the greatness and superiority of the free institutions America has brought to the world. In reality, they are more often taught the converse: that we are a deeply flawed nation, with few if any past heroes to look up to, and no cause to view ourselves as any better than any other country or culture. Scan any modern civics textbook for clear statements that free citizens have a responsibility to take care of themselves whenever possible and to demand the right to personal autonomy, and you will most likely find none. Another search for any intimation that a government by the people must of necessity be limited in its scope, having no power that is not expressly enumerated, will most likely turn up the same result.
What is easy to find is denigration of our institutions and our past. An especially atrocious example comes in the form of A People's History of the United States, a textbook drafted by an avowed Marxist, the late Howard Zinn. Since 1980 the book has sold more than a million copies and has found its way into numerous high school classrooms. It offers a decidedly slanted version of American history that downplays the triumphs of our Founders and free systems. In 2010 the release of FBI files detailed Zinn's extensive connections to radical leftists and communist organizations, dating back to investigations of him that began in 1949. Picking up on the released files, Daniel J. Flynn wrote an article for City Journal that noted a quote Zinn gave back in 1998. "I wanted my writing of history and my teaching of history to be part of social struggle," he told the Revolutionary Worker. "I wanted to be part of history and not just a recorder and teacher of history. So that kind of attitude towards history, and not just as a political act, has always informed my writing and my teaching."
A recent review of world history textbooks in widespread use found extensive treatment of the internment of Japanese Americans in World War II but virtually nothing about Japanese aggression, or about Japanese atrocities committed against prisoners of war and civilian populations.
At least in its official, organized form, the teaching profession seems quite comfortable with this rewritten history. The American Historical Association, which weighed in recently on a new history curriculum being drafted in Texas and can influence what ends up in new textbooks, never led a push to oust Zinn's text from classrooms. Likewise, the NEA has inserted itself into a wide variety of left-wing causes over the past decades, but hasn't come to the defense of teaching the virtues of America's free institutions. Indeed, the NEA, which recently spent more than $2 million on a ballot initiative aimed at raising corporate taxes in California, has donated money to the Sierra Club and to ACORN (the since-disgraced and disbanded left-wing community activist group), and has been a stalwart opponent of education reforms aimed at raising the quality of education through testing, vouchers, and charter schools. It has even opposed education reforms, such as merit pay, that are supported by President Obama. In a free society, such organizations are well within their rights to wage political campaigns, but we shouldn't pretend that such campaigns are entirely motivated by what's best for the children, or that they have no impact on the individual rights the rest of us enjoy.
As American citizens, we have never encountered such a massively intrusive state as the federal government of today. We have never been so poorly equipped to detect its creeping (or, in the age of Obama, leaping) encroachments on our traditional liberties, because so few of us remember that we are supposed to possess such liberty in the first place. We are told time and again that the government knows what is best for us, and we are thus resigned and acquiescent in the face of losses to our freedom and personal dignity.
Consider the stoic acceptance of the unprecedented inconvenience and privacy intrusions of the post-9/11 bureaucracy known as the Transportation Security Administration (TSA). Starting with hour-long waiting lines to pass through metal detectors and other scanners, the TSA lumbered on mindlessly, as bureaucracies do, until it was practically molesting plainly innocent and harmless passengers who just wanted to catch a flight home for Thanksgiving.
Columnist Charles Krauthammer, with his customary incisiveness, pointed out, “[E]veryone knows the entire apparatus of the security line is a national homage to political correctness.... This has nothing to do with safety—95 per cent of these inspections, searches, shoe removals, and pat-downs are ridiculously unnecessary. The only reason we continue to do this is that people are too cowed to even question the absurd taboo against profiling—when the profile of the airline attacker is narrow, concrete, uniquely definable and universally known.”
The decline in our grasp of freedom's requirements has been paralleled by a sharp increase in the degree to which Americans are dependent on government, which is to say, subsidized at the expense of their neighbors. A Census report that looked at American households in 2008—before the more recent spike in unemployment—found that a surprising and growing number of Americans now receive some form of public assistance. At the time, 28.4 million households, or one out of every four in the country, were receiving checks through welfare programs. This was up from 16 percent in 1984. Add in social insurance programs, such as Social Security, Medicare, and Medicaid, and the number of households with at least one person receiving government assistance was 45 percent.
During the 2008–9 recession, and well beyond the point at which it technically ended, unprecedented amounts of unemployment compensation were distributed to those out of work. Presumably everyone can agree that a system of temporary assistance is the right thing in bad economic times, and unemployment insurance (UI) has been a fixture of our system for decades, seeing us through a host of previous recessions.
But the latest round was different. UI had historically been offered for twenty-six weeks, with an option in the law for extension to fifty-two weeks. Between 2008 and 2010, benefits were eventually lengthened to ninety-nine weeks, essentially two years. The obvious question: Where does it end? Are we in favor of permanent payments to those not working?
Employers and individual Hoosiers related to me on countless occasions that UI recipients were rejecting employment offers, saying that "UI is a better deal." My concern is deeper than that. Companies that have laid off employees in the recession won't necessarily hire them back once the good times return. Many of them are undergoing a restructuring that may lead them to hire new workers in the future for different types of jobs than they hired for in the past. Similarly, American workers have to make hard decisions to seek retraining or otherwise restructure their lives in order to make themselves eligible for new jobs in the future. Extending unemployment benefits for years may be a compassionate thing to do in one sense, but in another sense it creates an entitlement mentality and an incentive for workers not to restructure their lives. These programs offer benefits but do not require workers to make themselves ready to take advantage of new opportunities down the road.
The hundreds of billions paid out in UI benefits were also charged to the very businesses we hoped would start hiring again. Having already been crippled by one of the worst economic downturns in recent history, these companies found it even more difficult to recover under this added pressure. As odd as it may seem, at some point the worthy and compassionate programs we have put in place become economically self-defeating. But the worst defeat suffered at the hands of this system is that of our citizens’ personal responsibility and dignity.
American “Entitlement”
Maybe solving our debt problem starts with vocabulary. As the late Daniel Patrick Moynihan once said, he who controls the dictionary controls the debate. I have come to disagree with the word we use to describe Social Security, Medicare, Medicaid, and similar programs: entitlements.
When one stops to think about it, an American is entitled to life, liberty, and the pursuit of happiness, period. We are entitled to be safe in our homes, and to the freedom to live our own lives, and make of ourselves whatever our talents and hard work can produce. That’s what we’re entitled to from our government, and our Constitution.
Everything government does after that is a matter of choices we make as a caring and compassionate people. When we decide together that some of us will contribute some of our substance so that others may have good health care, or avoid destitution in their older years, those are good things we opt to do for one another, not something anyone is entitled to. These programs got their name because of their design, which says that anyone who meets a certain description can access them by definition. But those definitions are not in the Constitution; nor are they on the Supreme Court cornerstone. They were established as matters of choice by the American people through their elected representatives, and they are subject to change in the same way.
The comedian George Carlin once asked, “What’s another good word for ‘synonym’?” We need one in this situation. We should retire the misleading term entitlement in favor of safety net or social welfare or some other noun that avoids implying that any citizen of our nation, apart from the conscious generosity of his fellow Americans, automatically has a claim on one dime of his neighbor’s property. The notion that he does is not only a prescription for national poverty, but also an affront to the freedom and the dignity of each one of us.
Neither Charlie Flagg nor his creator, Elmer Kelton, who wrote his book in 1973, could have conceived the extent of the dependence we now have on government. But they saw clearly the effect such dependence would have on the character of those receiving it. In the book, when another acquaintance of his tells Charlie that he should accept government handouts for the sake of everyone else, Charlie consoles him: “[T]hey won’t cut off your aid. They’re too busy buyin’ your vote with it.” In Charlie Flagg’s view, when a person sells his vote for a government transfer, he throws in an unconscious gratuity: a little bit of his own dignity and self-reliance. He becomes a shrunken citizen.
A Dollar Less Free
There is a game I like to play whenever I visit high school classrooms. I ask if anyone has a dollar bill handy, and when some unsuspecting youngster produces one and hands it to me, I thank him, stick it in my pocket, and continue talking. As the giggles and tittering build, I pause theatrically as if to say, “What?!” and when I “realize” that young Brandon wants his dollar back, I make the following point:
“Please note,” I say, “that Brandon is now a dollar less free than he was a minute ago. If he still had that dollar bill, he could decide what to do with it. Now I get to decide. And that’s why we need to be so careful before we take away Brandon’s money, or yours, through the coercive power of taxation. If we really value personal freedom the way we say we do, then we should never take a dollar away from a free citizen without a very good reason. And then we have a solemn duty to spend it as wisely and effectively as possible. Otherwise, we should never have taken it in the first place.”
Of course I give Brandon his dollar back, but I also leave the class with a lesson that, I can tell, they have never heard before.
Unfortunately, when we talk about government, we’re talking about trillions of dollars, not just one. Over the past four decades, federal spending per household has grown by more than 160 percent (see Figure 6), putting us deeper into the pockets of our Benevolent Betters.
In case our increasing dependence on cash from "government" isn't enough to convince us how much we need our Betters, regular reinforcement of a sense of victimhood is there to help. Consider the way in which business, and business leaders, are depicted.
Figure 6 | Federal Spending per Household
Source: Heritage Foundation Calculations based on data from the U.S. Census Bureau, White House Office of Management and Budget, and Congressional Budget Office.
Among the many things today’s young people are unlikely to learn either at school or in front of a television or movie screen is where wealth comes from. Even if they are lucky enough to be exposed to some sort of basic economics class, they probably won’t be told that the creation of wealth is among life’s highest and most important achievements. One can search K–12 textbooks at great length before finding any laudatory comments about those who built America’s great businesses or the industries that opened the way for millions of impoverished citizens to build a better life for themselves. And the chances of hearing anything positive after a student gets to college are even more remote.
Determining bias is, of course, a subjective endeavor. However, there has been enough research done on the political slant on college campuses for us to conclude with confidence that many students are offered a one-sided view of the world when in college. Consider the results dug up by the Center for the Study of Popular Culture in 2003. In a study that looked at the political affiliation of faculty and administration members at 32 elite colleges, the center found that the ratio of Democrats to Republicans was decidedly lopsided. The overall ratio was more than 10 to 1 (1,397 Democrats to 134 Republicans), while at some schools the ratio was much wider. At Bowdoin College, for example, the ratio was 23 to 1. At Brown it was 30 to 1. At Williams, Oberlin, Haverford, and the Massachusetts Institute of Technology, not a single Republican could be identified among the faculty.
Portrayals of business in the popular culture are ludicrously inaccurate and unfair. Michael Medved has chronicled bias against capitalism embedded within our TV shows and movies. In the two decades that followed the end of World War II, Medved reports in his book Hollywood vs. America, just 11 percent of Hollywood's villains were businessmen. By 1986, 67 percent of characters who committed felonies on-screen were businessmen. "Big business has become television's favorite villain," he writes.
And over the past two decades, the trend has only continued. Moviemakers become enormously wealthy by tediously denigrating the very free market system that allows them to thrive, but that irony never seems to deter them or even catch their attention.
As a result of these portrayals, Americans are left to believe that the private sector is a jungle of predators against whom the average citizen must be safeguarded by an ever-vigilant government. Villains are everywhere, and if things go badly for you, it must have been someone else’s fault.
If you borrowed huge amounts of money to buy a house beyond your present means, someone deceived you. If you compounded the risk by refinancing the home, based on the wishful expectation that its value would shoot up in the future, it’s only because you were misled. If you maxed out all your credit cards, you must have been tricked by a devious marketing ploy. In all cases, it’s someone else’s fault, and your caring federal government must intervene to get you off the hook.
Medved once spotted a telltale indicator of the Benevolent Better worldview in President Clinton’s First Inaugural Address: “We must provide for our nation the way a family provides for its children.” As Medved pointed out, “In other words, [Clinton] viewed his fellow citizens—or at least many of his fellow citizens—as helpless kids who couldn’t support themselves.” That is exactly right, and exactly the problem we face.
As we've come to say, when you are a child and you are deceived into losing money, someone must come in and bail you out. And why not? After all, we did it for Chrysler, and AIG, and Bear Stearns, so why not for you? Of course, the answer is, there's only one boat, and we're all in it. Bailing from one side of a sinking vessel to the other is a futile exercise that simply delays the search for a real solution.
Victimhood and Its Enablers
The absurdity and destructiveness of the American legal system as it applies to civil wrongs, or torts, has been well catalogued for years. My concern here is with the way in which the tort system serves few interests other than the financial interest of the trial bar millionaires it spawns, while fostering a sense of victimhood.
In 2005 we saw in Texas what happens when wealthy lawyers persuade a large group of Americans to declare themselves victims and seek undeserved compensation. In that case, federal district court judge Janis Graham Jack uncovered a scheme that made national headlines. Apparently, in filing a class-action lawsuit aimed at winning money damages for individuals supposedly suffering from silicosis, a law firm had recycled many victims used in previous cases filed over ailments supposedly caused by exposure to asbestos. Upon close examination, it turned out that many of the “victims” in both cases had likely been fraudulently diagnosed. The aim of that scheme, of course, was to use the courts to extract money from innocent companies. After a remarkable hearing in Corpus Christi, Judge Jack issued a lengthy order with this conclusion: “these diagnoses were driven by neither health nor justice; they were manufactured for money. The record is not clear who originally devised this scheme, but it is clear that the lawyers, doctors and screening companies were willing participants.”
Most Americans remember the successful lawsuit against McDonald’s for the “tort” of serving a customer the hot coffee she ordered. We as a people have been encouraged to treat every bad break in life as an opportunity to sue. As one foreign observer marveled, “You Americans are trying to outlaw the concept of bad luck.” And if life doesn’t deliver the right opportunity, you can always create your own bad luck. People in my hometown, for example, were caught by the news media encouraging their children to provoke confrontations with the police in the hope of producing grounds for a lawsuit.
People who are coached to shift the cost of their own bad luck, or their own mistakes, to their neighbors are actually being told, “You are incompetent.” People who are supplied with increasing amounts of money, transferred from their neighbors in an open-ended fashion by the force of government, are being told, “You are not fit to provide for yourself and your family.” Those who are described as helpless victims, at sea in an ocean of private-sector sharks, are being told, “You are incapable of looking out for yourself. You need us to protect you.” Americans accepting these premises are not the kind of citizens Jefferson hoped to educate, or that Franklin had in mind when he hoped we could keep our republic. They sell themselves short whenever they submit to the idea that they are not the creatures of dignity and autonomy for whom, and on whom, a free society exists.
Every expansion of public power shrinks the citizen, automatically and axiomatically. Beyond all the arguments about spending, taxing, deficits, health care, regulation—and on and on—lies the largest decision of all: What kind of people do we want to be?
Abdication of the “People’s Branch”
Even a civics-deprived product of today's typical public schools should know that we have three branches of government, and that the legislature (named after the Latin word for "law") owns the sole power to make the laws of the land. It is through the legislature, often referred to as "the people's branch," that free citizens acting through their representatives are supposed to set the rules of society. That's no longer the way it works. As detailed earlier, an astonishing share of the laws created today is made not by elected, and therefore recallable, representatives, but by unelected bureaucrats and judges. Congress is not blameless in this situation. All too often its members have preferred to abdicate the details of their handiwork to agencies and move on to their next act of stylized benevolence.
The 2,319-page financial regulation bill of 2010 delegated to a variety of agencies the opportunity to write more than 240 separate sets of new rules. The Obama health care bill, a 2,000-page tome in its own right, lobbed several hundred more similar opportunities to the Department of Health and Human Services. In 2010, Americans tossed out of office dozens of those who voted for the underlying bill, but they had no means to remove from a much greater position of power Health and Human Services Secretary Kathleen Sebelius. She began her new career as the czarina of health care by sending threatening letters to the nation's insurers, who had committed the sin of fully disclosing that the bill's mandates and coverage expansions were going to raise premiums and cause many customers to drop health insurance completely.
Sometimes our Betters simply arrogate new power for themselves without even bothering with the nuisance of securing legislative delegation. The Environmental Protection Agency (which probably should be renamed the Employment Prevention Agency) declared with no statutory basis at all that it had the power to regulate carbon dioxide, which is necessary to life on this planet, as a dangerous pollutant, on a par with mercury or lead. A compliant judiciary somehow found a way to defer.
Not only has Congress never empowered the EPA to limit CO2, it has expressly refused to do so on multiple occasions, even when asked by President Obama during the highly compliant 111th Congress. No problem. The unelected EPA and the unelected federal judiciary collaborated to redefine CO2 on their own, unleashing the agency to pursue the irrational, horrendously expensive, job-killing policy of carbon limitation without the assent of "the people's branch."
Skin in the Game
Resistance to the expansion of government is lowered further when a citizen senses no cost attached to it. Increasingly taxation is either indirect or invisible, so that fewer and fewer of us find reasons to question the way our dollars get spent. Indirect taxation includes the trillions we pay for the cost of federal regulations. Goods, services, and energy all cost far more than they would in the absence of these requirements. Sometimes we get our money’s worth, sometimes not, but in neither case do we detect the dollars as they depart our pocketbooks. Many other mandated costs, and of course corporate taxes, are also passed on to consumers.
The taxes that are paid, and especially the individual income tax that is government's single largest revenue source, are increasingly imposed on a dwindling percentage of Americans. Most people would be surprised to learn that in 2009, 47 percent of Americans did not pay anything in federal income taxes. They either made too little or qualified for enough exemptions to owe no tax at all. What's more, some 40 percent of Americans actually made money off the tax system through refundable tax credits and other provisions.
Both political parties—and this author—have been party to this enormous shift. When I worked in the Reagan White House I enthusiastically advocated the president’s 1986 tax reform. One of our proudest talking points was the bill’s promise to remove six million low-income Americans from the tax rolls altogether. Fifteen years later I was back in another White House making a similar claim on behalf of the Bush tax cuts of 2001.
Positive as these reforms were in many ways, cumulatively they have produced a distorted system and some perverse outcomes. First, fewer of us recognize and feel personally the burdens of even the largest and most expensive public sector the nation has ever seen. We’re not invested. We have no skin in the game.
Second, the federal government and many states have become precariously dependent on a few high-income (and therefore highly mobile) individuals for very large shares of their revenue. In 2008, for example, the top 1 percent of earners paid more than 38 percent of federal income taxes. The top 10 percent of earners—those making about $114,000 a year or more—paid nearly 70 percent. Meanwhile, the bottom 50 percent of income earners paid just 2.7 percent.
The net result is that today's federal government is overly reliant on taxes paid by a volatile and mobile group of individuals who have relatively large swings in their income and who have the luxury to withdraw either themselves or their capital from the marketplace during downturns.
California in recent years has demonstrated the pitfalls of building a tax system that is overly reliant on the wealthy. Writing in the Wall Street Journal in January 2011, Andrew G. Atkeson and William E. Simon, Jr., reported that in 2007 the state had nearly 36 million residents, but just 7,000 households—those with adjusted gross incomes above $5 million for the year—paid one-fifth of the state's income taxes. By the following year there were only 4,700 households making that kind of money. The result was that taxes paid by these households fell to about $7 billion from about $11 billion. That drop-off was half of all the revenue lost by the state. Nearly all of the rest of the revenue lost also came from "wealthy" households. Atkeson and Simon tell us that "The decline in tax receipts from those households earning more than $200,000 accounts for fully 93 percent of the decline in total tax revenues from 2007–08." In other words, as California plunged deep into deficit, more than ninety cents of every tax dollar lost in revenue could be attributed to the "wealthy" paying less because they either made less or fled the state. They now say in California, "When one of those people moves out, it's an event."
The result of such an imbalanced system is that government becomes ever more reliant on the growing prosperity of rich people, and therefore on income inequality, to pay for its expansion. Nothing could be more self-defeating than for the proponents of the social welfare state to attempt to limit the incomes of the wealthy. After all, the tax system they have constructed cannot produce the funds they long to spend unless the rich get even richer. I wonder if the deep irony of this conundrum ever dawns on our friends, the Betters.
When any change is suggested that would lower any tax rate, even in exchange for closing loopholes, the nation's left knees all jerk in unison, alleging that "the rich" will reap some undue benefit. It never seems to matter that each time top rates have been reduced, upper-income people have wound up paying more in taxes, and a higher share of all taxes. The implication is that our current system already favors the wealthy.
If that’s so, then so does every nation’s on earth. Using data from the OECD, both the Congressional Budget Office and the nonpartisan Tax Foundation found the United States to have the most progressive tax system of all twenty-four countries studied. Counting taxes of all kinds, a wealthy American is paying between 31 percent and 35 percent more than his proportionate share of income, whereas the one-fifth with the lowest incomes are paying only 20 percent of their proportionate share. The only way to extract more federal revenue from today’s system is through an increase in income inequality. Is that really what our statist quo friends desire?
I learned a hard lesson in the dynamics of current federal taxation in my role as director of the Office of Management and Budget in 2001. The new Bush administration arrived in the third year of federal surpluses, the first since the federal government experienced a small surplus in 1969.
Much credit for this remarkable turn of events goes to the Clinton administration and a Republican Congress, which cooperated in restraining spending growth, and to President Reagan, who, as Clare Boothe Luce once said, "won the cold war without firing a shot" and produced a huge "peace dividend" of reduced defense spending.
But the surpluses more than anything could be traced to the combination of the dot-com bubble economy of the late 1990s and the lopsided tax system just described. To an extent that no one realized at the time, the huge surge in federal revenues that moved the nation into surplus came from a relatively few taxpayers whose income was highly correlated to the stock market’s upward movement.
Income taxes on bonuses, stock options, and other incentive compensation were a large part of the surge. So were capital gains taxes paid on what turned out to be inflated stock values. Believing the new levels of revenue to be natural, every forecaster—OMB, the Congressional Budget Office, the Federal Reserve, and everyone else—assumed those levels as a base and projected upward from them according to their chosen GDP growth assumptions. Reducing taxes as President Bush proposed was a straightforward call given these consensus projections; there was plenty of room to do that and keep paying off debt at the same time.
Proudly paying down debt as 2001 unfolded, we studied these forecasts and saw the potential to retire the whole federal debt within a few years. The nation’s most respected financial figure, Alan Greenspan, spoke of the need to leave at least some debt outstanding simply to preserve a liquid market for any occasional future public borrowing.
How wrong we all were.
The recession that had begun as President Clinton left office was a mild one by historical standards. It lasted only eight months, shorter than the average recession since the end of World War II, and was mild enough to spark a debate over whether the slowdown fit the technical definition of a recession. Revised economic data released by the Commerce Department's Bureau of Economic Analysis in 2004 indicate that the country might not have experienced three consecutive quarters of negative GDP growth in 2001.
In any case, such a recession in the past would not have had a large impact on government revenues. But in the new tax system, a dropping stock market took a gigantic bite. Capital gains payments fell from $127 billion in 2000 to $49 billion in 2002. During this period, individual income taxes paid plummeted by 14 percent, the majority of the drop coming from the upper-income levels. The budget recovered within a couple years and, even bearing the enormous cost of the Iraq War and the new post-9/11 activities dealing with homeland security, brought deficits back down to a historically small 1.2 percent of GDP by 2007, before the next recession raised them again.
I offer this digression only to underscore how detached from the costs of our national government most Americans have become. When the payroll taxes associated with Social Security and Medicare are counted, our overall tax burden is more widely shared than it appears. But half of those taxes are listed on pay stubs as "employer" taxes, obscuring the fact that the worker's wages would have been higher by that amount if Uncle Sam hadn't claimed it first.
Full citizenship in a free society involves some measure of contribution to our necessary collective activities. And it is entirely just that those receiving more of society’s income should contribute at least in proportion to their gains. But however we decide to apportion the bill for government, true democracy requires that we see the full size of the check we are paying.
In recent years, the most vigorous taxpayer uprisings have centered on state and local property taxes. When one thinks about it, it’s no surprise. Property taxes are usually as visible as a tax can be. They are collected not in weekly dribbles but all at once, or maybe twice a year. The bill comes to the taxpayer directly, not to his employer or through higher prices for the things he buys at the store. No one “painlessly” pays this tax on his behalf; he writes a check for the full amount or pays it each month as part of a mortgage payment to his bank. He can evaluate for himself whether his property taxes have become too high relative to whatever benefit he perceives he is receiving.
In 2007, Hoosiers in large numbers began to complain that their local property taxes were out of line with benefits. We responded by reducing property taxes an average of 30 percent. In some counties, the cut was as high as 60 percent. All told, it was the largest tax cut in state history, and it gave Hoosiers the lowest property taxes in the United States.
To prevent them from becoming excessive again, we added a cap to the new law, and to our state’s constitution. No Indiana homeowner can ever again be charged more than 1 percent of his home’s value, no farmer or landlord more than 2 percent, and no business more than 3 percent.
But then we added an important feature by providing flexibility for spending units to seek revenue above what the caps would permit, if and only if they could secure taxpayer permission through a referendum. Higher local government spending, and higher taxes to pay for it, can still happen in Indiana; the difference is that now the spenders need the taxpayers’ permission first.
In its first years, the new law has given rise to a few dozen referenda. About half of them passed and about half were turned down. Hoosiers seem to be showing the common sense and reasonableness for which they are known. When they hear a good case for more service, they are generally willing to approve and pay for it. They are more fully engaged in governing themselves, by making mature decisions about how much government they want, and at what price.
Government by the people works best when everyone has some skin in the game and knows it, when taxes are visible and quantifiable to those paying them. When government hides its costs, it reveals its doubt that a fully informed citizenry would condone all that it is doing. Government grows fastest the more it shrinks the citizen’s role in holding it accountable.
As much as the government may try to shrink its citizens, recent years have shown that we’re becoming less willing to stand by and watch. In 2009, in the out-of-nowhere fashion that a wired world makes possible, a network of people who came to call themselves the Tea Party suddenly sprang to life. Unlike many alleged protest organizations we have seen, consisting of a dozen sign carriers and a couple of television news cameras, this was a genuine grass-roots citizens movement, motivated by the kind of freedom-minded citizenship this chapter has fretted about the nation losing.
The Tea Party may have been born in reaction to bailouts of Wall Street, takeovers of businesses, and the explosion of federal spending, but it rapidly evolved into a national movement worthy of the term, sophisticated not only in its tactics and communications, but in the depth of its thought. For the first time in their lives, many Americans were reminded that there is a Tenth Amendment to the Constitution that reserves certain powers to the states and therefore puts more responsibility in the hands of the people.
The response of the president and his partisans to this insubordination was comical when it wasn't sinister. First, they dismissed the Tea Party as a fictional contrivance of the Republican Party, or the Vast Right-Wing Conspiracy, or somebody. Then it was a dangerous, radical operation with racist overtones. When the president's supporters were caught manufacturing false charges of racist behavior, and trying to infiltrate rallies with hate-speech signs, new criticisms had to be trotted out. Still, the movement grew.
Of course, such a rambunctious and spontaneous outfit would attract its share of quirky adherents. And it was fair to point out that the early Tea Party was long on anger and light on answers, more about diagnosis of the nation’s problems than prescriptions for their improvement. It was true that, during the first election cycle of its existence, no one stood before a Tea Party rally and spelled out, for instance, how huge the changes in Social Security and Medicare would have to be to bring our debt back to survivable levels.
Yet for the first time in a very long while, Americans saw living examples of full, active citizenship by people who politely—well, sometimes impolitely—insisted that the government work for them and not the other way around. These people are determined to make their own decisions and stand on their own two feet, and they believe that others ought to do likewise.
The housewives who started the Tea Party have probably never heard of Charlie Flagg, but they have succeeded in returning some of his spirit to our country. During one confrontation in the novel, a haughty federal bureaucrat lectures Charlie, “If you don’t like laws, Mr. Flagg, perhaps you should move to some country where they don’t have any.” To which Charlie replies, “I liked this country before it was saddled down with so many of them.” In order for our country to survive the challenges it faces—both today and down the road—we must take a lesson from Charlie. We must remember the mystic chords, and refuse to be shrunk.