Every time some new income statistics come out, two predictable fallacies follow in their wake. The first is that the rich are getting richer, while the poor are falling behind. The second is that the real income of American families has not risen significantly for years.
These fallacies come back as regularly as the swallows return to Capistrano, though not nearly as gracefully. A typical headline in the New York Times proclaims: “In A Time of Plenty, The Poor Are Still Poor.” Yet study after study has shown that “the poor” do not remain poor in contemporary America.
An absolute majority of the people who were in the bottom 20 percent in 1975 have also been in the top 20 percent at some time since then. Most Americans don't stay put in any income bracket. At different times, they are both “rich” and “poor”—as these terms are recklessly thrown around in the media. Most of those who are called “the rich” are just middle-class people whose taxes the politicians avoid cutting by giving them that name.
There are of course some people who remain permanently in the bottom 20 percent. But such people constitute less than one percent of the American population, according to data published by the Federal Reserve Bank of Dallas in its 1995 annual report. Perhaps the intelligentsia and the politicians have been too busy waxing indignant to be bothered by anything so mundane as facts.
Alarmists are not talking about real flesh-and-blood people. They are talking about abstract categories like the top or bottom 10 percent or 20 percent of families or households. So long as all incomes are not identical, there will always be a top and bottom 10 percent or 20 percent or any other percent. But these abstract categories do not contain the same people over time.
Households do not contain the same number of people, even at a given time. The bottom 20 percent of households contains 39 million people, while the top 20 percent contains 64 million. Comparing households is comparing apples and oranges.
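To see the apples-and-oranges problem in numbers, here is a minimal sketch. The headcounts (39 million and 64 million people) come from the figures above; the total-income figure is purely hypothetical, chosen only to show that identical household totals can conceal very different per-person incomes.

```python
# Apples-and-oranges arithmetic: the headcounts (39 million vs. 64 million
# people) come from the column; the total-income figure below is made up,
# chosen only to illustrate the point.
bottom_people = 39_000_000  # people living in the bottom 20% of households
top_people = 64_000_000     # people living in the top 20% of households

# Suppose, purely hypothetically, that each quintile of households
# received the same total income.
total_income = 1_000_000_000_000  # $1 trillion, an illustrative figure

# Equal totals per quintile of households still mean unequal incomes
# per person, because the two quintiles hold different numbers of people.
print(f"Bottom 20%: ${total_income / bottom_people:,.0f} per person")
print(f"Top 20%:    ${total_income / top_people:,.0f} per person")
# Output: about $25,641 vs. $15,625, roughly 64% apart, even though the
# household quintiles were assumed identical in total income.
```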
If you are serious about considering the well-being of flesh-and-blood human beings, then you can talk about their real income per capita. But alarmists avoid that like the plague, because it would expose their little game for the fraud that it is.
Real income per capita has risen 50 percent over the same span in which household income has remained virtually unchanged. How is this possible? Because households are getting smaller. The very fact of higher incomes means that more people can afford to go out and set up their own independent households.
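A minimal sketch of that arithmetic, with made-up numbers: if a household's income stays flat while the household shrinks from three people to two, income per person rises 50 percent.

```python
# Made-up numbers showing how income per person can rise 50% while
# income per household stays flat: the households got smaller.
household_income_then = 60_000  # dollars per household, illustrative
household_income_now = 60_000   # unchanged household income

household_size_then = 3.0  # people per household, illustrative
household_size_now = 2.0   # smaller households

per_capita_then = household_income_then / household_size_then  # $20,000
per_capita_now = household_income_now / household_size_now     # $30,000

growth = per_capita_now / per_capita_then - 1
print(f"Household income change: {household_income_now - household_income_then}")
print(f"Per-capita income growth: {growth:.0%}")  # 50%
```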
Behind both the statistics on inequality that are spotlighted and the statistics on ever-changing personal incomes that are ignored is the simple fact that people just starting out in their careers usually do not make as much money as they will later on, after they have had years of experience.
Who should be surprised that 60-year-olds have higher incomes and more wealth than 30-year-olds? Moreover, that was also true 30 years ago, when today's 60-year-olds were just 30. But these are not different classes of people. They are the same people at different stages of their lives.
At some times and places, there have been whole classes of people who lived permanently in poverty or in luxury. But, in the United States today, the percentage of Americans who fit either description does not reach beyond single digits.
It is one thing to be concerned about the fate of flesh-and-blood human beings. It is something very different to create alarms about statistical relationships between abstract categories.
Despite the desperate efforts of activists to keep “hunger in America” alive as an issue by manipulating numbers, actual examinations of flesh-and-blood people show no nutritional differences between people in different income brackets. In contrast to the gaunt and undernourished poor of other times and places, Americans in the lower income brackets today are slightly more likely to be overweight than the rest of the population.
The magnitude of statistical differences may tell very little about the condition of human beings. A two-to-one difference in the amount of food available would be very painful if it meant that those on the short end did not have enough to eat. But a thousand-to-one difference in price between wearing a Rolex and wearing a Timex is something that can be left to the alarmists—especially since both watches tell time with about the same accuracy. And both are a lot more accurate than “income disparity” hysteria.