CHAPTER THREE
In the late 1800s, an English scholar named James Bryce traveled extensively in the United States and wrote a multivolume study filled with his observations entitled The American Commonwealth. Like Alexis de Tocqueville before him, Bryce was struck by how Americans were so much more trusting of each other and so much more generous than Europeans. Bryce attributed this quality, as Tocqueville had, to an ethos of class equality that pervaded American social relations. “People meet on a simple and natural footing,” he wrote, “with more frankness and ease than is possible in countries where every one is either looking up or looking down. . . . It gives a sense of solidarity to the whole nation, cutting away the ground for the jealousies and grudges which distract people.”1
Does this sound like ancient history or what? If Bryce returned today, he’d find a very different country—a country filled with VIP lounges, personal assistants, chartered jets, exclusive restaurants, luxury hotels, and private golf clubs; a country where superrich celebrities and sports stars reign as demigods, where the wealthy engineer superior looks and health through expensive medical intervention; a country soaked in poisonous envy spurred on by a $250 billion advertising industry; and a country where millions of affluent people live behind guarded gates.
These unhappy changes—the new class divisions in an America far richer than ever before—are an important part of my story about why cheating has increased over the past few decades.
It’s not hard to see how much richer America has become since the 1970s. Just stop in at a Banana Republic clothing store. Over 400 of them are sprinkled across the United States, each one carrying identical upscale clothes. For $78, a man can purchase a pair of classic straight-leg jeans. A hundred dollars buys a fancy pair of wool dress pants. A short-sleeve polo shirt is $42. The ubiquity of Banana Republic stores—or rather, the ubiquity of customers willing to blow $42 on a polo shirt—is testament to the remarkable new affluence of American society. Twenty years ago, Banana Republic consisted of two stores in San Francisco specializing in outdoor clothing, along with a catalog business. After it was bought by the Gap in 1983, it jettisoned any serious connection to outdoor activities and refocused on clothing affluent urban professionals. Other newly expanding chains, like J. Crew, catered to the same demographic. By the mid-1990s, Banana Republic was in every major city in America and was opening over twenty-five new stores a year, expanding in step with America’s upper middle class: during the 1990s, the number of households filing tax returns reporting incomes over $100,000 a year quadrupled.
The creation of this mass upper middle class is a historic accomplishment on par with the creation of the mass middle class after World War II. Yet this gain has come at a significant cost. The vast gulf between the top tiers of American earners and everyone else is the most obvious of these costs. How big is this gulf? Very big.
To get a sense of these gaps, stick around the Banana Republic store. The people working the cash register and sales floor are typical of the losers in the new economy. Most of them are young entry-level workers without college degrees. As unskilled entry-level workers, these salespeople are doing terribly, in historical terms, and may well be making less money than their parents. If they’re making the minimum wage, they’re earning a wage that has decreased in constant dollars by over 20 percent since 1979. If they’re making more than the minimum wage, they’re still not doing well. Between 1979 and 2001, during one of the greatest economic booms of all time, entry-level wages for workers with only a high school degree fell by 14 percent. At the same time, employer-provided pensions and health care became less common for these workers. Nearly 30 million American workers—almost a quarter of all working people—earn under $19,000 a year.2
The manager at the Banana Republic is likely to be the one lucky employee who actually makes an okay salary and gets good benefits. In fact, Gap, Inc., has some of the best benefits in the retail business, including free therapy for stressed-out workers. But lucky as she is, the store manager may well be doing worse than her blue-collar parents who never went to college. And she’s making chump change compared to the professionals who are running the Banana Republic division at the Gap, Inc., headquarters in San Francisco. For example, the senior vice president for product design and development or the senior vice president for marketing is probably making five times the salary of the store manager. These people have firmly made it into the mass upper middle class.
And what about the people at the very top of the Gap’s $14 billion empire? Donald Fisher, who opened the first Gap store in San Francisco in 1969, along with his wife, Doris, is worth $3 billion. Millard Drexler, until recently the CEO of Gap, Inc., saw his compensation skyrocket in the late 1990s and walked away with a fortune of over $500 million. Even in 2000, when Gap, Inc., did poorly and saw its earnings fall, Drexler managed to finagle a raise of $25 million. He did this by cutting his bonus while taking many more stock options. The sleight-of-hand earned Drexler a place in the “Pay Hall of Shame” put out each year by Graef Crystal of Bloomberg News. Crystal wrote: “Drexler’s case illustrates how CEOs often are able to extract a raise in hard times, simply by reengineering their pay packages in a manner that makes the average shareholder think they took a cut in pay.”3
The pay structure at Banana Republic mirrors that of the nation as a whole. From the late ’70s through the late ’90s, the top 1 percent of American households saw their incomes increase by 157 percent. On average, these superwealthy households scored after-tax income gains of nearly half a million dollars. More broadly, all Americans in the top 20 percent of households did extremely well during the 1980s and 1990s. The average after-tax income gains for the top fifth of households increased by over 50 percent.
Families at the bottom of the economic ladder and those in the middle experienced the boom years in an altogether different fashion. For these households, there was no boom at all. In typical fashion, those at the very bottom missed out the most. On average, the incomes of the bottom fifth of households actually declined slightly during the supposed boom—going from an average of $10,900 in 1979 to $10,800 in 1997. In the last few years of the 1990s, the bottom tier of earners did see some gains, but these were not great enough to make up for the stagnant wages of the previous twenty-five years.
And what about the people smack in the middle, the fabled pillars of American society? The boom passed them by, too. After-tax income gains by the middle fifth of households through the ’80s and ’90s were about 10 percent, or $3,400 in new income—less than a tenth of the increases of the top 20 percent.4
Wealth gaps have grown even faster than income gaps in America’s Banana Republic economy. Forget all the talk of how everybody now owns stock and how we’re in some golden age where the “middle class has joined the money class.” In fact, we’ve been hurtling backward in time. The top 1 percent of Americans now holds nearly 40 percent of all household wealth—such as retirement savings, home equity, stocks, etc.—up from 20 percent in 1979. This tiny sliver of the population has more wealth than the bottom 90 percent of households put together. Many Americans lower on the economic ladder have seen their net worth actually decline; the net worth of the bottom 40 percent of households fell by a shocking 76 percent between 1983 and 1998. As for the supposed populism of Wall Street, most Americans missed out on the hyped bull market of the ’90s since the lion’s share of stocks was—and is—owned by wealthier households. For example, between 1989 and 1997, 86 percent of stock market gains went to just the top 10 percent of households.5
Social scientists have long been arguing about the causes of skyrocketing inequality. Leading suspects typically include technological change and globalization. But scholars also blame the new inequality on bottom-line business strategies that have made companies leaner and meaner and have abolished equity norms along the way. Management pays itself bigger salaries while more and more jobs are outsourced to temporary workers without benefits or to contractors who pay poor wages. A rollback of government intervention in the economy has encouraged this trend by keeping the minimum wage low, making it harder for workers to form unions, and reducing government oversight of existing labor and workplace safety laws. Free-market ideologues have also successfully pushed tax policies that favor the wealthy and further concentrate wealth at the top of our society.6
Inequality has grown across the industrialized world since the 1970s, but it’s more acute here in large part because America has done next to nothing to tackle the problem: There have been no huge new investments in education or job training to ensure that everyone can compete in the postindustrial economy; no consistent government efforts to prime the economic pump to keep labor markets tight and raise wages for those at the bottom; and no major assistance to lower-income families to help them build wealth in the form of homes and retirement savings.
Many say that inequality doesn’t matter. It’s said that as long as there is opportunity, as long as people can move upward and transform their lives through hard work, inequality is not a social problem we should worry about. “If you drive a Mercedes and I have to walk, that’s a radical difference in lifestyle. But is it a big deal if you drive a Mercedes and I drive a Hyundai?” asks Dinesh D’Souza. “Why should people feel aggrieved that the rich are pulling further ahead if they are also moving forward?”7 If this sounds fine in theory, it’s not how things are playing out in reality. For one thing, many American workers lack the education or skills to prosper in the new economy and have little opportunity to improve their station in life. And mobility is not nearly as great in American society as is commonly imagined. In fact, research shows that the United States now has less economic mobility than some Western European countries such as Sweden.8
At the same time, there is mounting evidence that inequality has a variety of poisonous effects on the fabric of American life. As our society has become more divided along income lines, social and cultural gaps between people have increased. The very affluent have retreated from the public sphere, increasingly sending their children to private schools or living in gated communities—while the middle class deals with cuts in services and bad schools. As the rich have used their wealth to grab more influence over public policy, the middle class and the poor feel even more alienated from politics. Meanwhile, even though the economic pie has been expanding, insecurity has actually grown for many workers as their slice of that pie has stayed the same or shrunk—while living costs keep rising.9
The yawning gap between winners and losers is also having a lethal effect on personal integrity. In a society where winners win bigger than ever before and losers are punished more harshly—whether by losing a job with benefits or not earning enough money to make ends meet—more and more people will do anything to be a winner. This is an absolutely critical point to grasp in understanding the cheating culture, and it is obvious when you think about it. Cheating is more tempting if the penalties for failure are higher, if you’re feeling pinched or under the gun, like the Sears mechanics or many lawyers today. It’s also more tempting if the rewards for success are greater—if cheating can make the difference between being a multimillionaire or just getting by. When people perceive this kind of choice, they will often kiss their integrity good-bye.
REMEMBER Danny Almonte?
Danny was the Little League champ from the Bronx who awed baseball fans by pitching a perfect game during the run-up to the Little League World Series in 2001. “He’s head and shoulders above anybody I’ve seen come through the Little League system,” said one coach. Even though Danny was only twelve—or so he said—there was much talk of Danny’s bright future in the major leagues. Danny was such an amazing player that an ABC Sports executive producer described him as “the single biggest story in the history of the Little League.”
Danny became an even bigger story when it turned out that he had lied about his age to make himself eligible to play in Little League. Danny’s father, Felipe Almonte, an immigrant from the Dominican Republic, had conspired with Bronx All-Star coach Rolando Paulino to falsify his son’s date of birth. The senior Almonte altered Danny’s Dominican passport, changing the date of birth, April 7, 1987, to read April 7, 1989. Fourteen-year-old Danny Almonte had violated the most important rule of Little League—that it’s for little people.10
The revelations triggered an uproar. Even President George W. Bush offered an opinion in the aftermath of the disqualification. “I was disappointed that adults would fudge the boy’s age,” he said. “I wasn’t disappointed in his fastball and his slider.”11 In besmirching Little League, one of the most hallowed icons of Americana, the scandal left many people angry. It was another of those moments in national life, so common lately, when one wondered: Why would anyone do such a thing? How low can people go?
Yet the episode should not have been surprising. More than ever before, the glittering world of sports looms above the lives of poor and middle-class Americans as an escape hatch from the Anxious Class and a ticket to the Winning Class.
Felipe Almonte apparently wagered a great deal on his son’s prospects as a future Alex Rodriguez, the extravagantly paid shortstop for the Texas Rangers who was born to Dominican immigrant parents. Not only did Danny never attend school during his eighteen months as a pitcher for the Bronx All-Stars, but he apparently did not attend much school at home in the Dominican Republic, in part because of his baseball regimen. Felipe Almonte was not unlike any number of other parents these days who raise their children to be stars on the lucrative playing fields and courts of professional tennis, basketball, baseball, and football. And as the rewards for top athletes have risen sharply, more parents have driven more children harder.
Almonte had better motives than most parents. The Dominican Republic is among the poorest countries in the Western Hemisphere, and Dominicans have one of the highest poverty rates of any immigrant group in New York City. Hundreds of thousands of Dominicans live in crowded neighborhoods with poor schools, pervasive unemployment, and high crime—neighborhoods just a mile or so away from some of the wealthiest areas in the United States. Many end up trapped in low-wage, dead-end jobs. Low-skilled immigrants did okay in an earlier America, where factory jobs were plentiful and even a high school dropout had a decent shot at making enough money to be the sole family breadwinner and buy a home. Now it’s common for both parents in an immigrant family to work full-time at low-wage jobs and yet barely escape poverty.
In the face of these odds, the world of Little League offered tangible possibilities for Danny Almonte and his family. Little League is no longer a small-town, pint-sized pastime. It can offer a shot at fame and wealth. In 2000, the organizers of Little League baseball began to market the broadcast rights for the Little League World Series for the first time and to make lucrative deals with corporate sponsors. Little League lined up $3 million in sponsors for the 2000 World Series, including major contributions from Honda, Wilson Sporting Goods, and myteam.com. In 2001, around the time that a $7 million stadium project was being completed in Williamsport, Pennsylvania, the number of teams in the Little League World Series doubled as sixteen teams advanced to the finals.
The Bronx All-Stars were a big fish in the expanding Little League pond. David Komansky, then CEO of Merrill Lynch, was a special fan of coach Rolando Paulino and his team. Merrill provided a corporate sponsorship for the team, while executives, including Komansky, attended the games. Boosted in part by the sensational play of Danny Almonte, television ratings for the Little League World Series hit record highs, especially in the New York area.
Little League stardom offered a clear upward path for Danny. A player who excels is guaranteed attention in high school from big league scouts. Big money can lie just around the corner. In 2001, a high school catcher named Joe Mauer was paid a $5.15 million signing bonus when he was recruited by the Minnesota Twins.
The scandal about Danny’s age put a damper on his family’s dreams of this kind of money, but only temporarily. Danny is living in the Bronx, concentrating on getting through school, and playing baseball for James Monroe High School. The coach there sees big league potential in Danny’s pitching. Danny may yet make it to the majors one day.12
If he does, he will leave behind one of the most impoverished neighborhoods in one of the most unequal cities in America—and join a profession where men feel poor on $200,000 a year and the inequalities between sluggers and benchwarmers are comparable to the contrast between Harlem and the Upper East Side. He will meet a lot of guys from humble backgrounds similar to his own who are under intense pressure to perform at a very high level and keep their toehold in the Winning Class. Danny will also enter a world so rife with cheating that his own past sins will seem laughable in comparison.
Consider the San Francisco Giants as one example of the winner-take-all market in sports. On opening day in spring 2002, the Giants paid their starting roster of twenty-six players a total team salary of $78.3 million. Almost a fifth of this pie went to one player: left fielder Barry Bonds, who took home $15 million during the season. Over half of the total team salary in 2002 went to five of the Giants’ top players. At the bottom of the salary pyramid were seven team members with salaries of $300,000 or less.
The Giants’ salary structure is a relatively new phenomenon. In just four years, between 1996 and 2000, the average salary on the team doubled, and the gaps between the highest- and lowest-paid players widened dramatically. This trend reflects the bigger picture in baseball over the past two decades, where salaries along with income gaps have grown exponentially. The average salary of a professional baseball player was $60,000 in 1975, $135,000 in 1980, and $413,000 by 1988. Today it’s about $1.5 million—an increase of over 2,000 percent since 1975. A key factor pushing up these averages has been the money paid to top players, which keeps getting higher and higher.
When Barry Bonds first started making over $10 million in 1997, he was the highest-paid player in baseball. A year later, Kevin Brown became the highest-paid player when he signed a contract with the Dodgers worth $15 million a year. Then, in 2000, Alex Rodriguez became the highest-paid player when the Texas Rangers agreed to pay him $25 million a year. During these same three years, the minimum union salary paid to men sitting on the same bench as these megastars only increased from $150,000 to $200,000.
What happens in a sport where top players rake in 50 or even 100 times more than their teammates? Bad things.
Take the career of Bonds. He’s a slugger and a star, and he should be basking in glory and serving as a role model to younger players. Instead, he is distrusted and surrounded by controversy. Part of Bonds’s problem stems from a cantankerous and arrogant personality. But much uneasiness around Bonds is fed by widespread rumors that his huge salary gains in recent years are the ill-gotten fruits of drug use.
Bonds is widely accused of using steroids starting in the late 1990s to pack on thirty-eight pounds of muscle in just a few years and transform himself into baseball’s most powerful slugger. Anabolic steroids are a form of synthetic testosterone and produce hormone levels that help generate more muscle. Bonds’s accusers say that his increased power would have been impossible without serious pharmaceutical help, especially given his age. (Bonds was thirty-eight in 2002.) Bonds and his supporters counter that the new muscle came from training and a diet that included creatine and protein pills. “Barry Bonds could be on steroids,” said a leading expert on drugs and sports, Charles Yesalis, “but his power comes from the fact that he has the closest thing to a perfect swing that I’ve ever seen.”13
Whatever the truth, Bonds’s case vividly illustrates the rewards that await athletes who can bulk up fast. Bonds’s bigger muscles and better hitting enabled him to break Mark McGwire’s home-run record, slamming seventy-three balls over the fence in 2001. McGwire’s own power hitting was fueled by androstenedione, a steroid banned in the NFL but permitted by the major leagues. Bonds was also able to sign a new contract that nearly doubled his salary over what he was making when he still weighed in at 190 pounds in 1997. Bonds now makes $18 million a year; in the 2003 season, he’ll be paid over $35,000 for each time he’s at bat.
Steroid use in sports has been around for decades, especially in football and among Olympic athletes. But it is only in recent years that it has become common in baseball. More players are also taking other drugs like human growth hormone (hGH) and amphetamines. Sports Illustrated conducted an investigation of the problem in 2002 and reported that “Steroid use, which a decade ago was considered a taboo violated by a few renegade sluggers, is now so rampant in baseball that even pitchers and wispy outfielders are juicing up . . . the game has become a pharmacological trade show.”14
The year Mark McGwire beat Roger Maris’s record, he stood six feet, five inches tall and weighed 245 pounds. He had twenty-inch biceps and seventeen-inch forearms. Other big-time sluggers are comparably built. And yet, as anyone who has trained with weights knows well, it is not easy to either build or sustain a large mass of muscle over time. To be successful—and to have a shot at astronomical money—today’s professional baseball player must do more than push his body to its maximum brawn. He has got to stay at this peak level month after month, year after year. Steroids can help players do this, but they have numerous side effects, including impotence, liver damage, and heart disease.
There have always been enormous incentives for players to take shortcuts, either to get to where they want to be or to stay there. But these incentives increased in the 1990s, as the salaries for reliable sluggers soared into the stratosphere and merely average players made only moderate gains. “A big, big year means a big, big contract,” observed Kevin Towers, general manager of the San Diego Padres.15 A bad year may mean a return to the minors.
While there are no reliable statistics on the pervasiveness of drug use in baseball, a variety of players have offered their own estimates. Yankee star David Wells recently estimated in his memoir that 40 percent of major leaguers use steroids. Jose Canseco, who has played with the Oakland A’s, has commented that the use of steroids has “revolutionized” baseball and estimated that 85 percent of players are on the drug. Arizona Diamondbacks pitcher Curt Schilling guesses that the number is between 40 and 60 percent. Said Schilling: “I’ll pat guys on the ass, and they’ll look at me and go, ‘Don’t hit me there, man. It hurts.’ That’s because that’s where they shoot the steroid needles.”16
The few baseball players who have openly explained their steroid use emphasize the financial reasons for risking their health. Ken Caminiti was a third baseman for the San Diego Padres when he began taking banned steroids in 1996 to help overcome an injury. Caminiti did more than just heal. He played as never before—with 40 home runs, 130 RBIs, and a .326 batting average. That year’s performance earned him the National League’s Most Valuable Player award. “At first I felt like a cheater,” Caminiti told Sports Illustrated. “But I looked around, and everybody was doing it.”17
Caminiti shares the view that at least half of major league players are on steroids, and he depicts an environment where the use of drugs has now passed a tipping point and become normalized. “If a young player were to ask me what to do, I’m not going to tell him it’s bad. Look at all the money in the game: You have a chance to set your family up, to get your daughter into a better school. . . . So I can’t say, ‘Don’t do it,’ when the guy next to you is as big as a house and he’s going to take your job and make the money.”18
A minor league baseball player echoed this point when asked why he used steroids. “I’ve got an easy answer for that. I’d say, You’ve set up a reward system where you’re paying people $1 million to put the ball into the seats. Well, I need help doing that.” Players in the minor leagues report pervasive steroid use. Everyone who is there wants out and up. They want to make the majors—and their personal fortunes.19
Meanwhile, even the best players at the top of baseball are anxious to keep their position. Maybe Sammy Sosa didn’t purposefully bring a corked bat into a game in June 2003, as he has claimed. And maybe Sosa doesn’t use steroids as is widely rumored. But if he did these things it wouldn’t be that surprising: As a slugger like Sosa gets older, it’s harder for him to sustain the muscle and power needed to drive balls over the fence and justify his multimillion-dollar salary. It’s natural to look for an edge in this kind of situation.
Professional sports are an extreme environment. Success can transform you into a cultural icon and a centimillionaire, while failure can leave you injured, broke, and barely employable. People act in extreme ways with stakes like these. In a survey of 198 top athletes conducted in 1995, more than half indicated they would take a drug that would help them win every competition for a five-year period—even if they knew that at the end of five years the drug’s side effects would kill them.20
Other sports have also seen an intensification of winner-take-all inequities in recent years—and rampant doping by athletes. Professional cycling is one such sport.
Superstar cyclist Lance Armstrong is a hero in the United States. He gives speeches for $200,000 a pop, higher than Bill Clinton’s rate. His memoir, It’s Not About the Bike, spent months on bestseller lists. Armstrong’s appeal is obvious: He’s a man who defied death and went on to become a better athlete than before—and a multimillionaire. Cancer cost him a testicle and made him sterile, yet he has three children thanks to frozen sperm. He says his illness helped to steel him for the intense punishment of professional cycling and the grueling ordeal of the Tour de France, an annual race that covers 2,000 miles over 23 days. Quite apart from all these superhuman qualities, Armstrong is so admired in the United States because he beats the tights off French riders every year. Any nemesis of France has got to be all right.
Armstrong’s reception is very different on the other side of the Atlantic. European journalists have dug in his trash and combed through his past, hunting for evidence of drug use. The French authorities investigated him and his teammates for two years, summoning them for drug testing and questioning.
The animosity toward Armstrong is not as simple as continental pride. For one thing, the European public happens to know quite a bit about professional cycling, unlike Americans, and they know that doping is pervasive among professional cyclists. In 1998, Willy Voet, a key staff member for the top cycling team Festina, was arrested by French customs agents with several coolers full of pharmacological wonders. The following year Voet published a bestseller entitled Chain Massacre: Revelations of 30 Years of Cheating that claimed that nearly all top riders were doping. Those who weren’t inevitably lagged at the “back of the pack.” Voet’s revelations largely confirmed conventional wisdom and gave Europeans good reason to doubt that a clean rider like Armstrong could beat the best cyclists in the world when those cyclists are on drugs.21
The Europeans also know that drug testing during the Tour de France is a joke. The most widely abused performance-enhancing drug in cycling is erythropoietin (EPO), an artificial hormone that allows the blood to carry more oxygen, thus boosting endurance. EPO was originally conceived to help fight cancer and kidney problems (Armstrong admits taking it when he was ill), but it soon swept the sports world, especially professional cycling. A huge black market for the drug thrives in both Europe and the United States—a black market fueled by shameful covert appeals to athletes by the drug’s licensed makers, including Johnson & Johnson. EPO is a problematic drug for professional sports because it is undetectable by urine tests, the most common form of drug testing. And because EPO exits the bloodstream in just a few days, even sophisticated blood tests can’t provide evidence that an athlete was using EPO during his training. “Athletes don’t tend to stay on EPO year-round,” explains Dr. Michael Ashenden, a leading expert on doping in sports. “The athlete typically has to take EPO for three or four weeks to gain the maximum advantage, and once they’ve raised their blood-cell mass, they can lower it to maintenance level. So, if the athlete is cunning . . . there is no way to detect it even though they still have the benefits from it weeks later.” Also, diluting EPO with a saline solution can make it difficult for blood tests to detect its presence even in an active user. “A racer who gets caught by doping control is as dumb as a mule,” said Willy Voet.22 (Although not nearly as dumb as the staff member of a cycling team who gets busted with a whole cooler of drugs.)
Armstrong says that persistence is his secret formula for success, not an EPO drip bag. “What am I on?” Armstrong asks in a Nike commercial that ran in France. “I’m on my bike six hours a day. What are you on?”
Whatever the truth about Armstrong, peering into the high-stakes world of professional cycling further helps to illustrate the powerful logic behind cheating in today’s winner-take-all sports world.
If you’re an aspiring top cyclist, you quickly learn three central facts about your profession. One, many cyclists cheat, especially in Europe. To not dope while competing in Europe is akin to playing by your own rules, rather than the prevailing rules of the sport. “A lot of us were really naïve when we first went to Europe,” says Marty Jemison, who raced on the Postal Service team with Armstrong in the late 1990s. What the American riders discovered was that there was “a race within a race” as the European cyclists and their doctors tried to manipulate human physiology, often with newly created designer drugs. “It changes every year. Every year is a learning experience of medicine,” says Jemison.
The second fact you learn as a cyclist is that it’s very difficult to get caught if you’re using EPO and other drugs. Cycling has drug-testing practices that look tough, but actually getting caught is a different matter altogether, given the difficulty of detecting the drugs. Cyclists see both their own personal doctors and the team doctor, says Jemison. “Your team doctor might not know what your doctor gives you. They won’t question something if the results are good. They can be fired if a team doesn’t perform. So maybe their job is to make sure that someone just doesn’t test positive.”
The third fact you learn is that the rewards of using drugs and winning can be astronomical. Twenty years ago, a top American cyclist would be regarded as an oddity and yet manage to make a decent living. Now cyclists can make a huge amount of money through lucrative sponsorships and endorsements. While Lance Armstrong’s annual income of $15 million is an extreme case,23 a few other top cyclists also rake it in—as long as they consistently win races. No sponsor wants to back a loser. In contrast, merely average cyclists struggle just to keep riding. “There are very few people in the cycling world getting rich from it,” says Gerard Bisceglia, who runs USA Cycling, the leading association of cyclists. “The top rider makes more than the next five combined.”
Professional cycling is a murderous sport, and it’s not uncommon for cyclists to ride 20,000 miles a year as part of their training. For all professional riders, but especially those at the bottom, the sense of insecurity and pressure is unrelenting. “You know you have to ride well, that’s why you train so much,” says Frankie Andreu, who has raced with Armstrong in Europe.
Many aspiring pro cyclists don’t have a whole lot to fall back on, as is true of many other athletes. Their intense years of training may have pushed aside any serious college education, and they may be essentially without job skills.
Marty Jemison, who has heard the din of the cheering crowds on the Tour de France, never made it to the top ranks of cyclists. These days he runs a small bike-tour group in Utah. “When the screws are turned so tight, it’s like war,” Jemison says of cycling. “It’s survival of the fittest, and if people don’t find their way, they’re out. It’s so competitive.” Jemison says that he did not use drugs himself. “It was a joke among the French about me, that I could have been great but that I ‘rode with water,’” which means riding clean.
The French riders were right, Jemison thinks. “I could have been great, and financially it could have changed my life and made it a lot easier for me now, but you make your own choices. You’re only responsible for yourself.”
THE WINNER-TAKE-ALL phenomenon seen in sports is found everywhere. The pay gap between top heart surgeons and typical general practitioners is huge and much greater now than it was thirty years ago, as is the gap between star corporate law partners and lowly associates, between telegenic stock analysts who hold court on CNBC and the grunts that churn out much of Wall Street’s research, between top scientists who strike gold with the right patent and more average lab rats, between pop stars with a global audience and unsigned musicians who tour in cities like Wichita.
Robert Frank, a Cornell economist, first began noticing the winner-take-all phenomenon in the 1980s. It wasn’t hard to spot this trend, since as an academic, Frank worked in a star system himself. The very top university professors increasingly commanded six-figure salaries in the 1980s even as the average pay of professors inched up only slightly. Looking at other professions, Frank saw these same trends at work. In 1988, he teamed with an old classmate from his graduate days at Berkeley, Philip Cook, and began researching the new pay gaps. The result of their work, The Winner-Take-All Society, published in 1995, quickly became a classic in the field of economics.
Professional cycling illustrates several of the winner-take-all dynamics that Frank and Cook analyze. Television audiences for the Tour de France have expanded rapidly in the past decade, with more fans watching from beyond Europe via satellite TV. Companies sponsoring a top rider like Lance Armstrong—in effect, plastering him with logos—can thus expect to have their brand identity beamed to more people worldwide. Such exposure is more valuable now than it once was, since many companies are global in scope and are competing for market share in many more places than they were a decade ago. An increasingly competitive economy also vastly reinforces the commercial value of someone like Armstrong who achieves a well-known “brand identity” status. If you get that athlete to attach his brand identity to your product, your product has a much better chance of standing out against the noise of an ever-more crowded advertising marketplace. And that is worth a huge amount of money.
In turn, once the value of an athlete’s association is recognized by one company which pays handsomely for that association, this value can feed on itself through “self-reinforcing processes,” as Frank and Cook write. Everyone wants what is hot, and the clamor further increases the value of whatever becomes hot. Through this upward spiral, top performers can create an unbelievable earnings gap between themselves and other performers, even ones just as talented. “[A] small initial advantage can eventually engender a nearly insurmountable lead,” note Frank and Cook.24
Lance Armstrong’s relationship with the U.S. Postal Service shows just how high the value of a single superstar can be in comparison to others. Following his third victory in the 2001 Tour de France, Armstrong renegotiated his sponsorship with the U.S. Postal Service and landed a deal worth $4 million a year. The Postal Service justified this as a savvy marketing move for an institution that’s fighting for its life against UPS, FedEx, and e-mail. “Lance is about perseverance, and so is the Postal Service,” said a spokeswoman, apparently referring to the old “rain, sleet, or snow” promise.25
News of this hefty payout was not greeted warmly among many postal employees. How was it, they wondered, that the Postal Service could pay millions of dollars a year to a bicyclist when many workers who had dedicated their careers to the Postal Service struggled to get by on mediocre pay? “I felt sick to my stomach reading that figure,” one mail sorter said after the deal.26 The average postal worker earns about $39,000 a year, which means that Armstrong is making as much money annually as 102 postal workers combined—workers who cumulatively put in over 200,000 hours a year.
What does Armstrong do for a payout that amounts to over $11,000 a day? Mainly, his labors consist of one simple act. Whenever he dresses for a race—or, as important, a photo shoot—he pulls on a biking jersey that features the Postal Service logo on both the front and back.
Pretty good work if you can get it.
WRITERS LIVE ON an entirely different planet than professional athletes. But here, too, winner-take-all dynamics have reshaped the field and are linked to a rise in cheating.
Jayson Blair is now famous as the lying journalist who triggered an uproar at the New York Times and brought about the fall of its imperious editor, Howell Raines. The story was remarkable in some ways—never had the Times been turned so upside down. But it was otherwise familiar. Hadn’t something like this just happened a few years earlier?
In May 1998, Adam Penenberg, an editor at the online magazine Forbes Digital Tool, found himself frustrated by his efforts to further investigate an incident he had read about in the New Republic, where a fifteen-year-old Bethesda computer hacker had penetrated the security of a company named Jukt Micronics. Penenberg couldn’t find any record of the company’s existence. So he called the writer of the story and asked for more information.
Stephen Glass, a twenty-five-year-old hotshot at the New Republic, seemed happy to help Penenberg out. He gave him the phone number of the company, and also directed him to its Web site on AOL. Penenberg eventually realized that both were fakes. The number was the cell phone of Glass’s brother; the Web site had been created by Glass himself. Jukt Micronics didn’t exist; nor, needless to say, did the fifteen-year-old hacker. A full-blown meltdown ensued at the New Republic as Glass’s long history of deceptions came to light.
Glass has since gone on to more lucrative pursuits. He went to Georgetown Law School and nailed a marketable degree. And his journalistic dishonesty was lavishly rewarded by Simon and Schuster, which paid him a six-figure advance to write an autobiographical novel. The timing of Glass’s novel was fortuitous. In May 2003, as Glass was making the rounds on 60 Minutes and elsewhere to promote his book, Jayson Blair was fired from the New York Times for a pattern of behavior very similar to Glass’s activities at the New Republic.
The Glass and Blair episodes each triggered a huge amount of media attention. But most of the analysis in the wake of both scandals failed to offer any compelling answers as to why promising young journalists would possibly fabricate or plagiarize material on a large scale. The only explanation to many observers in the Glass case was that the guy must have suffered from some kind of sociopathology, a view that Glass gives some credence to in his novel, which features a protagonist with an excessive desire to be loved. A more complicated set of motives has been imputed to Jayson Blair. These include not just psychological problems but drug addiction and alcoholism as well. Blair has said that all of these may have played a role. It’s also been alleged that Blair got away with cheating because he was black and was coddled by a newspaper mindlessly committed to affirmative action. (An explanation that Blair has said is absurd.)
Only a few postmortems of these journalistic scandals have focused on the most obvious possible motive for why young, ambitious professionals might take such big risks—to reap big rewards.
Fabrications by journalists are nothing new, nor are conflicts of interest in the media. But while there is no hard evidence that misconduct in journalism has increased in recent years, there are plenty of reasons to think that journalists are facing new pressures on their integrity that stem from a greater focus on the bottom line and bigger pay disparities.
Most people are familiar with the ways in which a growing obsession with profits has undermined the media’s watchdog role and propelled it toward a crasser, less credible focus on “infotainment.” Public trust in the media has fallen, and many journalists share the public’s uneasiness with their profession. According to a study by the Pew Center on the People and the Press, “Majorities of working journalists say that increased bottom-line pressure is hurting the quality of coverage. This view is more common than it was just four years ago.” According to another survey, 63 percent of journalists perceive a decline of ethics and values in the profession.27
Less familiar—and more relevant to the recent spate of plagiarism cases—are the ways in which many journalists are scrambling to score financially. “Everyone in journalism wants to make as much money as the lawyers and various other people they write about,” commented Richard Blow, the former Washington editor of George magazine, who had briefly employed Stephen Glass.28 A generation ago, making big money wasn’t a realistic option for most journalists. Now it’s commonplace—if you can become a star.
In the world of opinion journalism that Glass inhabited, the path to stardom is to write hit pieces that draw attention and heat, ideally at the expense of people more powerful than yourself. Then come the calls from the talk-show producers and lecture agents and book publishers. If you’re lucky, you can end up like Tucker Carlson, the bow-tied Weekly Standard prodigy who now is a co-host of Crossfire and commands big bucks for his various activities; or David Brock, an opinion writer of few scruples who turned himself into a millionaire by attacking Anita Hill and the Clintons, and bought his own townhouse in Washington before the age of thirty; or Michael Lewis, another opinion writer who scored early with his first book and has been commanding fat six-figure advances ever since. To increase one’s odds of making it big, it helps to write a lot of stories at a fast clip—to build up the kind of cachet and brand identity that can be parlayed into high earnings. Stephen Glass was in such a hurry to get big that he overcommitted himself, taking on assignments for Harper’s, George, and Rolling Stone even as he tried to deliver on all his responsibilities at the New Republic.
The path to stardom for newspaper reporters is different, but can be even more lucrative. A reporter with large ambitions tries to get assigned to the big stories of the moment and then do well enough to get noticed and have a wider choice of beats. Eventually he can find a hot story that can be parlayed into a book deal. Blair tried to do this unsuccessfully with the Maryland sniper story, shopping around a book about the case. Many other journalists have done it with far greater success: Thomas Friedman’s book on the Middle East, From Beirut to Jerusalem, catapulted him into the ranks of America’s most successful journalists and launched him on the road to being a millionaire. Newsweek’s Michael Isikoff stumbled on the hot story of a lifetime when he became closely involved in the Clinton sex scandal—and cashed in his experiences for a huge book advance. Joe Klein’s coverage of Bill Clinton’s presidential campaign gave him the insights he needed to write Primary Colors—and rake in a fortune as a bestselling author.
Thirty years ago Bob Woodward and Carl Bernstein became very wealthy young men when their Washington Post articles on Watergate helped bring down President Nixon. Their sudden transformation was something of an oddity, and other journalists marveled at their riches and newly lavish lifestyles. Today this kind of thing is far more common. “Journalism didn’t used to appeal to people who wanted to become famous,” explained Charles Peters, editor of the Washington Monthly. “Now you’ve got people drawn to Washington who used to be drawn exclusively to New York or L.A.—Washington journalism has become another path to becoming famous.”29
Yet for all the new examples of journalists striking it rich, the financial situation for most journalists has actually grown more precarious over the past two decades. Writing opinion commentary has always been a dubious career choice, but it is even more so today given the high cost of living in places like Washington, D.C., and New York City. A few decades ago, an intellectual with a clever pen could take solace in low housing prices and manage to make ends meet. For example, in the 1960s my father bought a six-bedroom house with a river view in a prime New York suburb on the salary he earned as an associate editor of Commonweal, a small weekly opinion magazine. Now that same house is out of reach even for a young corporate lawyer. The only opinion journalists who live well in today’s America are among the lucky few who either have “broken out” or have a source of income unrelated to their job, like a trust fund or a rich spouse.
Even journalists at top news organizations often are barely able to afford to live in the cities that they cover. For example, a reporter at Time magazine with seven years’ experience can expect to make roughly $75,000, while a reporter who’s been at the Wall Street Journal for four years is probably pulling down around $55,000. This kind of money doesn’t go very far in New York City. I know one journalist who moved to New York to take a plum job in television news, only to find that she couldn’t afford an apartment on her salary. Lucky for her, she managed to bribe a building superintendent to get a rent-stabilized place. Reporters at other top publications like the Washington Post, the Los Angeles Times, the Boston Globe, and the San Francisco Chronicle are paid well compared to reporters working for smaller papers, who have a median salary of $40,000 a year. Yet all of the big cities have become a lot less affordable over the past decade.30 At the same time, many more young journalists are starting their careers with large loads of student debt incurred by master’s programs in journalism—an expensive credentialing process that’s become more common as competition has increased for the better jobs in journalism.
In short, journalism has become yet another winner-take-all arena over the past two decades. So has the realm of book publishing, where name-brand authors can make millions of dollars writing a book every year or two—while merely ordinary authors find publishers less willing to invest in the careers of “mid-list” writers. Not surprisingly perhaps, book publishing, too, has been beset by an unprecedented number of plagiarism cases in recent years.
Winner-take-all trends may not be the only explanation for the cheating of Jayson Blair and Stephen Glass, or the recent ethical problems of other journalists over the past decade, including Mike Barnicle, Michael Finkel, Ruth Shalit, Rick Bragg, Patricia Smith, Monica Crowley, Mike Hornung and a dozen others.31 Nor may these trends account for the spate of scandals involving book authors in recent years, including Stephen Ambrose, Doris Kearns Goodwin, Michael Bellesiles, and Brian VanDeMark. In each of these cases, there are different possible explanations for wrongdoing: a psychological breakdown, a sloppy research assistant, an erroneous pasting of words from a Nexis file, a political ax to grind. It also may be that the flurry of so many scandals involving ambitious journalists and authors in recent years is pure coincidence, that it has nothing to do with the bigger rewards now being dangled before writers along with the greater economic insecurities they face.
Maybe. But I suspect otherwise.
IT’S INTUITIVE THAT as the rewards at the top become bigger, more people will do anything to get to the top. What is less easy to understand is the pressure and anxiety felt by those people who, by most measures, are doing very well already. The precepts of laissez-faire ideology suggest that inequality is okay on principle, but especially okay if everyone is getting richer. Yet in the real world, big pay gaps can have a corrosive effect on the integrity of even those people who should be extremely grateful for all the money they are earning.
Consider a rookie ballplayer. He earns $200,000 a year, mainly for sitting on the bench. He is a legend back in his hometown and should consider himself a lucky man. Unfortunately, when he compares himself to other people, he’s not thinking about the guys he played ball with in high school who are now auto mechanics and accountants. He’s more likely to look up than down—comparing himself to the multimillionaire celebrities he works with every day. As he does, he may not feel so lucky. He might even feel poor.
The reason for this is that most human beings think about their well-being in terms relative to those who share their immediate community, as Thorstein Veblen pointed out a century ago in The Theory of the Leisure Class and as Robert Frank has discussed in some detail in his book Luxury Fever. Absolute well-being doesn’t matter as much as it should. Most of us would rather earn $100,000 a year in an organization where nobody makes more than $90,000 than make $110,000 at a job where all our colleagues are paid $200,000. We’d feel better about ourselves if we owned a ’97 Toyota Camry in a neighborhood where everyone else drives ’88 Honda Civics and ’90 Mazda Protégés, than if we owned a brand-new Camry in a neighborhood filled with Jaguars and Mercedes.32
The notion that people worry more about their place in the economic pecking order than about the size of their paycheck has found support in research exploring the interplay of money, hierarchy, and happiness. Studies by biologists and health researchers also suggest that being in a subordinate position can do a hatchet job on your self-esteem, leave you chronically stressed out, and undermine your physical health. A famous long-term study of thousands of British civil servants found that lower-ranked employees died earlier—even when researchers controlled for diet and personal habits like smoking. Stress and “low job control” appeared to explain the difference in mortality rates.33
Concerns about relative position are not simply the product of envy and other shallow emotions. We compare ourselves to others for very good reasons. If you’re wearing a $500 suit and another job applicant sports a $1,000 suit, both of you are wearing nice suits. But the other guy may have an advantage, all other things being equal. If you’re living on a street where everyone has 10,000-square-foot mansions and your home is 6,000 square feet, you and your neighbors all have plenty of room. But when you throw a party and invite lots of professional acquaintances, they’ll see that you’re the poor person on the block. Maybe they’ll be less likely to think you’re a rising star after all, and won’t throw venture capital, or big contracts, or whatever, your way. And God help the would-be mogul who shows up in Jackson Hole on a commercial flight.
Frank and others argue that, ultimately, anxieties about relative position reflect evolutionary imperatives shaped by a long human history in which small advantages over others translated into a better chance to survive and reproduce. “There is compelling evidence that concern about relative position is a deep-rooted and ineradicable element of human nature,” Frank writes.34 So go ahead, feel sorry for that baseball benchwarmer pulling in paychecks bigger than anything most of us will see in our lifetime—but who shares a locker room with guys who make $10 million a year.
Feel sorrier, though, for the sales manager at a Banana Republic who can barely make ends meet at a job selling expensive clothes to young professionals who make five times what she does. Worries about relative position are most wrenching when people are hurting economically and when competitive emotions are mixed with survival instincts. This is exactly the situation for tens of millions of Americans who were bypassed by the boom—yet see its fruits displayed before them every day.
THE FALL OF TRUST in the United States over the past forty years has long been discussed and debated. It is well known that Americans trust nearly every institution less than we used to. We’re less trusting of government, less trusting of the media, less trusting of religious institutions, and less trusting of lawyers and other professionals.
The falling trust in various professions is especially notable. Americans are more fearful of being ripped off, misled, or otherwise cheated by the people who charge us money for services or whom we rely upon to advise us on key parts of our lives.35
Americans have also become less trusting of each other. In 1960, 58 percent of Americans agreed that “most people can be trusted.” By 1998, only 40 percent agreed with this statement. Every few years for the past quarter century, the General Social Survey (GSS) has asked hundreds of Americans a telling question about trust: “Do you think that most people would try to take advantage of you if they got the chance, or would they try to be fair?” When the GSS first started asking this question in the 1970s a large majority of Americans didn’t fear being cheated. But such fears increased during the 1980s, and by the late 1990s, nearly as many Americans thought that most people would try to take advantage as thought that most people would try to be fair. Sixty percent of Americans now say that “you can’t be too careful in dealing with people.”36
Distrust is obvious fuel for cheating. If you think people are out to cheat you, you’re more apt to believe that rules don’t really matter and that you’ve got to live by your wits as opposed to ethical principles. You may imagine for self-protective reasons that you need to cheat others before they get a chance to cheat you.
For all the talk over many years of rising distrust, only recently have scholars begun to make the link between inequality and distrust. The notion of such a link rests, in part, on common sense. If you don’t see yourself as doing well economically in relative terms and if you think the system is stacked against you, it’s easy to be pessimistic and resentful. In contrast, feelings of trust are associated with optimism about the future and goodwill toward others. In his book The Moral Foundations of Trust, scholar Eric Uslaner used a variety of opinion surveys taken over the past several decades to examine how and why people trust others. He writes: “If you believe that things are going to get better—and that you have the capacity to control your life—trusting others isn’t so risky. Generalized trusters are happier in their personal lives and believe they are masters of their own fate.”37
It’s not easy to feel like you can control your life in America’s postindustrial economy. In a winner-take-all market plagued by stagnating wages, downsizing, and rising prices for key life necessities like health care and housing, many people have good reasons to be pessimistic and resentful. Polling during the boom periods of both the 1980s and 1990s showed that even as the economy grew by leaps and bounds, many people didn’t believe that their own incomes would rise. More than half of Americans consistently said that they weren’t making enough money to lead the life they wanted, and many didn’t see such money in their future. Instead, large percentages of Americans worried that their own financial situation might well deteriorate and also worried about their children’s prospects. Upward of half of Americans, for example, felt that their children would be worse off than they were.38
The prevalence of such views might be understandable during a prolonged recession. But it’s remarkable that so many Americans would feel this way even as the nation as a whole grew wealthier than ever before in history. The vitality of the U.S. economy was among the most celebrated aspects of American life during the late 1990s when the American economic model had conquered the world, when kids in their twenties could make millions of dollars, when unemployment had fallen to a thirty-year low, and when some observers were predicting an end to the business cycle, with its booms and busts. Self-congratulation was the mood of the moment, as it had been in the mid-1980s, when it was “morning again in America.” And yet up to half of Americans during both of these periods felt they weren’t earning enough money and their kids would be worse off than they were.
Why did so many people believe this? Because it was true. Most of the gains from the boom went to the top 20 percent of households, while many households lost ground. Being a middle- or lower-income American during the ’80s and ’90s was akin to sitting through a long and rowdy victory party—when you’re from the losing team.
The divisive effects of inequality have been further aggravated by the way in which large income gaps have pulled American society apart culturally and geographically, slicing it up into different groups that have little in common. The size of our paychecks determines where we live, what we wear, what we drive, what beer we drink, what kinds of restaurants we choose, what we watch on television, where we work out, where our children go to school, where we vacation, what hobbies we engage in, and much more. All of these features of our lives help shape if not our own class identity, then certainly the class labels that others put on us. Dramatically uneven levels of income result, inevitably, in big disparities in class identities across a society. Since most people feel more comfortable around those who are having a similar life experience, the divisive potential of these disparities is obvious.
Class is nothing new to Americans, even if we’ve tended to deny its existence here. But the intensity of class divisions has waxed and waned over two centuries, along with levels of inequality. Class was a powerful aspect of American life in the Gilded Age, and again in the 1920s, two periods that were characterized by many scandals. The middle decades of the twentieth century saw these divisions fade substantially. The 1940s through the 1960s have been called the “Great Compression,” because of the dramatic narrowing of income gaps that occurred during this period. During these prosperous decades, all classes of Americans got richer at roughly the same rate. Narrowed income gaps, in turn, produced one of the most socially egalitarian eras in American history (although this was also an era marked by pervasive race and gender discrimination).39 The nation’s imagination was captured by the ideal of a universal middle class that could, and should, encompass everyone. Identical suburban homes stand as an icon of the early postwar period for good reason.
The egalitarian mood of the day was reflected in the executive suites of corporate America, where top executives understood that they would not be granted salaries that too greatly dwarfed those of average workers. Sociologists and economists speak of the “equity norms” that prevailed in business in the early postwar period. In 1965, CEOs made on average fifty times more than the typical worker. While large, this gap is nothing compared to today, when CEOs make nearly 300 times what the average worker makes. Before the 1980s, pay gaps were still small enough that many CEOs didn’t imagine themselves as some separate imperial breed of leaders. For example, members of the Harvard Business School Class of 1949, a third of whom became CEOs, exemplified the everyman sensibility of yesterday’s corporate leaders. Most of them lived in comfortable but modest homes and frowned on conspicuous consumption. They drove average-priced cars, and saw themselves as lucky to have the opportunities that they did.40
Not surprisingly, the early decades of the postwar period were a time of enormous social solidarity and trust. The early postwar years were also an era of comparatively little cheating in business and other sectors of American society, which makes sense. The middle class perceived that the social contract was delivering on its promise and felt respectfully treated by those higher on the economic ladder. The rich, in turn, were not living radically different lives than the middle class and weren’t able to spin off easily into a separate moral reality governed by its own rules.
That was then. As the income differences among Americans have grown larger in recent decades, so have social differences. The enduring correlation between ethnicity and income aggravates the problem, piling ethnic and cultural differences on top of class differences. Looking at each other across the chasms of class and race, many Americans see little reason to believe that they share each other’s values—and little reason to trust each other. “Trust cannot thrive in an unequal world,” writes Uslaner. “People at the top will have no reason to trust those below them. . . . And those at the bottom have little reason to believe that they will get a fair shake.”41
In the past decade, geographic divisions among Americans by income have become especially noticeable. Many working-class people find themselves priced out of the areas where they grew up, yet their services are still needed in these communities. And so you see car mechanics, garbage collectors, and police officers driving long distances every day to work in towns or cities that used to be affordable to blue-collar people. This kind of thing—residential segregation by income—has deepened as inequality has grown in the past quarter of a century.42 Much residential segregation is explained by the simple fact that people live where they can afford to rent or buy a place. But more deliberate self-segregation by the affluent is on the rise. Wealthy enclaves have always existed—places like Beverly Hills and Palm Beach and Sutton Place—but they tended to be small in size and unusual. Now, homogenous communities of affluent and semi-affluent Americans are both more numerous and larger in scope. In the early 1970s, there were roughly 2,000 private “gated communities” in America where access was restricted to members and visitors. Today, there are more than 50,000 such communities. Some seven million households now live in gated communities, and 40 percent of new homes built in California are in gated communities. “What is the measure of nationhood when the divisions between neighborhoods require guards and fences to keep out other citizens?” ask Edward Blakely and Mary Gail Snyder in their book, Fortress America. “Can the nation fulfill its social contract in the absence of social contact?” Blakely and Snyder, along with many other social observers, answer an emphatic no to this question.43
WHEN I FIRST STARTED investigating cheating, my guiding assumption was that nobody wants to cheat. I still think that. No athlete wants to pump his body full of drugs that shrink his testicles, or change the shape of his head, or could turn his blood into molasses and leave him dead halfway through the Tour de France. No stock analyst wants to go on CNBC and hype a stock that every insider knows is a piece of junk. No chief financial officer wants to cook earnings reports, and no accountant wants to rubber-stamp these reports. No journalist wants to make up her sources.
But when you look at the effects of inequality in our society, you can understand why respectable people consistently do all of these things. The winner-take-all economy has loaded up the rewards for those who make it into the Winning Class, and left everyone else with little security and lots of anxiety. Inequality has also pulled us apart, weakening our faith that others follow the same rules that we do.
Unfortunately it gets worse. Two decades of change in American economic life—and a steady string of victories for laissez-faire ideologues—haven’t just shifted the financial incentives for individuals or the operating strategies of business organizations. They have deeply affected American culture overall, reshaping nearly everyone’s values.
And not for the better.