CHAPTER 7

Take Care of Yourself

A tenured professor at a leading business school met with his associate dean for a performance review and to discuss the following year’s salary increase. Now, to be clear, universities are not the most cutthroat, short-term-performance-oriented places, particularly compared with most businesses. Plus, the money sloshing around the richest, most prestigious universities reduces the politics and interpersonal competition even further.

Their discussion focused on why, given this individual’s consistently high ratings on teaching and research performance, his raise was so small. The answer: business schools, including this one, pay higher salaries than most other university departments, and to preserve some semblance of compensation equity across the campus, business school budgets earmarked for salary increases are constrained by the central university administration. Translation: raise dollars are scarce. The associate dean then explained that the business school needed to use those scarce salary resources to retain the people who were going to be the most crucial to the future leadership and well-being of the school. The message, stated almost but not quite this bluntly: regardless of your outstanding record and past contributions, you are, because of your age and career stage, the past, not the future. Like most workplaces, this one, too, needed to invest in the future, and therefore it intended to allocate scarce raise dollars accordingly.

You may think your employer owes you something for your past contributions and good work—but most employers don’t agree. Whether it is paltry raises, painful rounds of layoffs, or cost-cutting moves to open-office plans, companies, and, for that matter, nonprofits and government agencies, look after themselves and their own interests to ensure their survival and prosperity. So a senior editor of a major publisher was laid off without warning, even though he had built the business over many years and made millions or maybe tens or hundreds of millions of dollars for his employer. No matter. As we saw in the last chapter, nice speeches and noble sentiments notwithstanding, leaders mostly take care of themselves first—and maybe second and third, also—regardless of what they are supposed to do. The obvious conclusion: you should do the same.

If you hold the expectation that your hard work and good efforts are invariably going to be appreciated, acknowledged, and rewarded by your employer in perpetuity, it’s time to get over yourself. If you believe there is some implicit bargain—your doing good work and remaining loyal in exchange for reward, recognition, or even continued employment—don’t expect that bargain to be honored. Data show that companies violate implicit contracts with their employees all the time. For instance, one study of 128 alumni from a graduate management program asked people at graduation and then two years later whether their employers had fulfilled the reciprocal obligations of the employment relationship. Some 55 percent of the respondents reported that the implicit promises made during recruitment and thereafter as they joined their employers had been breached.1

Nor is this emphasis on “what can you do for me in the future?” and the breaching of implicit understandings confined to the relationship between employers and their employees. When Kyle Hardrick, a basketball player for the University of Oklahoma, injured his knee, not only were he and his family stuck with medical bills, but he also had to find the money to pay for his college tuition, because his Oklahoma scholarship was not renewed.2 This sort of thing happens all the time. According to the National Collegiate Athletic Association (NCAA), while schools cannot revoke a scholarship because of illness or injury, “many have players on one-year, renewable scholarships that the schools can opt not to renew.” And most schools do, in fact, drop financial support for athletes who become unable to play, regardless of their past contributions to the school’s athletic program or whether the injury was incurred helping the school’s team.3 Colleges, just like employers, spend their scarce resources on the individuals who can help them succeed in the future—and those are certainly not people too injured to play again. And, of course, teams in professional sports drop or trade players all the time if they are no longer perceived to be worth the cost—again, regardless of what those players may have done for the team in the past.

The logical conclusion from systematic data and countless cases in multiple environments, ranging from college and professional athletics to corporations to universities: relying on the good behavior and positive sentiments of work organizations for your career well-being is singularly foolish.

This should not be news to any sentient person. For more than forty years, companies in the United States have been telling their employees that the company owes its people (at most) interesting and challenging work that will help those individuals hone their skills and make or keep them marketable; in other words, the company’s obligation is to help employees become better skilled and ready to compete in the labor market. But both human resource departments and senior management constantly reprise the refrain, often found in formal written policies and employee handbooks, that individuals are employed “at will” and can be fired for any or no cause.4 The company does not owe its individual employees a job, regardless of their job performance, nor, for that matter, any consideration for their past loyalty and service.

There are, of course, a few small exceptions to this pattern. If you happen to be one of the small minority of people covered by a union collective bargaining agreement, you will have greater employment security and, more important, access to greater levels of due process should your employer decide to get rid of you or in other ways treat you inequitably. If you are employed in some countries in Europe that have yet to follow the United States completely in its labor market regulations, or lack thereof, you will have some job protections. And protection against arbitrary dismissal varies by state, with California providing somewhat more protection than other states, for instance. Some companies have voluntarily installed internal systems to ensure fair and equitable treatment of their employees. The military tends to recognize those who have served with honor and, in many cases, have made substantial sacrifices. But to be clear, the military’s honoring of past contributions arises in part because not to do so would be to discourage others from joining and staying and making similar contributions to the common good. And all that being said, even the military does reductions in force, its version of layoffs, that leave career service members scrambling for new jobs.

For the typical work organization, or for the typical individual in a social relationship, the question posed by the counterparty is typically not “What have you done for me in the past that deserves repayment?” but rather, “What can you do for me in the future that warrants spending any time or resources on keeping you happy or keeping you at all?”

This state of affairs is sensible and inevitable for the most part. Organizations, both for-profit and nonprofit, do—and probably should do—what is required to ensure their continuity and success. That includes shedding employees at the drop of a hat to maintain their solvency, or sometimes simply to increase their stock price or to keep investment analysts happy.5 Workplaces are mostly—and there are obviously notable exceptions—not what many people apparently seek from them: communal settings in which people take care of each other, provide economic security and social support, and possibly even provide meaning and purpose from the work people do. Of course a few organizations—the best-places-to-work lists are a good source for many of them—do all these things. But don’t count on your place of employment being one of them. Unfortunately, most people are understandably reluctant to heed this message.

RECIPROCITY: MISSING FROM THE WORKPLACE

On the phone, a former student of mine is on the verge of, and then in, tears. She needs to work for the money and is about to be fired by her current employer. Her sin? Communicating too directly and, mostly, not being subservient or obsequious enough with her new boss.

Yes, there is a new boss. The person who hired her for her marketing acumen and appreciated her work was, as they say, “re-orged” out of his role and is now in an “individual contributor” position, which means he can no longer be of much help to her. My question to her (which, admittedly, I put much more politely than I do here): “What were you thinking? If you need to work to support yourself, you need to (a) always, and I mean always, be looking for new jobs, both within your current employer and on the outside, so you have options available to you at all times, and (b) be constantly working on your relationship with your new boss. No one can count on stability in companies these days, so you need to be prepared.” Her reply was that her intelligence, conscientiousness, and great marketing analysis and insight, qualities that had earned her positive feedback, particularly from the person who had hired her, would permit her to keep her job. Her job performance would protect her from the many vicissitudes visited upon employees by jealous and insecure leaders. Bad bet.

For the most part, the leadership industry either presumes the existence of or advocates for benevolence. Leaders are expected to be virtuous and responsible, so that organizational members can trust and rely on those leaders to take care of them and to look out for their interests, so long as those employees do their jobs well. But few leaders, and few organizations, live up to these prescriptions. We need to understand why.

What Happened to the Norm of Reciprocity?

At this point, you might be confused. On the one hand, a lot of evidence confirms that companies and universities do not repay the hard work and long service of their employees or, for that matter, their athletes. But on the other, there is this norm of reciprocity, a moral obligation to repay a past favor, a norm argued to be universal across societies.6 Because reciprocity norms help solve the universal problem of ensuring cooperation among people, some scholars argue that there are evolutionary advantages that help explain reciprocity’s emergence and humankind’s almost mindless adherence to it.7 What gives?

What gives is actually pretty simple. When people are in an organizational as contrasted with an interpersonal setting, they feel less obligation to repay favors and, in fact, are less likely to do so. Peter Belmi and I did a series of experiments demonstrating this effect.8 Two of the studies involved scenarios in which people imagined that another individual had done something nice and unsolicited for them—in one instance in a personal context, and in the second in an organizational setting. The study’s participants were then queried about how obligated they felt to repay the favor. So in one study, people imagined that another person had invited them to dinner and picked up the check. In the second study, people imagined they were coming back from a trip and someone from work (or, alternatively, just a friend) volunteered to pick them up at the airport. In both cases, people felt less strongly motivated to reciprocate the favor when that favor had occurred in an organizational rather than a personal context.

Of course these are only hypothetical scenarios. What about actual behavior? When we conducted a study in which someone else in the experiment gave the study’s participants lottery tickets for a drawing in which they could win a prize, we observed that the participants reciprocated with fewer lottery tickets when the favor was done in an organizational setting than when the identical situation unfolded under a personal context condition. In organizational settings, people feel less obligated to reciprocate favors. And that is what we found in yet another setting, one in which people were paid more than they had expected to receive, a payment described explicitly as a favor, and then asked if they would volunteer to do some additional work. Once again, people were less likely to reciprocate in organizational settings than they were in more personal, informal settings.

There are many differences between personal and organizational or work settings that help account for why the norm of reciprocity operates with less force in the workplace and, as a consequence, implicit agreements are more readily and frequently breached. First of all, the norm of reciprocity speaks to the moral obligation to repay favors done for you by others. It is far from clear, as the social psychologist Robert Cialdini mentioned in a conversation, that the employment relationship engenders similar moral obligations. Yes, employees may have worked hard for their employers, investing their careers for years in helping to build the organization’s success. But the employees got paid for that work, and so it would seem that the act of being paid for one’s labor means that the employer owes its people nothing else. There was an exchange—money for labor. In a presumably free, competitive labor market in which both sides of the transaction have choices, the fact that the transaction was consummated means that both sides felt the deal was fair. In other words, it was a fair exchange, but not one pregnant with ongoing moral obligation.

Second, organizational contexts promote both a more future-oriented and also a more instrumental, calculative, and transactional orientation on the part of people. In everyday social relationships, people expect fair treatment and favors to be reciprocated. But in work settings, things become a lot more calculative. Specifically, people make more evaluations of whether or not coworkers and superiors could be useful in the future; likewise, people show more concern with the future than with repaying past kindnesses.

These studies reinforce what commonsense observation suggests: workplaces are primarily instrumental, calculative settings largely free of moral sentiments and even normative constraints. In a nutshell, companies will treat you well as long as you seem as though you are going to be useful in the future, and companies will probably be less inclined to treat you well or to repay past contributions the minute you are perceived as being less useful in future endeavors.

WHY PEOPLE PUT THEIR FAITH IN LEADERS

Notwithstanding the foregoing, people nonetheless often seem to put their faith in their employers and their leaders. Which leads to the question: Why?

One answer comes from the classic 1941 book Escape from Freedom by the Frankfurt-born psychologist Erich Fromm.9 Fromm sought to understand why people voluntarily embraced authoritarian regimes such as Hitler’s Germany. He argued that when people are freed from the restrictions placed on them by institutions or other individuals, not everyone experiences this freedom as a positive thing; in fact, many find the absence of constraint uncomfortable. Fromm suggested that some of the ways individuals minimize the negative feelings associated with freedom include conformity, destructiveness, and authoritarianism.

Jean Lipman-Blumen, a professor at Claremont Graduate University, developed and expanded much of this analysis in her insightful book on toxic leaders.10 She wanted to understand a seeming paradox: on the one hand, numerous toxic leaders in all walks of life lead their organizations to calamity; on the other, followers eagerly go along, subordinating themselves and their judgment to the most outlandish of follies. Those follies range from the catastrophe at Enron to the WorldCom financial fraud and meltdown orchestrated by Bernard Ebbers, an individual with a degree in physical education who went from running a chain of motels to assembling what was ostensibly one of the largest telecommunications companies in the world—except, of course, that it was built on financial chicanery.

Lipman-Blumen’s analysis of why people follow toxic leaders provides important insight into why people seek leaders in their lives at all, hoping, of course, to get a good one rather than one of the more toxic, harmful variety. One such motivation is the need for security. Autonomy seems great until you have it, and then many people want the reassurance and even guidance that comes from belonging to a larger entity, such as a workplace, that will provide at least some minimal sense of security. Lipman-Blumen, echoing Freudian analysis, notes that as children we are used to being looked after and cared for. As adults, we may miss that quasi-parental caregiving and therefore seek care from other, nonparental authority figures such as bosses.

Leaders fill other needs as well, as Lipman-Blumen discusses. People are social creatures, which is why one of the first things done to break prisoners of war—or for that matter, prisoners in general—is to isolate them from social contact. We are afraid of being ostracized, of being excluded from the group—something that accounts for a lot of teenage behavior, including teen bullying. We can seemingly avoid ostracism by joining a group with a strong leader that includes and incorporates us.

Lipman-Blumen also notes that people respond particularly well to leaders who make them feel special, part of an elite, distinguished, unusual group. Certainly that was a theme at Enron—the phrase “the smartest people in the room” stands out—but it also characterizes other workplaces as well. People love to feel good about themselves—the self-enhancement motive—and if a leader feeds their individual or collective egos, people may not ask too many questions about the leader’s own behaviors and motives as they bask in the warm glow of feeling special.

Finally, people are motivated to avoid cognitive dissonance—having two discrepant, inconsistent ideas in their minds at the same time. They can do so by cognitively reevaluating one or possibly both ideas to make them more congruent. Cognitive dissonance runs rampant inside workplaces. The idea that I have joined and voluntarily remain in a place, and the idea that the place I am in is run by some incompetent, venal, mean individual, are two highly discordant thoughts. It is often difficult to change the reality of my joining and remaining in my present place of employment. It is much easier to change my perception of the leader and my workplace, deciding that they are actually wonderful and special, which is why it was so sensible for me to join and remain in the first place.

Doing justice to the many ways in which people conspire in their own deception, including deceiving themselves about their leaders and the organizations in which they work, would require a book in itself. Suffice it to say that all of the dynamics described here provide a good beginning for understanding why people trust and put their faith in leaders without much critical thought, and why they are often unlikely to look too closely at what sort of leaders they actually have.

THE RISKS OF RELYING ON LEADER BENEFICENCE

On Saturday, June 22, 2013, an Anchorage newspaper reported that “local employees of a men’s clothing chain store put all sales on hold . . . in protest of the abrupt firing of the store’s founder and executive chairman.” Yes, employees at the Anchorage, Alaska, Men’s Wearhouse store walked off the job to express their displeasure with the board’s removal of George Zimmer from his role as chairman of the board. The employees stopped picketing and went home when the new management threatened to call the police.11 “‘We’re avid supporters of George . . . he’s been there for all of us,’ said Tiffany Karling, Assistant Manager . . . ‘He’s given us a family feeling . . . and he’s made a big impact on me for sure, personally.’”12

It is hardly surprising that these employees appreciated Zimmer and the culture he had created. After all, they worked in retail, a typically low-wage industry. In 2014, retail clerks and sales associates earned just 60 percent of the median hourly wage. Moreover, retail is filled with part-timers whose work hours (and therefore pay) are at the whim of scheduling software algorithms that ensure both that there aren’t too many people in the store for the anticipated customer traffic and that people don’t get enough hours to qualify for benefits. Retail employees struggle to earn a living wage even if they manage to work full-time, and they are the people on whose behalf advocacy groups such as Fast Food Forward push for improved job conditions. The Men’s Wearhouse paid higher wages, used fewer part-timers, and offered more training and internal promotion opportunities than other retail chains, and in turn its employees quit less frequently, often because they understood and appreciated the unique retail work environment they enjoyed.

Soon after Zimmer’s departure, the organization’s culture began to change in ways that signaled it would not be “employees first” as it had been in the past. With little warning and no consultation, the world of the fifteen thousand Men’s Wearhouse employees changed—and not for the better. Once again, we have an example of why there is no assurance that the person in charge will remain in charge, and certainly no assurance that this individual will be replaced by another with the same values and qualities.

There is no shortage of similar examples. For instance, when the family-owned business Fel-Pro of Skokie, Illinois, was purchased by Federal-Mogul Corporation for $720 million, employees went from working for a beneficent employer concerned about employee well-being to working for a rapidly growing company concerned about margins and share price. The culture changed in a hurry. The sale of the Wall Street Journal by the Bancroft family to Rupert Murdoch’s News Corporation radically altered the work environment, as evidenced by the exodus of longtime Journal employees.

The realities of estate taxes, liquidity events, private equity ownership, going public, and so forth all conspire to ensure that, except in the case of companies that are completely or majority employee-owned, even if employees enjoy a wonderful leader for a while, continuity in that leadership and culture is far from assured. Stuff happens, as they say. Companies sell out, leaders retire or die, and the new people in charge aren’t the same as the old, particularly in how they relate to and treat employees. SAS Institute has resisted going public because Jim Goodnight, the CEO, is concerned about the effects of public ownership on its employee-centric, family-oriented culture. So long as he holds all the cards, his workers’ loyalty is probably justified, but only insofar as they can be certain that he’ll be in charge forever. Which they can’t be. Which none of us can be.

And even if there aren’t new owners, even the old, wonderful owners aren’t wonderful and charitable 100 percent of the time—this is business, after all, and there are economic realities. Howard Behar, the former president of Starbucks International and a longtime board member of the company, told me when we talked at a Conscious Capitalism conference on April 6, 2013, in San Francisco, why he had left the board of a company where he had worked for decades. During the financial crisis of 2007 and the years following, Starbucks’s sales declined. Although the company remained profitable, its profits fell. So the company, in an effort to maintain its margins and presumably its share price, laid off thousands of employees. Not because it had to, in the sense of maintaining its economic viability, but because it sought to preserve its economic performance, at the cost of many employees’ jobs. Behar did not agree with that decision or with what it signaled about the real values of Starbucks, and so he resigned.

At a panel discussion at the same conference, Chip Conley, the founder of the amazingly successful boutique hotel chain Joie de Vivre Hotels, described why he had sold the chain and left its active management after the recovery from that same recession. Because of the chain’s size, during the economic downturn it was impossible to fully insulate the company’s operations from the financial stringency—it was simply too large to hide in some protected niche. Therefore, Joie de Vivre had carried out a large number of layoffs. The layoffs were hard for Conley, who understood the emotional and financial costs inflicted on the people who lost their jobs. As he noted, there will be other economic cycles, and he did not want to go through the experience of doing layoffs again. Hence the sale and his exit from day-to-day hotel operations.

The point is that even in the best cultures, with the most well-intentioned leaders, the working circumstances, including economic security, are not invariably perfect or permanent. So trusting in the kindness of organizational leaders with their own agendas, points of view, and interests is inherently perilous, even if many people do precisely that.

CREATING LESS LEADER-DEPENDENT SYSTEMS

If leaders are, as everyone I know would admit, inevitably imperfect and impermanent, there are two possible solutions to the problems this fact causes for those who work for such leaders. The first is an approach advocated by many leadership-development practitioners and teachers: do a better job of developing, training, educating, and selecting leaders, so that the distribution of leaders in the talent pool shifts from selfish toward selfless, from incompetent toward competent, from egotistical toward modest, from untrustworthy toward trustworthy—you get the picture.

This is a nice sentiment, and one that animates the leadership industry and its many practitioners, but one that, for all of the psychological and social psychological reasons already covered in this book, is not very likely to work, at least on a consistent basis. There is, however, a second approach worth considering, one that grew out of the quality movement; it’s an approach that helps explain why flying in an airplane has become so incredibly safe.

Whenever there is an airplane accident, or for that matter, a number of near-accidents or other problems, the customary response is to try to redesign the plane to make such problems less likely to occur in the future. This may entail changing the controls or the guidance systems, or increasing mechanical or other system redundancies—in short, doing things that make it easier for the people flying and servicing the plane to do the right thing and more difficult for them to screw up. Such an approach is completely consistent with the principles of the quality movement, which promotes fixing the system rather than relying on the skills of individuals—to produce, in other words, an environment in which ordinary, albeit conscientious, people can reliably produce desirable results.

The lesson of W. Edwards Deming and his peers in the quality movement is that relying on individual motivation and acts of great competence is a singularly unreliable way to produce consistently high levels of system performance. Deming argued that if there are performance problems and quality defects, one needs to understand how those problems arise almost naturally as a consequence of how a system has been designed—and then fix those design flaws. Put simply, attack problems by fixing the system, not by scapegoating the necessarily fallible human beings who work in and operate that system, whether or not they deserve the blame.

Inside organizations of all kinds, there are many ways to redesign governance that would make employee well-being less dependent on the vagaries of selecting and training all-powerful leaders. Such solutions mostly entail building work systems that are less leader-dependent and that devolve more power to a wider set of organizational constituents, particularly employees. Such systems include employee ownership; formalized countervailing power, such as that provided by works councils in some European countries or by unions elsewhere; employment systems with more distributed power in which people elect their leaders, as occurs in some partnerships; and so forth. With more distributed and balanced power, the ability of a single individual to do remarkably good—or remarkably harmful—things is diminished. Interestingly, these approaches seldom get much attention. Instead, we hear more pleas for better leaders—pleas that have produced little improvement in any aspect of workplaces or leader tenure in the past fifty-plus years. But never mind, maybe the future will be better.

In the absence of any sustained movement to create better management and organizational governance systems that rely more on the “wisdom of crowds”13 and less on the hope that one’s leader is better than average and not overly self-interested, it seems sensible to look out for oneself. And as I argue next, systems based on people looking out for themselves have performance advantages, at least in some circumstances, not only for the people working in them but for the system itself.

SELF-INTEREST AS A GUIDING PRINCIPLE

If leaders can be and often are toxic, primarily taking care of themselves; if even beneficent and well-intentioned leaders are frequently unreliable or impermanent as economic circumstances change; and if companies don’t promise much anymore and nonetheless break their implicit bargains, then what are we to do? My answer: try doing precisely what companies have told you to do for decades, and what the fundamental principle of economics has advocated since the time of Adam Smith.

Take care of yourself and assiduously look out for your own interests in your life inside work organizations.

As the social psychologist Dale Miller documented years ago, there is a norm of self-interest—sort of like the norm of reciprocity, but possibly even stronger and more reliable.14 Miller noted that the norm of self-interest has substantial explanatory power, in part because people assume self-interest is operating and thereby take actions that make the norm of self-interest self-fulfilling through their very behaviors.

We know something is a norm because when the norm is violated, there are sanctions, and this is precisely the case for self-interested behavior. For example, in a series of four experimental studies, Miller and his colleague Rebecca Ratner found that people expected to be evaluated negatively if they supported a cause in which they had no stake (i.e., had no self-interest), that people reacted with surprise and anger when they observed others taking action inconsistent with those others’ interests, and that individuals were more comfortable taking action on behalf of some issue if that issue was described as being self-relevant.15 While it is of theoretical and intellectual interest to understand whether self-interested behavior arises from innate proclivities or conformity to social norms, the practical advice that emerges from the copious amounts of research on self-interested behavior is the same: presume that others are acting on the basis of their self-interest, and you will be better equipped to forecast and understand their actions.

I’m sure some of you must be thinking that surely not everyone behaves in a self-interested way, and of course that is true. In fact, throughout this book and in life, virtually nothing about social behavior is true all the time. There are, as I have frequently noted, good bosses who look out for the welfare of others, and humane workplaces filled with engaged and trusting employees. The question is the relative proportions and, to the point of this exposition, how to change those proportions—which requires recognizing the facts and understanding why they are as they are.

With respect to self-interest, as Adam Grant noted in his bestselling book Give and Take, people who are “givers,” those who are generous with their time and with their help of others, are often the most successful in building networks of support and therefore in their careers.16 But Grant also summarized research, including his own, that found that givers were not only among the most successful individuals, they were also among the least successful, and he provided advice about how to be generous without being a patsy. But even more to the point, Grant noted that “in the workplace, givers are a relatively rare breed.”17 Therefore you should not expect to be surrounded by such people. Moreover, research suggests that cooperative cultures are quite fragile, as are cooperation and trust in prisoner’s dilemma games. Once individualistic values come to dominate,18 or once people defect in prisoner’s dilemma situations,19 trust and cooperation are difficult if not impossible to rebuild.

Thus far I have offered what are essentially self-protective reasons for taking care of yourself—acting on the basis of your own self-interest—and assuming that others will probably do so as well. But there are normative reasons that also favor taking care of oneself as an effective paradigm for organizing the workplace and many other social settings. Let me offer three such rationales.

Most people, certainly those in Western societies, favor democracy as a form of government. The United States has spent a fortune promoting democracy around the globe, often unsuccessfully, as in much of the Middle East. Yes, democracy is messy, sometimes slow and inefficient, and prone to, and indeed a progenitor of, electoral politics, with all the good and bad that such a phrase implies. But as Winston Churchill commented, “It has been said that democracy is the worst form of government except all the others that have been tried from time to time.”20 Democracy fundamentally entails people choosing their own leaders—governing by the consent of the governed. Dictatorship is the opposite of democracy, and although history provides examples of enlightened autocrats who produced great social and economic progress for their societies, gambling on getting Singapore’s Lee Kuan Yew rather than Zimbabwe’s Robert Mugabe seems like a risky bet. Most important, much of the discipline of political economy argues that voters and their preferences are governed by calculations of rational self-interest.

Democracy as a form of governance is venerated in governments but not in work organizations, where not only is there precious little governing with the consent of the governed, the employees, but there is also little governing with the consent of the company’s owners, its shareholders, except under relatively rare circumstances. Law firms, accounting firms, management consulting firms, and other partnerships can and sometimes do elect their leaders, as do some worker cooperatives such as Mondragon in Spain and the John Lewis Partnership in the United Kingdom, both substantial economic entities. But such arrangements are rare.

The presumption seems to be that an appointed leadership is more to be trusted with the company’s well-being than any form of democratic or semidemocratic process for selecting and even removing leaders. It is interesting, to say the least, that while many people and much social science theory presume that individual voters, acting at least partly out of their own self-interest, can produce better electoral outcomes than any other system, a different calculus is advocated inside economic entities, some of which dwarf at least some governments in size. But just maybe self-interest is as sensible a motive inside companies, which are, after all, in many respects political entities, as it is in electoral politics.

In economics as well as politics, self-interest reigns supreme as a normative principle for system design. It was the Scottish moral philosopher Adam Smith who articulated a principle often repeated to the present day: “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own interest.” Much of the “mechanics” of economic analysis demonstrates precisely how, when all economic actors seek what is best for each of them individually, production costs are minimized, the right quantities of goods and services are produced, and prices, including the wages earned by labor, are determined fairly. Competitive markets require only that there be many participants and that each participant vigorously pursue its own interests. In such a system, the best possible outcomes occur.

Competition, not just in economics but in many domains of life, in which individuals each seek their own success, is celebrated and put on a pedestal. Typical of the multitude of approving sentiments is that offered by the steel industrialist Andrew Carnegie: “And while the law of competition may be sometimes hard for the individual, it is best for the race, because it ensures the survival of the fittest in every department.”

Societies generally seem to celebrate competition as individuals seek their own advantage. That is, once again, until we consider the internal workings of organizations. So once inside a company, people are not only expected to collaborate rather than to compete with each other but, more important, to cooperate with their leaders (or bosses, to use a less noble term), to voluntarily subordinate themselves to their leaders’ interests and commands. That would be fine if leaders were invariably concerned about the welfare of those individuals or even the well-being of the collective. But as we have seen throughout this book, leaders often pursue their own interests. Consequently, it makes little logical sense for individuals not to do precisely the same thing, and take care of themselves, too.

But perhaps most important, beyond the better decisions and more efficient results that self-interest produces in political and economic systems, there is a third benefit: encouraging individuals to be responsible for their own well-being helps them come to think of themselves as fully functioning adult human beings. Encouraging people to seek and then to follow benevolent leaders—to put their trust and faith in individuals who, because they are human, will invariably be flawed—strikes me as seeking, in some measure, to infantilize otherwise competent working adults.

As we’ve seen, plenty of people actively conspire in their own subordination as they seek the affiliation, self-enhancement, and security that can come from subjugating themselves to and believing in larger-than-life leadership figures. But the fact that many people apparently seek to “escape from freedom” does not mean that such a path is in any way beneficial to them or to the larger social structures, including workplaces, in which they participate.

The bottom line: If you have a beneficent environment and a leader who actually cares about you, enjoy and treasure the moment, but don’t expect it to be replicated elsewhere or to even persist indefinitely where you are. The world is often not a just or fair place, our hopes and desires notwithstanding. Get over it. Take care of yourself and watch out for your interests. If others do as well, all the better. To the extent you develop self-reliance and cease relying on leadership myths and stories, you will be much better off, and substantially less likely to confront disappointment and the career consequences that devolve from relying on the unreliable.