CHAPTER FOUR
The Morality of the Self
PART II OF THIS BOOK, the second half, will describe the new morality of self-fulfillment and explain its co-causal connection to the administrative state. For purposes of organization only, its three chapters move outward from the self in concentric circles, as stated in the Introduction. This chapter deals with the person’s sense of self and individual decisions: formulating a life plan, developing a career, choosing a religion, finding leisure activities, and deciding when to die. Chapter 5, which follows, addresses the person’s intimate and other face-to-face relations. The sixth and final chapter addresses the person’s relations with the wider society.
This structure is derived from the content of the topic it discusses. Beginning with the self and moving outward adopts the modern Western view that treats the self as primary. If people in the Early Middle Ages had attempted to describe the morality of their era, they would probably have started with their social standing, while those in the High Middle Ages might have begun with their relationship to God and the fate of their immortal souls. As stated in the Introduction, there is no neutral starting point, no objective (the Ancient Greeks would have said empyrean, Medieval people would have said divine) perspective from which to view the world. One might imagine adopting a non-Western perspective to achieve neutrality among these differing Western moralities, but even if that were possible for someone from the West, it would only add another set of preconceptions and probably produce more confusion than clarification.
The concept of self-fulfillment, needless to say, lies at the core of self-fulfillment morality. To explicate it, each component of this hyphenated term will be considered separately. This section will discuss the meaning of the self, and the next will discuss the meaning of fulfillment. It will then be possible to consider the ramifications of the self-fulfillment concept in the remaining sections of the chapter.
The self is an important topic in modern philosophic thought,1 but the concern here is social morality, not philosophy. Complex questions involving individual consciousness can thus be set aside, and only the meaning of the concept in High Modern morality needs to be considered. From this perspective, the self in self-fulfillment is, in its essence, a narrative existence, a continuous process that extends over time and is shaped by individual or personal choice. These two elements—temporal development and personal choice—provide the sense of self as agency that is relevant in discussions of High Modern morality. Anthony Giddens uses the term “lifestyle” for this concept;2 Christine Korsgaard describes it as a process of self-constitution,3 and Ulrich Beck and Elisabeth Beck-Gernsheim describe it as a “do-it-yourself biography.”4 It is related as well to Martin Heidegger’s concept of a time-embedded existence, or Dasein.5 For purposes of this discussion, the self will be described as a life path, that is, the succession of chosen steps by which one’s existence in the world unfolds.
Both elements of this concept of self—that is, temporal development and personal choice—can be clarified by contrasting them with the two previous moralities that prevailed in the Western World. The morality of honor and the morality of higher purposes are based on the view that the self is an identity that is largely fixed in time. It exists throughout the duration of the person’s life, a constant feature rather than a developing process. Moreover, this identity is not chosen by the individual but assigned to the individual by the social structure. Honor morality’s hyper-hierarchicalism assigned essentially different identities to people based on their position in society. The morality of higher purposes established the same essential purposes for everyone but reintroduced the idea of separate identities through notions such as the Great Chain of Being and asserted that all people could serve God and the monarchy or nation by performing their distinctive roles.6 Religious doctrine in both eras contributed to this view by asserting that the essence of the self is the soul, which is unchanging and immortal.
The modern morality of self-fulfillment, in contrast, views life as a temporally extended pathway rather than a fixed identity and insists that people choose the path themselves, rather than filling a predetermined position in the social hierarchy. As a result, modern morality represents a shift, in philosophic terms, from substantive to formal standards or, in legal terms, from substantive to procedural standards.7 Both of the Western World’s premodern moralities prescribed the content of the choices that people should make as they lived their lives. The morality of honor demanded that people carry out their established social roles as a matter of maintaining their proper status, and the morality of higher purposes demanded that they do so to achieve identified objectives, such as saving their souls and serving the nation. But the new morality leaves the content of the individual’s life choices open and instead prescribes a mode of decision-making: the demand that one must make those choices for oneself.8 Such a purely formal or procedural command is certainly different from the content-based demands of the other two moralities and just as certainly immoral from their perspective. It is not immoral from some neutral or external point of view, however, and, not surprisingly, it is the way that most philosophers in the period of High Modernity, that is, since Kant, approach the issue.9
These differences lead directly to a difference in the role that others play in the individual’s life. Both previous moralities urged a person to pay assiduous attention to the views of others, in the first case, as an intrinsic matter, because honor depended upon other people’s views, and in the second case instrumentally, because those views enforced the God-given hierarchy. Gregory of Tours, in his account of a bloodfeud that was summarized in Chapter 1, says that Chramnesind felt compelled to kill his friend Sichar after Sichar insulted him because “they will think of me as a woman” if he didn’t.10 Martin Luther declared that a servant who obeys his master is thereby serving God.11 The morality of self-fulfillment insists that people define their own positions and their own pathways through life. Modern writers extol this attitude and strongly urge it upon people as a matter of morality. When Heidegger warns his readers that following “the they” (das Man) will preclude them from an authentic understanding of their own existence,12 it is Chramnesind’s “they” to which he is referring.
A particularly important “other,” as previous chapters have discussed, is the mode of governance that public authorities impose. In the Early Middle Ages, these authorities regarded the individual as a follower of either the king or a noble landowner and imposed various standards of behavior, such as loyalty, that were derived from that relationship. In the High Medieval and Early Modern periods, individuals were regarded as subjects and were expected to treat the ruling monarchy as the higher purpose of their secular behavior. The administrative state that has developed co-causally with modern morality is designed to serve the people, to satisfy their self-defined desires. Thus, as Chapter 6 will further explicate, it imposes no substantive standard of behavior. In fact, it is based on people’s ability to identify their own desires and—at least in the representative democracies that now prevail throughout the Western World—to communicate those desires through the political process.
These distinctions are highly significant, but it is important not to over-dichotomize premodern and modern morality. To say that premodern morality required people to carry out their social roles whereas modern morality demands that they choose their own life paths does not mean that the previous moralities precluded choice or that the new morality encourages irresponsibility. In virtually any society, people exercise some level of control over their lives, and in virtually any society they are expected to carry out some identified set of tasks.13 A Medieval shopkeeper was not forbidden from choosing a different occupation, and a modern factory or office worker is not supposed to wander away from her job if the spirit moves her. The difference lies in the relative amount of choice and obligation that the particular morality demands. The morality of higher purposes urged the shopkeeper to remain in place, while self-fulfillment morality urges the worker to find a new career if she is dissatisfied with her existing one. Another way to understand the distinction is to recognize that differences in morality are often matters of interpretation. The moralities of honor, higher purposes, and self-fulfillment all permit choice, but the former two see it as suspect, while the modern one sees it as desirable. All expect people to fulfill their obligations, but the former ones define obligations as roles established by society, while self-fulfillment morality regards them as commitments one has chosen for oneself. In modern morality, being compelled into a course of action, rather than choosing it, often exempts the person from the obligation to continue it.
Many consequences flow from the concept of the self that modern morality incorporates, but a few examples will suffice. One is a change in the meaning of heroism. In literature, the premodern hero, like Roland, El Cid, Sir Gawain, and Sir Lancelot, is a defender of the established order. A subsidiary but important theme, from Homer’s Achilles through Shakespeare’s Hamlet, is that the hero questions the value of that order but ultimately embraces it, even in the face of death.14 Modern heroes think for themselves. Their acts of heroism—which are quintessentially moral actions—often involve an assertion of their individuality against social convention, whether it is Victor Hugo’s Jean Valjean stealing bread to feed a starving child, dodging the police, and mounting the barricades; Henrik Ibsen’s Nora walking out of the stultifying, chauvinistic doll’s house of her marriage; or Clint Eastwood’s Dirty Harry throwing his badge down on the police commander’s desk and declaring: “I’ve had it with your rules; now I’m going to take care of things myself.”15
Another, rather different consequence involves the morality of aborting a severely impaired zygote or fetus. This is not the question, to be discussed in the following chapter, of whether a woman has the right to control her body in connection with her sexual self-fulfillment. That is a question involving her intimate relationships, whereas here the inquiry involves the concept of the self. The situation is that the woman wants a child, thus obviating matters of sexuality and choice, but according to tests performed in early pregnancy, the child resulting from her pregnancy will be profoundly retarded or doomed to die young. The morality of higher purposes demands that the woman proceed with the pregnancy on the theory that each child born alive has a soul whose higher purpose is to be saved, and perhaps as an emotive if not logical extension of its sexual morality. But the new morality centers on the self, which is the life of a conscious being. This suggests that if a potential human consciousness will not be able to live at least a minimally fulfilling life, if it cannot construct a life path in the most rudimentary sense, it is simply not a self.16 Thus, there is a moral obligation to abort it before it becomes an independent human being, particularly if it will suffer pain once it is born.
Modern morality’s concept of the self as life path can be further explicated by distinguishing it from a number of other ideas that are somewhat related but not necessarily characteristic of modern morality in the same essential way, including autonomy, individualism, and introspection. Linguistic usage encourages us to describe the choice of life path as embodying a principle of personal autonomy, but the term brings with it rather formidable philosophic baggage. This includes Kant’s demand that one’s entire ethical system must be a matter of self-legislation and Heidegger’s notion that the individual should seek existential freedom, an emancipation from ordinary social constraints into a higher realm of consciousness.17 As a matter of social theory, however, people in modern society are just as bound to the prevailing morality as they were in former times. It is given to them by their culture and, like culture generally, establishes the mental framework that defines the meaning of their actions. The difference is that the culture established by the morality of self-fulfillment contains an affirmative expectation that people will choose their own life paths and treats the failure to make such choices, through thoughtlessness or subservience, as a defective way to live. Thus, self-fulfillment is as decretory as prior moralities at the level of culture and is distinctive, as just indicated, only in imposing formal or procedural rather than content-based demands.
It is equally important to be cautious about the term individualism, as discussed in Chapter 3. The premodern people who have inhabited the preceding pages, such as Martin Luther, St. Francis, and even the despicable Sichar, seem indisputably individualized; finding some criterion that would reveal that modern figures are more genuinely individual would be a daunting task. What can be said, however, is that modern morality encourages people to think of themselves as individuals rather than as members of a group.18 It celebrates those who strike out on their own, who exceed expectations or even just reject them, who refuse to be defined by their family, their social status, or their circumstances. The child protagonists in Disney movies nearly always do so, an effect that the screenwriters often achieve by killing off or disabling their parents at the start of the movie. When antagonistic aliens are portrayed in science fiction, they are often Borg or bugs or buggers19 who have one collective mind and no individual initiative. This may be contrasted with that masterpiece of Medieval science fiction, Dante’s Divine Comedy, where the evil beings are those who have disobeyed the word of God.20 Thus, to describe modern morality as valorizing individualism does not mean that it necessarily makes people more distinctive, but rather that it urges them to define their own pathway through life.
A third caution that should be observed is to distinguish the choice of one’s life path from the more general idea of introspection. It is certainly true that choosing one’s life path requires at least an element of introspection and probably true that this is a quality that has become more common, or more central, as Western society has developed over time.21 But the two are not equivalent, and it is not necessary to grant High Modernity a monopoly on introspection in order to identify self-fulfillment as this era’s preeminent morality. Even if one is prepared to assert that Early and High Medieval people were rarely introspective in the sense we use this term, the claim appears implausible with respect to Early Modern figures such as Montaigne, Shakespeare, and Descartes.22 What is being asserted as the distinctive feature of High Modernity is not simply that people are introspective, but that they are introspective about a particular topic, namely, their individual choice of life path. Modern morality does not demand that people be philosophers or playwrights, but simply that they make basic choices about the life they will lead and the role they will play in the society.
The other basic component in the idea of self-fulfillment is, of course, fulfillment. In addition to being chosen for oneself, a life path, as the defining feature of self-fulfillment morality, combines momentary experience with a general pattern. In terms of momentary experience, individuals are instructed to enjoy themselves as their lives unfold, to derive as much pleasure as they can from life as it is being lived. This is Freud’s rule of mental health and constitutes modern morality’s most essential standard of behavior. It is what philosophers describe as a hedonic standard, in that it focuses on people’s feelings of satisfaction or enjoyment in the here and now.
One way to capture this idea is with the popular saying that “life’s a journey, not a destination.” Consistent with the development of the new morality that was described in the previous chapter, the phrase seems to have evolved in various forms during the nineteenth century, acquired its definitive form in the first part of the twentieth century, and then been popularized in the late twentieth century, in this case by Steven Tyler, Aerosmith’s lead singer.23 Although an attribution of this sort might be demeaning for a philosophic system, it is appropriate for a social morality, which must, after all, be readily comprehensible by everyone in the society. The phrase explicitly rejects the morality of higher purposes, which finds life’s meaning in its destination; the idea of a journey, which it substitutes, is that one should enjoy the things one sees along the way.24
Tyler’s lyrics do not amount to a philosophy, but modern philosophers have devoted a great deal of attention to issues that are closely related, and often seem directly relevant, to self-fulfillment morality’s recommendations for the self. In discussing the enjoyment of life’s moments, a number of philosophers propose various criteria to distinguish between proper and improper or advisable and inadvisable enjoyments. Alan Gewirth says that self-fulfillment involves satisfying only our deepest or highest aspirations;25 Charles Taylor that it must be guided by authenticity, a sense of being true to one’s own originality;26 and Heidegger, although he would not deign to use a term like self-fulfillment, champions authenticity as the profound understanding of one’s free and finite existence.27 In a similar vein, psychologist Abraham Maslow distinguishes between a fully engaged existence, which he calls self-actualization, and an existence based on the felt need to overcome deficiencies.28
The difficulty with these views, from a descriptive standpoint, is that they require some externally imposed criterion by which the individual’s immediate desires can be judged.29 Such criteria can be formulated and argued for by philosophers, but they do not appear to be a matter of general belief in our society. It seems possible that these thinkers, although fully committed to the secular vision of modernity, are unwilling to relinquish the higher purposes of the previous morality. It is even possible that this connection to higher purposes morality explains Heidegger’s attraction to Fascism,30 which, as Chapter 3 described, sought to replace God with the nation-state. The modern morality of self-fulfillment rejects all such higher purposes. As accurately identified by Freud, it recommends that people should do whatever makes them happy, whatever feels good to them. Interposing an externally established criterion may well, like Freud’s punishing superego, impair their mental health according to the new morality.
While Tyler’s lyrics do not imply that any external criteria should be imposed on the individual’s choice of life path, they do suggest some important internal standards for the way the person approaches that choice or set of choices. Very few people begin a journey by simply leaving the house; typically, they plan an itinerary, make travel and lodging reservations, and pack the luggage they will need. Along the way, they take photographs of places they have been, collect some souvenirs, sometimes keep a journal, and typically think about what they have seen as they proceed. In other words, the image of a journey incorporates both an element of planning and an element of reflection into what is otherwise an ongoing, moment-to-moment experience.
Self-fulfillment morality urges people to treat their lives as a journey in this sense. They should regard their lives as a totality—a self-contained totality with no external purpose, but a totality that stretches forward to the future and back into the past. Planning for the future and reflecting on the past enable individuals to take control of their experiences, to make their lives their own. This sensibility is, once again, strongly supported by modern philosophy; people’s embeddedness in time and their consciousness of this condition play a central role in phenomenology and existentialism, particularly in the work of Husserl and Heidegger.31
The idea that one should plan one’s life is co-causally related to the rise of government planning and springs from a similar source: that actors must identify their own goals, not carry out some predetermined function. Jeremy Waldron argues that the primary arena of political morality in the modern world is not a static conception of rights, but the evolving process of administrative legislation, which, as discussed in Chapter 3, is a process of implementing planned, or articulated, goals.32 Although planning obviously functions differently in these different contexts, it implies at least two moral principles that are relevant to both governments and individuals. The first is that future conditions should be taken into account in deciding on present actions, for the obvious reason that one’s future will, quite predictably, become one’s present. Many future pleasures or advantages require present planning, and many future miseries can be avoided or abated by that means. The second component of planning is that it often requires the sacrifice of present advantages, not in pursuit of a higher purpose, but to obtain greater advantages when the future arrives. People and governments have always planned, of course, generally to secure a variety of specific benefits. But in the morality of self-fulfillment, planning is an essential component of the self as well as a pragmatic strategy.
A familiar and commonplace example at the individual level involves dieting. Here is a slice of delicious chocolate cake. Eating it might have been condemned by the morality of higher purposes as a form of gluttony, but modern morality contains no such prohibition. Suppose, however, that the person sitting in front of it knows that he would be happier in the future if he were to lose weight. In other words, he sees one aspect of his life path, his narrative of himself, as being thinner in order to be healthier or more attractive. He might then decide to forgo his present pleasure for that future happiness. His resolve might be reinforced by second-order attitudes, such as his sense that he will be proud of himself if he resists the temptation and will feel miserable if he succumbs to it.33 Unlike the belief that gluttony is a sin, neither his resolve to diet nor his anticipated pride or misery is external to his own desires. Rather, they are second-order desires that function as factors in his present decision. Another person, with different goals or a different metabolism, might have no reason to abstain from eating the cake.
Issues such as a person’s willingness to remain bound by a previously chosen course of action are central to many contemporary philosophic discussions of rationality.34 John Rawls states that if a person’s plan—“what he intends to do in his life”—“is a rational one, then . . . the person’s conception of his good is likewise rational.”35 A famous psychology experiment found that children who could resist the temptation to eat a marshmallow on the basis of the promise that they would get two marshmallows later grew up to be more successful adults.36 The same behavior is taken by researchers as a sign of intelligence in animals, as in the experiment where pigeons who could resist the temptation to depress a pedal giving them food would get more food later on.37 For purposes of describing a social morality, however, there is no need to decide whether that morality meets some definition of rationality or intelligence. The more meaningful connection is to the administrative state’s articulation of goals. Both represent the modern mode of thought, a conscious effort to achieve specified objectives. There is no external rule that either the individual or the government should be rational, only that both should act in a manner that will produce the consequences they desire for a future that will ineluctably arrive. From this perspective, rationality is simply a tool that enables the individual or the government to plan effectively.
For the individual, an important example of such planning is one’s career. The progress of industrialization, the accumulation of knowledge, the advances in technology, and the expanded scale of society have led to the increasing specialization of labor.38 Most careers require more training than they did in the past, and the more prestigious, remunerative careers require far more training. While many people enjoy college, and some even enjoy graduate and professional school, the years spent in these institutions are generally viewed as preparation for one’s future life, not as an end in themselves. People are willing to devote all this time to preparation because they know how society is structured and want the rewards that only training can provide. As a result, pervasive cultural attitudes have developed that are more future-oriented, more willing to defer fulfillment in the present.39 In effect, modern people tend to value anticipated pleasure in the future more highly than their predecessors did.
Conversely, people’s increasing ability to plan for the future and delay gratification in the present has advanced the development of the industrial economy, which might not even have been possible without this change in sensibility. Norbert Elias, whose history of manners was described in Chapter 2, attributes the change to a Freudian process of internalization, whereby social constraints that previously had to be imposed by force, and were only intermittently obeyed, became integral aspects of people’s motivation set. He describes this as the “civilizing process” and traces its initiation to the rise of royal courts and the courtly behavior of the military aristocracy. This further emphasizes the connection between politics and morality and suggests that self-control and future orientation were as much a cause as an effect of the modern industrial economy.40
In what may be the most sustained philosophic analysis of self-fulfillment, Alan Gewirth highlights another aspect of this issue by distinguishing between aspiration fulfillment and capacity fulfillment.41 For present purposes, capacities can be defined as externally evaluated skills, that is, a person’s ability to perform some task effectively as measured by the society’s prevailing standards. The question is whether, apart from satisfying one’s aspirations, self-fulfillment also involves developing or using one’s capacities. Consider a little girl whose parents and teachers realize that she is unusually intelligent and dexterous. Clearly, she would make an excellent surgeon. Should she regard that career as essential to her self-fulfillment because it makes use of her capacities? In other words, should making use of one’s capacities be regarded as an independent element of self-fulfillment?
The difficulty in doing so is the same as the difficulty in limiting a person’s aspirations to the highest, deepest, or most admirable. It requires the imposition of an external standard, some principle of choice that lies outside the ambit of the self. Modern self-fulfillment morality does not include any standard of that sort. But having thus relinquished what many people may regard as an attractive principle, the morality of self-fulfillment retrieves a major portion of that principle through empirics and psychology. As an empirical matter, people receive many of the social benefits they generally desire—money, power, and esteem—if they develop their innate capacities and talents. As a psychological matter, most people seem to find skillful performance to be inherently enjoyable.42 In addition, people’s predilections, far from being fixed at birth, are shaped by their interaction with those who surround them. When they receive encouragement or admiration for particular actions, they are likely to find the performance of those actions to be a source of enjoyment. For the most part, then, a person whose relation to herself is shaped by self-fulfillment will be well advised to develop her capacities. If she fails to do so, she is likely to suffer more misery in the future than she avoided in the present.
This correspondence of capacity and desire, however, is merely an empirical likelihood, not an inherent element of the new morality. The relationship to oneself that this morality prescribes allows people to ignore their innate capacities, to choose a life path shaped by their desires rather than their abilities. As the intelligent, dexterous girl grows into adulthood, she may decide that being a surgeon is too impersonal, too demanding, or not sufficiently aesthetic. She may opt instead for some alternative that satisfies her more, even if she is less likely to excel at it. In actuality, the choice is rarely so stark, if only because the future is uncertain and one’s judgment of one’s possibilities is frequently shaped by one’s desires. Hector Berlioz’s father, who wanted him to be a doctor, told him that he lacked the talent to be a successful composer; Berlioz, who was not a prodigy and had received little formal training, had the gumption to respond that he would do less harm as a third-rate composer than as a third-rate physician.43 To be sure, he turned out to be a towering genius, which is why we know the story, but many other people have made similar choices with much less spectacular results. Eric Weiner, an American journalist who traveled around the world trying to find the places where people are the happiest, notes that the citizens of small, insular Iceland are strongly encouraged to develop artistic careers. Most of them, predictably, turn out to have only modest abilities in these areas and must refashion their aspirations into hobbies, but they support each other’s efforts, and it makes them feel fulfilled.44
In addition to planning for the future, High Modernity’s narrative conception of existence—the idea of life as a journey rather than a destination—involves reflection on the past. This is the conception that led to the uniquely modern notion—which Rousseau offered as a confession, Wordsworth recognized as a discovery, and Freud turned into a theory—that one’s personality is shaped by childhood experience. Although the primary consequences of reflection might be regarded as aesthetic rather than moral,45 it can certainly serve as the basis for action in many circumstances. The itinerary of life’s journey, after all, is a record as well as a plan. As phenomenology suggests,46 people often want that record to possess a feature that can be described as meaning or coherence. The result is that one’s past tends to exercise an increasing influence on one’s present decisions as one’s life proceeds. In some cases, people will act in a way that validates their prior choices. In other cases, they will act to reverse or obviate those choices, to rescue meaning from the flames of error. These inclinations may also intensify the piquancy of anticipated misery; as people look forward, they will want to avoid finding themselves in a future where they mourn a lack of meaning. They may attach a higher value to avoiding this particular sorrow than to avoiding other future sorrows or present sacrifices, precisely because they have an independent desire that the narrative of their lives should be coherent.
The desire to avoid anticipated sorrow regarding decisions one is making in the present generates an important mechanism by which the morality of self-fulfillment is internalized, the way that it becomes a matter of belief as well as a rationale for action. Anthropologists and sociologists have established a familiar, if not always rigorously clear, distinction between shame and guilt.47 As a rough approximation, shame can be regarded as the internalizing mechanism for the morality of honor. Because honor involves the way one is viewed by others—Chramnesind’s “they”—moral lapses produce a corresponding sense of shame within the individual, of humiliation, diminished reputation, and loss of social status. Its opposite, Anthony Giddens suggests, is pride.48 For the morality of higher purposes, the internalizing mechanism is guilt, the sense that one has violated a prevailing standard through weakness or corruption of the will. To do wrong, in this cultural context, is to succumb to temptation, to allow oneself to be diverted from saving one’s soul or serving one’s king. Its opposite is righteousness.49 Of course, people felt guilt in the earlier era and shame in the later one, and psychologists note that people continue to experience both feelings at the present time.50 As Oliver Goldsmith wrote in The Vicar of Wakefield, these two principles interact because “guilt gave shame frequent uneasiness, and shame often betrayed the secret conspiracies of guilt.”51 But the preeminent morality of each era tended to emphasize one or the other and induce people to interpret their reactions in terms of these alternatives.
In High Modernity, the internalizing mechanism that the preeminent morality of self-fulfillment produces, the mechanism that arises from the capacity for reflection on one’s past, is regret. Just as failure to behave with honor induces shame, and failure to serve higher purposes induces guilt, the failure to fulfill oneself induces a sense that one might have been happier if one had planned one’s life path differently. Modern people thus agree with John Greenleaf Whittier: “For of all sad words of tongue or pen, The saddest are these: ‘It might have been!’”52 Regret applies not only to lost opportunities for pleasure but also to excessive indulgence in present pleasures at the expense of preparation for a more pleasurable future. In other words, it is truly the opposite of fulfillment. The morality of self-fulfillment frees people from the narrow constraints that shame and guilt impose but leaves them free to go astray and make mistakes for which they can blame only themselves. And in self-fulfillment morality, there is no chance to die in a great conflagration that redeems one’s honor or to obtain final absolution from one’s sins as one departs for the afterlife. The purpose of life lies in the living of it, and that means that one must bear the consequences of one’s decisions. The desire to avoid this sorrow then serves as a basis for action, a basis for determining whether a particular decision is good or bad, moral or immoral, according to the new morality. Heidegger uses the term anticipatory resoluteness; while he discusses this in connection with the concept of guilt, he rejects all the traditional meanings of this term and replaces them with an idea more closely allied with regret.53
The nature of fulfillment, as the second element in the term that defines the new morality, can be further clarified by considering its relationship to three related moral concepts that are much discussed in current ethics scholarship: hedonism, virtue, and altruism. Hedonic standards of behavior, to which modern self-fulfillment morality belongs, are sometimes conflated with what may be described as vulgar hedonism,54 which can be taken to refer to a kind of thoughtless self-indulgence, an “eat, drink, and be merry” mentality with a more lurid definition of merry than the King James Bible’s translation of the phrase would imply.55 The conflation often results from either the accusations of ethicists who want to condemn the concept of happiness or the enthusiasm of survey researchers who want to measure it.56
But the concept of self-fulfillment, although it depends on the pleasure one experiences in the course of one’s life, is more closely related to the Aristotelian notion of eudaimonia, or self-flourishing. Fred Feldman offers a useful distinction between sensory hedonism, which is limited to immediate physical sensations, and attitudinal hedonism, which is pleasure or delight in a particular state of affairs.57 Holding an exam paper graded A or C is the same sensory experience but a very different attitudinal one. Sensory hedonism generally counsels the avoidance of pain, but an attitudinal hedonist might well embrace pain under certain circumstances, such as pushing oneself to the point of exhaustion to complete a race or staying up when one is tired to enjoy a party. Similarly, Martin Seligman’s psychological version of hedonic theory includes meaning and achievement as components of a person’s happiness or self-satisfaction.58 The concept of fulfillment in the definition of the new morality is further distinguished from mere hedonism because of its temporal dimension, its emphasis on planning, and its warning about regret.59 A person’s desire to be happy in the future can involve as much self-sacrifice in the present as the desire to serve a higher purpose; it can include staying up when one is tired not only to enjoy a party but also to do well on an exam.
Virtue morality, in its original form developed by Ancient authors such as Aristotle, is a special kind of hedonic theory. The idea is that one achieves the state or condition one desires—happiness or well-being (eudaimonia)—by developing and nurturing particular character traits or virtues.60 High Medieval philosophers such as St. Thomas attempted to combine virtue morality with the morality of higher purposes by adding the Christian virtues of faith, hope, and charity to pagan virtues such as wisdom, justice, or temperance and declaring that the real value of all virtues was as a way to save one’s soul.61 “[T]he object of the theological virtues is God Himself, Who is the last end of all, as surpassing the knowledge of our reason.”62 Luther reversed the relationship, asserting that salvation could be achieved only through faith but that a person whose soul was saved would naturally live in accordance with a similar set of lofty virtues.
Much of the current interest in virtue ethics seems to reside in using it as another argument for the amorality of High Modernity, the idea being that the only true morality is based on virtue, which is then declared to have become extinct in modern times.63 These arguments are often little more than a way of smuggling higher purposes morality into a secular analysis, just as Aquinas smuggles Christianity into Aristotle, because the writer gets to specify the virtues. To the extent that virtue means something different from values and refers to genuine elements of character, self-fulfillment morality rejects the claim that any particular character trait is essential. It does not necessarily reject the idea of virtue generally, but rather provides that people should define their own virtues on the basis of their goals. If one wants to become a biomolecular engineer or a clinical neurophysiologist, cultivating some capacity for delayed gratification is advisable. But many modern people, afflicted with either ambitions or anxieties, find they need to develop an opposing virtue. Instead of rushing past the daffodils, they need to stop along the way and enjoy the show—not because it reveals the hand of God, but for its own sake in the present and because only by enjoying it in the present will it provide a future “bliss of solitude.”
Both hedonism and virtue ethics, which relate to the feelings or behavior of the individual actor, contrast with altruism, which Thomas Nagel defines as being motivated by another’s reasons or concerns rather than one’s own.64 The subjectivity of fulfillment, the idea that it is only the actor’s own fulfillment that is taken into consideration, might suggest that it excludes or devalues the altruistic action that other types of morality encourage or require. Proponents of the traditional morality of higher purposes argue that people will devote themselves to helping others only if this behavior is made morally obligatory. Many modern microeconomists agree, asserting that people will act only to maximize their material self-interest.65 But the morality of self-fulfillment certainly allows for the possibility that some people’s chosen life path can be centered on helping others rather than achieving material rewards for themselves. The question is whether doing so will make the person feel fulfilled, whether it will provide him or her with a sense of enjoyment in the present and anticipated enjoyment in the future. In other words, according to the morality of self-fulfillment, helping others is optional or, to use the philosophic tongue twister, supererogatory. Susan Wolf suggests that an approach of this sort may well provide a more realistic standard for altruism than the demand for consistently other-regarding action.66 As a result, altruism may well be more common at present among those who embrace the new morality than it is among those who remain attached to the old one, for reasons that will be discussed in Chapter 6.
What has been said thus far regarding the morality of self-fulfillment may not seem related to morality at all, but only to specific aspects of the modern sensibility such as therapy or self-improvement. As noted in the Introduction, however, the term morality is a Wittgensteinian linguistic family encompassing various elements that appear in some individual moralities and not in others. Many moral systems include standards for organizing and embellishing one’s life, while others, as Lon Fuller noted, are more concerned with duty or prohibition.67 This latter emphasis may be particular to the subset of moralities, including the now-declining morality of higher purposes, based on what Max Weber described as world-rejecting religions.68 Because these religions regard material existence as a sinkhole of temptations, distracting people from salvation in a higher plane, their associated moral systems tend to scatter thou shalt not’s across a wide range of human action and minimize the here’s-how-to-flourish that other moral systems, including the morality of self-fulfillment, feature.
That being said, no recognizable morality consists solely of exhortations for improvement. As a means of regulating relations among people, morality must also prohibit at least some set of behaviors. These prohibitions are frequently so central to people’s understanding of their moral system that they regard a different morality, with different prohibitions, as simple immorality, an invitation to descend into an amoral netherworld where, according to Dostoyevsky’s terrifying vision, everything is permitted.69 To those committed to the morality of higher purposes, the modern emphasis on self-fulfillment is precisely such an invitation and not morality at all. In fact, as noted repeatedly in the foregoing discussion, self-fulfillment represents a new morality that contains as wide a range of prohibitions as its predecessor. Its prohibitions are different, however, overlapping in some cases, such as the criminal activity discussed in Chapter 3, but diverging markedly in many others.
The concerns in this chapter are the prohibitions that the new morality imposes on the self—that is, the person as an individual—with discussion of intimate and social relations being deferred until the following two chapters. These prohibitions can be described in terms of three secondary or subsidiary principles: noninterference, incommensurability, and equality. Each of these principles emerges directly from the concepts of the self and of fulfillment described in the preceding sections, not necessarily as a matter of philosophic argument, but instead as the result of current social understanding. Each of them, by this same process of social understanding, is co-causal with the development of the modern administrative state.
The noninterference principle is classically expressed in the work of John Stuart Mill.70 Since every person should fulfill herself or himself, no person should interfere with another person’s effort to achieve self-fulfillment. We must not prevent other people from following their chosen life paths, we must not intentionally place obstacles in their way, and we must not condemn them for the choices they have made. This provides an additional reason to control desire and can be alternatively described as an ethos of toleration.71 Not only should one resist desires that cause future misery for oneself but one should also resist desires that interfere with the self-fulfillment of others. Consistent with its formal or procedural approach to behavior standards, rather than the content-based approach of higher purposes morality, the new morality prohibits judgments that make others feel bad about their chosen life paths. Condemnation of such choices is no longer a means of enforcing moral standards but rather is, in itself, a violation of those standards; it is equivalent to an assault.
Of course, the new morality permits, and indeed encourages, a person to give advice to others about how to achieve the goals that they themselves have identified. It also allows and encourages a person to engage others in an inquiry about whether those goals will truly lead to self-fulfillment for them.72 These are the stances therapists adopt, and they have gradually been extended to the other helping professions, such as social work and guidance counseling. Given the centrality of mental health to the morality of self-fulfillment, it is hardly surprising that the new morality generalizes this stance into a standard for all interpersonal relations. When we give someone else advice, according to the new morality, we should not be telling them what we regard as the proper behavior for ourselves or for human beings in general, as the morality of higher purposes demands, but rather what we regard as the proper behavior for the other person, given that person’s capacities, circumstances, and desires. This latitude refers, however, only to the individual’s self-definition. The new morality imposes numerous constraints on people as a result of the impact that their actions can produce on others, and these will be considered in the following two chapters.
The co-causal relationship between the noninterference principle, as a matter of personal morality, and the modern administrative state is readily apparent, but it merits emphasis. The demand that the state serve the needs of its citizens would be virtually meaningless if the government were empowered to define those needs, just as the organically related principle of representative democracy—that the state is supposed to be guided by the desires of its citizens—would be meaningless if the government could define those desires.73 In both cases, morality requires noninterference; at least for adults, any advice, assistance, or support should be provided without attempting to control the person’s choices. As a matter of co-causality, a society that embraces the principle of noninterference between individuals is likely to demand that this principle apply to government as well, and only a government that emerges from such a society is likely to obey that principle. While this may seem natural enough, it is a demanding discipline, perhaps for individuals who see themselves as particularly knowledgeable or perceptive and certainly for government, which must now reverse its previous relationship with its former subjects and view them as the citizens who constitute the state. It runs counter to the almost inevitable sense of hierarchy that tends to regard the government—the institution that is, after all, in command—as the citizens’ superior. The natural hostility of national leaders to this role reversal, as it was formulated over the course of the nineteenth century, was one of the factors that unleashed twentieth-century Communist and Fascist dictatorships. The resulting collapse of the noninterference principle in personal life—the demand that everyone accept and enforce the behavior standards of Socialist or Fascist Man—was at least partially responsible for the totalitarian savagery of these regimes.74
The second subsidiary principle, incommensurability, can be explicated by contrasting it with utilitarianism. As a meta-ethical theory, utilitarianism was developed by Jeremy Bentham during the same crucial quarter century that spawned the administrative state.75 It was elaborated during the nineteenth century, continues to provide a theoretical basis for modern microeconomics, and is often invoked by administrative policymakers.76 Despite its temporal and conceptual overlap with modern morality, the two diverge because utilitarianism incorporates interpersonal comparisons. Rather than aggregating the individual’s moment-by-moment experiences, it balances one person’s pleasure with another person’s pain to reach Bentham’s and Joseph II’s goal of the greatest good for the greatest number. This leads to some well-known and formidable philosophic difficulties. Can John argue, for example, that Jane should be forced to marry him if he can prove that the marriage would make him ecstatic while it would make Jane only mildly unhappy? More dramatically, as Judith Thomson asks, may a doctor kill an unhappy person and harvest his organs to save the lives of five people who would then lead happy lives?77 Or suppose, as Robert Nozick suggests, there exists a utility monster, a person whose pleasure in mistreating others is so great that it counterbalances the agony that he or she inflicts.78 Would we be willing to accept the idea that our moral system approves this person’s depredations?79
Self-fulfillment morality answers these questions in the negative. Utilitarianism allows units of happiness to be added up without regard to the particular person who experiences them. But as a matter of modern morality, such computations are forbidden. Each person is to be viewed as a self-contained whole, an individual with a unique experience and life path; each person, considered separately, should have the best possible opportunity for self-fulfillment.80 Whether or not this principle can be sustained as a philosophic argument, it seems clear that it accurately captures the morality that is becoming preeminent in the Western World. Virtually all those committed to this new morality would agree that John may not try to compel Jane to marry him, no matter how happy it would make him, the doctor may not harvest any person’s organs while that person is alive, and utility monsters must find another way to get their jollies.
But the principle of incommensurability leaves a descriptive gap; it does not tell us how to resolve the conflicts between people that will inevitably arise in a world of limited resources and inevitable competition. This is, of course, the gap that utilitarianism is designed to fill. The prohibition against harvesting one person’s organs to save five people’s lives does not tell us what to do, through morally acceptable means, when we have one organ available and five people who need that organ to survive. Providing comprehensive answers to such questions would involve the entire field of public policy. At this discussion’s general level, the important point is that self-fulfillment morality’s starting point for resolving such dilemmas is its third subsidiary principle. That principle is equality, something that has, of course, been widely noted by many observers.81
Equality is a new principle, distinctive to High Modernity. The morality of honor that prevailed in the Early Middle Ages was hyper-hierarchical, centered on the warrior elite. Other members of society had to distribute themselves around the elite’s periphery, defining their behavior as support for the leaders or imitation of them in more modest terms. The morality of higher purposes imposed the same essential rules on everyone but incorporated the belief that people should serve God or the nation by carrying out their roles in a God-given biological and social hierarchy, exemplified by the Great Chain of Being. Since service to the higher purpose was the essence of morality, there was no reason to disturb the stratification that existed as a social fact. In the next world, the important one, people’s souls would be rearranged according to their sinfulness or sanctity in a different but even more elaborate hierarchy, such as the one described by Dante.
According to the morality of self-fulfillment, however, people must be treated as equal because each person is a self with his or her own life path. Each person should be able to choose that path and derive as much fulfillment as he or she can from its momentary pleasures and its overall design. The equality involved is equality of opportunity, not equality of result, and thus, contrary to Nietzsche,82 does not suppress individuality or discourage the desire to excel. What it does mean is that no person’s self-fulfillment is more valuable than any other person’s because there is no external standard by which one person’s self-fulfillment can be distinguished from another’s, no social hierarchy or higher purpose that individual people are supposed to serve.
The principle of equality does not obviate the need to make difficult choices, of course; it is morality, not magic. Rather, it provides a discursive framework by which such choices can be evaluated. It prescribes that in resolving the inevitable quandaries that human conflict and limited resources present to any decision-maker, individual or collective, each person should be regarded as equal in merit to any other because each person has a life path, however long or short, grandiose or modest. Thus, there are a variety of acceptable ways, according to the new morality, to determine which of five people gets the organ transplant, such as lottery, temporal priority of demand, or most urgent need. But there are also many ways, including most of those that would have been used in premodern times, that are morally forbidden, such as wealth, social status, gender, race, or personal connections.
Slavery provides a dramatic illustration of the equality principle and its modernity. The absence of condemnations or even serious questions about this institution in premodern times is startling to us today. It was another continuity with Ancient Rome, prevalent throughout the Early Middle Ages and fully consistent with that era’s hyper-hierarchicalism. Unfreedom increased steadily throughout this period.83 While slavery itself may have given way to serfdom, it was neither abolished nor condemned. The spiritualization of Christianity during the High Medieval and Early Modern eras, with its more insistent recognition that each person possesses an immortal soul, might have been expected to raise doubts about so severe an inequality, but the hierarchicalism that prevailed proved sufficient to suppress even the most perfunctory concern. Slavery gradually disappeared from Western Europe for largely economic reasons, but it was revived and intensified in Spanish, English, and Portuguese colonial practice. The great Spanish priest, Bartolomé de las Casas, successfully challenged the enslavement of the American Indians, but the solution he and others devised was to import Africans instead.84 No serious condemnation of slavery or effort to abolish it was mounted until the crucial quarter century that signaled the advent of High Modernity. Only when the morality of self-fulfillment introduced the principle of equality to the Western World was this most extreme form of inequality perceived as anathema, rather than an acceptable and unquestioned consequence of hierarchy.
As the timing of slavery’s condemnation and abolition suggests, the equality element of self-fulfillment morality was co-causally related to this principle’s emergence in both administrative practice and political doctrine. Equality, which in the governmental context incorporates and subsumes incommensurability, was explicit in the idea of merit-based credentials and the associated shift from purchased to appointive offices that occurred so rapidly and comprehensively in European nations during the last quarter of the eighteenth century. It was at least implicit in the more gradual articulation of goals that characterized the advance of administrative governance and the transformation of subjects into citizens. Politically, equality—after some fifteen hundred years when hierarchy was unquestioned—came suddenly to the forefront in the American and French Revolutions. To quote once again language from the Declaration of Independence that is too familiar to require quotation, “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”85 The equally famous first sentences of the French Revolution’s Declaration of the Rights of Man, written just a few years later, announce: “Men are born and remain free and equal in rights. Social distinctions may be founded only upon the general good.” During the nineteenth century, the combined and mutually supporting influence of administrative government and representative democracy systematically undermined the belief that God or His natural order commanded inequality. By the twentieth century, that belief had become untenable, and those who wanted to establish or maintain social distinctions needed to shift their arguments to biological and racial grounds.
Contrary to the ringing words of the two Declarations, however, equality is not self-evident at all, nor is it the way people are born. Instead, people vary obviously and dramatically. They are tall and short, strong and weak, quick and slow, smart and dumb; they have different hair colors, skin colors, and eye colors; they come in two genders, as the now-antiquated word usage of both quotations reminds us, and while procreation of the species requires only a casual few minutes from one, it demands months of discomfort and hours of agony for the other, plus postnatal care that can be replaced only by a well-developed dairy industry. What is truly self-evident is that no society can treat its members equally in every way. Treating them equally in even some selected contexts requires a robust ideology, backed up by affirmative social commitment.
That ideology was the morality of self-fulfillment and its political implications. Its first applications were to cast aside predetermined social distinctions, beginning with the most obvious and egregious case of slavery, and then to insist that each person have the same political rights in the emerging representative democracies. These initiatives were opposed at every stage by those who remained committed to the traditional morality of higher purposes or whose self-interest favored the status quo. The United States needed seven decades and a savage civil war to abolish slavery and another century to provide a semblance of minimal equality for the former slaves on such basic governmental issues as the right to vote, the right to receive a fair trial, the right to compete for government positions, and the right of access to public services and facilities. France, despite the inspirational language in its Declaration about social distinctions, retained severe property restrictions on the right to vote until 1848. The United States abolished these more quickly but did not give women the vote until 1920, and France did not follow until 1944.
Explicating the concepts of “self” and “fulfillment” describes the basic organizing principle of the new morality, and adding the secondary principles of noninterference, incommensurability, and equality describes its basic operation in society. The next three sections will explore its implications for the way that people live their lives, the way they design the life path that constitutes their selves.
An issue that is truly central to the way that people live their lives involves their functional role as a member of society. Consistent with the morality of self-fulfillment, this is increasingly conceived as the result of the individual’s decisions and thus described as a career, rather than the result of birth into a family, caste, or occupational group. Of course, characterizing the subject in this manner is as much a product of the new morality as the topic’s content. But as stated in the Introduction and at the beginning of this chapter, there is no neutral standpoint.
The general hierarchicalism of premodern Western society was instantiated most directly in determining, and restricting, the functional role of individuals in the society. There was, of course, a fair amount of social mobility in premodern Western Europe, as discussed in Chapter 2, particularly as the economic system became increasingly dynamic and the educational system advanced, but its extent should not be exaggerated. Most people remained in the same status as their parents and often, particularly for peasants, in the same basic occupations. Even if talented or aggressive individuals were able to rise above their assigned role, the prevailing morality, and thus the demands of social conformity, strongly insisted that people generally remain in place. It is difficult to appreciate the force of this belief. From a modern vantage point, there is a tendency to see it as a transparently pragmatic and self-interested effort by the ruling classes to preserve their perquisites. As stated earlier, however, the social hierarchy, like the biological one, was perceived as established by divine command. When people thought about it in secular terms, they tended to assume that society would collapse and chaos would run rampant if the hierarchy was not maintained. A minor but revealing indication of how seriously people held to these beliefs is provided by the sumptuary laws that prevailed for centuries in many European nations. As Jasper Ridley reports, an English statute of 1554 forbade a man from wearing any silk in his “hat, bonnet, nightcap, girdle, hose, shoes, scabbard or spurs, on pain of three months imprisonment and a fine of £10 for every day on which the garment was worn” unless he was a knight’s son, owned land worth at least £20 or owned goods worth at least £200.86
No elaborate argument is required to demonstrate that statutes of this sort would no longer be politically acceptable in any Western nation, but the important point for present purposes is that they would also be regarded as immoral. It is immoral, according to the new morality, to stop people from wearing whatever they can afford, living wherever they wish, and, most important, pursuing the career they choose. The co-causal connection to administrative governance is that government’s articulated policy is to serve its citizens, to facilitate their individual choices rather than restricting them. Not everyone, needless to say, can actually achieve his or her desired goals. There are only a limited number of openings in most cases, and people must compete for them, particularly when the job involved is remunerative, prestigious, and enjoyable. But according to the new morality’s principle of equality, competition must be based only on criteria that relate to the person’s individual ability to perform the job effectively.
An important point about these criteria is that they are socially constructed. There is probably no abstract, society-independent way to state them. In another society, such as premodern Europe, the criteria would include being the right type of person or having good social connections. The French Declaration’s formula that the criteria be “founded only upon the general good” would not exclude the hierarchical considerations of premodern society because members of that society considered maintenance of the hierarchy as essential to the general good. But the new morality limits the criteria for career-related selection to measurable abilities that people of any social status can possess and treats the general good as an emergent social consequence that arises from equality of opportunity. To be more precise, two main types of criteria are deemed morally acceptable in High Modernity: capacities and credentials. Capacities, as defined earlier, are a person’s inborn or early-acquired abilities to perform some task effectively as measured by the society’s prevailing standards. Credentials are general or job-specific training that is accessible to people on the basis of capacities. Reliance on these two criteria constitutes the essential feature that the new morality describes as equality of opportunity.
Being socially constructed, these criteria are co-causally connected to the structure of the economy and the government. In an advanced industrial economy, instrumentally rational capabilities and specialized training are necessary to perform many tasks effectively, and only such effective performance will allow an advanced industrial society to develop. Similarly, government officials must possess instrumentally rational capabilities and specialized training to manage the economy. A person whose qualifications are good manners and an exalted position in the social hierarchy will not be able to design, manage, or inspect a nuclear power plant. Many of the difficulties that non-Western nations have experienced in recent times seem to stem from the effort to superimpose a modern economy and government on a premodern social structure.
Although these criteria are pragmatic, the principle by which they become the primary basis for selection is a moral one. Failure to apply them is not merely inefficient; it is discriminatory and therefore wrong, according to the modern morality of self-fulfillment and its subsidiary principle of equality. People who try to defend discriminatory practices sometimes assert that such practices are instrumentally rational in the light of the irrational but unalterable preferences of their client base or personnel: admitting African Americans will induce our white customers to stay away; gay and lesbian soldiers will make our straight soldiers uncomfortable. But these claims misunderstand the nature of the prohibition. According to modern morality, it is immoral to deny any person the opportunity to fulfill herself or himself by pursuing a particular career. Pragmatic considerations are designed to resolve conflicts among competing applicants for some careers, but they derive from self-fulfillment morality and function within the contours that morality establishes. An employer may rely on instrumental factors such as customer appeal or affability with co-workers in choosing among applicants, but only if applicants of every race and gender are considered on equal terms.
Western people realized almost immediately that self-fulfillment’s equality principle applied to distinctions based on social background and religion, the two primary factors that—as they would have said—separated one man from another. The demands of people who were disadvantaged by these previously unchallenged distinctions and the efforts by the previous beneficiaries to resist or come to terms with these demands shaped much of the West’s history during the nineteenth century. The equality principle’s application to men of color would have been just as obvious but for Baron Montesquieu’s astute observation that Western elites could have perpetrated the horrors of slavery only by convincing themselves that Africans were an inferior type of human being.87 Abolition of African slavery was another crucial part of nineteenth-century history, but beliefs about racial inferiority,88 shored up by reference to Darwin, provided people who continued to be committed to premodern hierarchicalism with a modern-sounding justification. They sustained European colonialism and the American South’s reconstitution of the slave economy—through segregation and crop liens89—until both systems were abolished in the 1950s and 1960s. Even now, such arguments continue to appear in support of racial discrimination.
The realization that the very same principle applied to women developed much more slowly.90 For premodern people, the difference between the sexes and the subordinate role of women were obvious facts. Unlike the warrior elite of the Early Middle Ages, people of the High Middle Ages, as Chapter 2 discussed, worshipped a woman—sometimes more fervently than God or Jesus—and championed the secular adoration of women through the idea of romantic love. They were even willing to accept female rulers—not in France or the Empire, where an Early Medieval prohibition remained in force,91 but as queens in several other nations and as duchesses, baronesses, and countesses in many lesser jurisdictions. The general attitude toward women, however, was that they served the higher purpose of procreation. From this and the biological fact of nursing followed their role of nurturing the children that they bore, and from those two roles—serving the needs of the species and their children—followed their subsidiary position in the social hierarchy and their third role, which was serving the needs of men.92 All these roles, as part of the Great Chain of Being, were viewed as having been ordained by God.
Such views were so deeply ingrained in the Western sensibility that the advent of self-fulfillment morality had almost no effect for many years. A few lonely voices, such as Mary Wollstonecraft and Margaret Fuller among women and the Marquis de Condorcet and John Stuart Mill among men,93 called for full equality, but the traditional view that a gender hierarchy was inscribed into the order of the universe continued almost unabated. Those, like Nietzsche, for whom God was dead, saw Darwin taking His place with the same doctrine. Over the course of the nineteenth century, women obtained more property rights, but the West had gone back and forth on this issue, rather than displaying any clear trend line, and women may not have been much better off by the end of the century than they had been in the late Roman Empire or in some medieval jurisdictions. The women’s suffrage movement, however, generated an unprecedented level of mobilization among women, and by the first half of the twentieth century, they had won the right of political participation.
Throughout this period, an intelligent, dexterous girl would never have had the chance to decide whether she wanted to become a surgeon. At most, she could have aspired to be a nurse; more likely, she would have been advised to use her dexterity for sewing and to suppress her intelligence as an embarrassing impediment to marriage. Concurring in an 1873 opinion that upheld an Illinois law prohibiting women from practicing law, U.S. Supreme Court Justice Joseph Bradley declared: “the civil law, as well as nature herself, has always recognized a wide difference in the respective spheres and destinies of man and woman. . . . The constitution of the family organization, which is founded in the divine ordinance, as well as in the nature of things, indicates the domestic sphere as that which properly belongs to the domain and functions of womanhood.”94 By 1961, no one on the Supreme Court was prepared to be that definitive or theistic, but Justice John Harlan, upholding the exemption of women from jury service, could still say: “Despite the enlightened emancipation of women from the restrictions and protections of bygone years . . . woman is still regarded as the center of home and family life.”95
The realization that these attitudes were inconsistent with modern morality struck Western society with the rapidity of revelation. Industrialization had generated a wide range of menial jobs that women increasingly filled,96 and there was a small amount of progress in higher-level occupations, but gender inequality remained extreme through the 1960s. During the twenty-year period that followed, women probably made more progress toward full equality of career opportunities than they had in the history of the world prior to that time.97 To take the United States as an example, the legal prohibition of gender discrimination was added as an afterthought to the Civil Rights Act of 1964. Female attorneys and physicians were still curiosities at that time; by the mid-1980s, any American professional school admitting a class that was less than forty percent female would have created a scandal. The only reason the President of Harvard University might have hesitated to say that women had less “intrinsic aptitude” for science and engineering in 1955 was that the point was too obvious to mention; when Lawrence Summers said that in 2005, he lost his job.98 After nearly two centuries of incredulity, resistance, and acclimatization, the consequences of the new morality are now apparent: it is morally wrong to deny a person the opportunity to self-fulfillment on the basis of gender. From the contemporary perspective, half of humanity had been consigned to a “mute, inglorious”99 existence throughout Western history when they would have been capable of equaling the achievements of their male counterparts.
It remains true, of course, even in this era of biotechnology, that women have a crucial and irreplaceable role in the continuation of the species. But the shift in responses to this undeniable fact reveals the socially constructed nature of morality.100 To premodern people, pregnancy and birth decisively defined women’s role in the society. To modern people, these events must—as a moral matter—be simultaneously ignored and taken into serious consideration in the interests of self-fulfillment. They must be ignored when women are competing for jobs; any employer who asked female candidates whether they were planning to get pregnant would be excluded from recruiting on most American university campuses today. At the same time, serious efforts must be made to accommodate the fact that women do get pregnant and give birth once they are employed. To treat whatever incapacitation women employees experience as unexcused absences or a lack of commitment to the job would be equally immoral.101 These attitudes are not unified by logic or biology. Rather, they are generated by a new morality which insists that women have an equal opportunity to fulfill themselves in their careers.102
Within a few brief decades, this moral command has moved from the inconceivable to the obligatory. From their pronouncements, those who remain devoted to the morality of higher purposes seem to prefer women to remain in their homes, taking care of their children and their husbands. But they find it almost impossible to say the words that came so readily to Justice Bradley a mere century and a half ago or even those that Justice Harlan wrote more recently. Instead, they quote sociological studies about the advantages that children derive from having one stay-at-home parent. Such arguments are an admission of defeat, not only because they rely on empirical claims rather than the moral ones they are really intended to support but also because the advantages that the children of stay-at-home mothers might derive could only be defined, in the modern world, as including the chance that their daughters will grow up to be surgeons.
None of this is to assert that full equality of career opportunity on the basis of social class, race, or gender has been achieved. But few people would dispute that it is now a moral norm. There are, of course, intense disputes about the proper way to achieve it and to measure its achievement. Perhaps the leading dispute involves affirmative action. Opponents argue that this policy violates the principle of equality by holding some people to different standards on the basis of class, race, or gender. Proponents respond that holding everyone to the same standards is a false equality because it fails to take account of past discrimination that continues to affect people’s capabilities and credentials. The important point, for present purposes, is that this debate occurs within the framework of the new morality. The question is not whether we should achieve equality, whether people should have an equal chance to fulfill themselves in their careers, but how best that goal should be accomplished. To get the answer wrong, both sides agree, would be immoral; it would violate one of the most serious prohibitions that the new morality of self-fulfillment has established.
People’s position in society is nearly always crucial to their sense of self, but as previously described, the determinants of that position vary in conjunction with prevailing beliefs. In both the Early Medieval and the High Medieval–Early Modern eras, social position was based heavily on status. In Modernity, status has not disappeared, but social position is increasingly determined by one’s career, a view that is more consistent with self-fulfillment in general and equality in particular. A secondary but important consequence of this change in perspective is to cabin the functional role that people perform as one aspect of their self-definition, rather than its totality. Modern people, however career-oriented they may be, do not generally feel that their work defines their entire identity but typically insist that their lives include other elements that in many cases are deemed more important than their careers. Moreover, these various elements of a person’s life are often regarded as distinct and are articulated in a sense that is co-causal with the development of the modern administrative state. In terms of structure, each element is separately designed and often assigned to separate portions of the day, an approach that is also co-causal with modern work and residential patterns. In terms of goals, the elements are seen as having different rationales. In contrast to the premodern idea of virtue, success in one does not guarantee equivalent success in another; rather, each aspect of life must be separately planned and pursued. People who approach their lives this way tend to design their institutions along the same lines, and people whose institutions are designed along these lines tend to approach their lives in the same manner.
The three additional elements of modern people’s life paths that may be the most important, apart from their careers—and in any event are the ones that will be considered here—are family, religion, and leisure pursuits. Families involve, and for many people virtually define, a person’s intimate relationships, which are the subject of the following chapter. Considered at a more abstract level, however, these relationships clearly implicate the person’s sense of self and must be considered here as well.
In the premodern world, the family or household provided a comprehensive institutional context at every level of society. A man became head of a household when he married, while a woman moved from her father’s or other caretaker’s household to her husband’s. For both, this represented the transition to adulthood. There was a basic continuity between the Early Medieval and High Medieval–Early Modern periods regarding these transitions. The shift from aristocratic to ecclesiastical marriage that was described in Chapter 2 changed the meaning of marriage but did not affect its basic social role as the entrance to adulthood.
There were, of course, unmarried adults in the premodern world, but they generally fit within the same or an equivalent institutional context. They could be servants or, in the case of young men, squires or retainers of a nobleman, but in that case they were regarded as members of their master’s household. They could also join the Church as monks, nuns, or secular clergy, but the Church was, of course, an institution of its own, with monasteries and convents viewed essentially as large households. Widows and widowers were also numerous, but they were regarded as the remnants of a family. The point should not be exaggerated, as the reality was often more fluid than the norm, and people had to accommodate themselves to circumstances. But the norm was clear: the household was a moral institution, and to live outside it was both socially and morally suspect.
This has changed completely in the age of self-fulfillment. The transition to adulthood now occurs when people leave home to begin the life path of their choice—when they enter college, enter postcollege training, or get their first full-time job. Typically, these events occur before the person gets married and, in any case, they have no necessary connection to it. While the transformation of this transition is significant for both men and women, it is more dramatic in the case of women, who had previously moved from their childhood to their adult household without any intervening education, military service, or youthful adventures.
Interestingly, the transformation occurred before women achieved equality of opportunity in pursuing a career. As already mentioned, the industrial economy of the twentieth century generated a great many jobs that young women could fill; co-causally, young women’s increasing independence facilitated this industrial economy. To take just one example, telephones required human switchboard operators, and young women were generally hired for this role. Because these operators had to greet callers without knowing anything about them, they adopted the previously rare greeting of “hello” in place of more traditional forms of address and became known as “hello girls.” The hello girls joined the ranks of secretaries, store clerks, waitresses, and factory hands, all of whom began using their meager wages to live on their own, however impecuniously. They had dead-end jobs, and success for young women still consisted of marrying an eligible man, but they had begun the process of separating themselves from their families, a process that career equality would build on and dramatically advance.
The modern disconnection of marriage and adulthood has transformed marriage from a normative requirement and an essential institutional context into one element of the life path that people can choose or reject. Having entered adulthood, the modern person may decide not to get married, even if he or she has established a long-term sexual relationship. The determining factor is what the person finds individually self-fulfilling. Far from imposing an expectation or obligation to get married, modern morality decisively condemns any expression of disapproval about another person’s choices in this area. It does not preclude friendly advice, but that advice would be couched in terms of “what feels right for you” or “what will make you happy.” Those who remain committed to the morality of higher purposes often bemoan the decline of the traditional family and point to statistics such as the recent report of the U.S. Census Bureau that fewer than half of American households are headed by a heterosexual married couple.103 In fact, the traditional family has ceased to exist. Nearly everyone, including those married heterosexual couples, belongs to a household that was constituted by its adult members’ conscious decisions regarding their own self-fulfillment.
Another aspect of family that relates to people’s sense of self is, of course, their children. Having children turns a person into a parent, and it is probably a human universal that this experience constitutes some sort of shift in self-image or identity. The content of the shift varies with culture, however. The following chapter will discuss differing approaches to childrearing, which is, needless to say, an intimate relationship. Here, the concern is with the abstract idea of having children and its effect on people’s sense of self.
In the Early Middle Ages, having children, like marriage, was a source of honor and family continuity. In the High Medieval–Early Modern period, this attitude continued in attenuated form but was partially displaced by the belief that having children served the purposes of God, the species, and the centralizing monarchy. In the modern world, having children is regarded as a means of self-fulfillment. It is seen as an important experience, a part of one’s life path. As such, it is to be balanced against other forms of self-fulfillment, most notably one’s career, rather than regarded as a matter of course. Some people decide to forgo the experience due to career considerations. Others plan to have children, with their present desire often amplified by the sense that they will feel regret in future years if they do not. In some cases, this desire becomes a major motivation to get married or otherwise enter into a long-term relationship. If a woman fails to establish such a relationship with a man, however, or establishes one with another woman, she may decide to have children anyway. As time goes on, the law increasingly allows single men and gay couples to adopt children as well. Thus, just as a person’s entry into adulthood is no longer linked to marriage, neither is a person’s decision to have children.
Although having children is no longer considered obligatory, children are commonly regarded as one of life’s peak experiences, one of the most important ways people can fulfill themselves. One indication of the role that children play in many people’s sense of self is the stream of public statements by celebrities, well documented in People and its rival Us, that their children, as rock star Bret Michaels said, are “the most important people in my life.”104 The same edition of People that Chapter 3 described in connection with the therapeutic sensibility contains an article in which actress Mariska Hargitay declares her newly adopted baby to be “pure magic. I feel very blessed.”105 When pop singer Britney Spears, having lost custody of her children because of her erratic behavior, gained visitation rights, Us reported that, according to her friend, the children “mean everything to her.”106 These seem like odd statements for those who have spent their lives courting fame through personal display. Perhaps they are sincere, but even if they are not, the statements are a significant indication of a prevailing sensibility. The June 20, 2011, edition of People, for example, contained an article in which the President of the United States said that “work kept me away from home more than it should have.” “During the [2008] campaign,” he continued, “not a day went by that I didn’t wish I could spend more time with the family I loved more than anything else in the world.”107 A subsidiary and equally revealing theme in this age of mental health emerges from celebrities’ assertions that their children enabled them to survive times of crisis. Country singer Shania Twain, after learning that her husband had left her for her best friend, told People: “I had a son who needed me. . . . I hadn’t always, but I kept it together for him when I needed to. Nothing else would have snapped me out of it.”108 Or as Bret Michaels said, after a brain hemorrhage brought on by “his rock n’ roll lifestyle,” “My kids saved my life.”
Because children are regarded as a fulfilling experience, not a source of honor, service, or family continuity, modern people are often satisfied with one or two. Of course, this decision may represent a compromise with the career goals of the parents, something that was rarely a consideration when the mother stayed at home. But there is also the sense that one or two children are sufficient to provide the fulfilling experience of parenthood. This evolving attitude toward parenthood also changes the basic way that parents relate to their children. Rather than being a means of continuing God’s creation or serving the regime, the act of parenting, and the resulting emotional bond with the child, is regarded as a valuable experience in its own right. While the results of parental upbringing continue to matter in the new morality, the experience itself matters to an equal or greater extent. The conflict that many people experience between parenting and career fulfillment stems from combining this emerging attitude toward parenting as an experience with the increasing recognition that all people are entitled to fulfill themselves through their occupational careers. For women, it often means that they want to spend less time with their children than their premodern predecessors, while for men it means that they want to spend more time. Both dilemmas, combined with the emphasis on parenting as an experience, lead to the uniquely modern notion of quality time.
These beliefs are so new and so discontinuous with previous approaches that modern people have adopted them quite gradually and still struggle to define their contours and significance. One indication of this process is the current prevalence of Hollywood movies about parents, more commonly fathers, connecting or reconnecting with their children. Greco-Roman and Western literature is filled with children searching for their parents or their home, generally so that they can claim or reestablish their rightful place in society. It is also filled with adults trying to return to their homes for similar reasons. But the parent—most commonly a man—reconnecting with his child and regretting his absence or excessive career orientation during the child’s early years is far more common in contemporary work. Modern movies feature not only many fathers who are ferociously committed to their children, as in Kramer vs. Kramer, Finding Nemo, and The Pursuit of Happyness,109 but also fathers who regret their prior lack of commitment and strive to make amends, such as Regarding Henry, The Sixth Day, Tron Legacy, Anchorman 2, and The Italian Job.110 What is striking is that many of these movies, including the last four, are not even primarily focused on parent-child relations but invoke this theme to motivate the action or end it on a note of resolution.111 This suggests that the issue reverberates strongly within the modern sensibility and betokens a new attitude toward childrearing that the emerging morality implies.
Despite the process of secularization described in the preceding chapter, religion continues to play an important role in many modern people’s lives, if not the pervasive, comprehensive role that it did when the morality of higher purposes prevailed. This seems particularly true in the United States, which does not display the dramatic decline in religious observance that characterizes most of Western Europe. Nonetheless, a comprehensive 2008 survey by the Pew Foundation reveals that the fastest growing religious category in the United States is “Unaffiliated,” at about sixteen percent of the adult population.112 The survey also reports that the unaffiliated category is even more prevalent among the youngest people in the sample, with a full twenty-five percent of those between eighteen and twenty-nine saying that they have no religious affiliation.113 This may indicate that secularism will increase in the coming years among the U.S. population, although it is also possible that many people become religiously affiliated later in life when they move into a community, have children, or start getting sick.
More significantly, however, forty-four percent of Americans report that their current affiliation is different from the one in which they were raised. As the Pew Foundation Study notes, this figure probably underestimates the total amount of denomination shifting because it does not include people who left their birth religion and subsequently returned to it. In addition, among highly religious people—those who attend services at least once a week—fully twenty-eight percent go to services outside their own faith; among the more casually religious, who attend on a monthly basis, that figure rises to forty percent. Eastern and New Age beliefs are prevalent among those identifying themselves as Christians; twenty-two percent say they believe in reincarnation, twenty-three percent believe in astrology, and about twenty-seven percent believe they have communicated with the dead. Their behavior is reminiscent of the comment, attributed to G. K. Chesterton, that “when people stop believing in God, they don’t believe in nothing, they believe in anything.”114
Paralleling this religious promiscuity is Americans’ well-documented ignorance of doctrine. Another Pew study found that the Americans who scored highest on a simple test of religious knowledge were atheists and agnostics.115 This is not entirely surprising, since the test included questions about non-Western religions. But other surveys reveal that only half of Americans can name a single one of the four Gospels, that thirty-nine percent think that the Old Testament was written in the decades following the death of Jesus, that only forty-two percent know that Jesus delivered the Sermon on the Mount, and that ten percent think that Noah’s wife was Joan of Arc.116 Will Herberg’s explanation is that religious observance in the United States is often a means of asserting one’s ethnic identity rather than a mode of worship or an expression of belief.117 More recently, it has also become a type of political statement, with the denominational differences that once set Western Europe ablaze all but forgotten as Christianity in general serves as a means of mobilizing support for conservative political positions.118
Taken together, the data indicate that people in the Western World, including the relatively devout Americans, no longer regard religion as an all-embracing belief system but rather as one possible means of self-fulfillment. They see themselves as having spiritual needs, just as they have career needs, sexual needs, and intimacy needs. In their search for the most fulfilling ways to satisfy these needs, they shift from one denomination to another, attend services at still another, and then harvest appetizing doctrines from entirely different ones. The majority of Americans have not become secular, but what they have done is assimilate religion to the new morality. In some sense, therefore, even the most impious premodern people were more religious than all but the most devout people in our modern world. A person in the Middle Ages may have murdered, stolen, fornicated, and skipped Sunday services but still saw God in the rising of the sun and the shining of the stars, still measured the year by religious festivals, and never questioned the validity of Christian doctrine. For modern people, in contrast, religion is cabined in the realm of spirituality or recruited for secular functions, an aspect of life rather than its overarching purpose. Given the ease and prevalence of changing one’s denomination, even those who remain securely within their birth religion are doing so as a conscious choice; they are treating religion as one part of their life path, one of several means by which they strive for self-fulfillment.
A natural concomitant of this demotion and delimitation of religion is a level of tolerance that is far more consistent with the new morality of self-fulfillment than with the previous morality in which religion played so large a role. At the most obvious level, a Catholic who converted to Methodism, attends Lutheran services almost half the time, cannot name even one of the first four books of the New Testament, and believes in the Hindu doctrine of reincarnation is unlikely to demand that people be compelled to follow the one true religion. More basically, a person with such a jagged spiritual trajectory is almost certainly imbued with modern morality and thus committed to the principle that other people are entitled to find fulfillment in whatever way they can. This means not only that people should be free to choose their own religion and religious practices but also that it is deeply immoral, as a social matter, to express disapproval of the choices they make. Part of the story of Charlemagne as a great, heroic king, it will be recalled, is his destruction of Irminsul, the towering tree pillar that was sacred to the pagan Saxons. But when humans destroy the Home Tree of an alien species in the movie Avatar, the filmmaker’s sympathies, and the modern audience’s, lie with the aliens and favor toleration for their forms of worship.119 Even conservatives, who tend to be religious traditionalists, embrace this view, as their alliance with other conservatives from different denominations indicates.
Once again, this transformation of private attitudes is co-causal with the development of the administrative state. As Chapter 2 described, toleration, which was inconceivable in Medieval Europe, did not follow naturally when the Reformation shattered the religious unity of Europe. Instead, state religions remained the norm, as embodied in the principle that “the ruler determines the religion” (cujus regio ejus religio).120 As long as the morality of higher purposes prevailed, this principle would persist, in part because each ruler was genuinely concerned that his subjects not endanger their immortal souls by straying from what he perceived to be the truth, and in part because religious conformity was seen as essential for maintaining social order and central government control.
Religious toleration, although advocated by political theorists from the late seventeenth century onward, was implemented only in the administrative era. It sprang from numerous sources, including the work of these political theorists, but one major factor was the modern belief that the purpose of the government is to satisfy the people’s needs, as they themselves define them. In any religiously pluralist society, such as Britain, the Netherlands, and Germany in addition to the United States, this means that imposition of a state religion is forbidden. At a deeper level, it means that the government of any modern state must not intervene in the process by which people define their spiritual needs. Despite the extensive inculcatory powers that a modern state can deploy through its educational and propaganda apparatuses, the government must desist from any effort to tell people who they are or what they want. Twentieth-century Communism and Fascism, which, as Chapter 3 described, were efforts to resist the onset of these modern views, attempted to define Socialist or Fascist Man. Most Western people now regard these efforts as a violation of the administrative principle that the government must serve the people and thus as immoral. This attitude fostered the governmental policy of religious toleration, of course, and that policy encouraged the further evolution of the attitude. Social conservatives in the Western World have nibbled at the edge of this policy of toleration, but few question its basic premises. In accepting them, however, they are in effect accepting modern morality; toleration is inconsistent with the morality of higher purposes.
Closely related to religion, in terms of a person’s sense of self, is modern therapeutic practice. Just as modernity has replaced religious explanations for people’s external world with natural science, it has replaced religious explanations for people’s internal experience with psychology. When premodern people felt they needed outside help to deal with self-doubt, grief over a loved one’s death, marital problems, difficulties with a child, or similar concerns, they typically went to the priest or minister and received advice, reassurance, or admonitions that were invariably embodied in religious terminology.121 Now people consult the profusion of therapists that modern culture makes available—insight analysts, marital counselors, child psychologists, addiction specialists, grief counselors, career counselors, and innumerable others. The resulting advice, whether personal, social, or pharmacological, is almost always couched in psychological terms, in part because modern people perceive their difficulties in these terms, and in part because the proliferation of therapies and therapists confirms and strengthens this perception.
Critics of modern culture not only condemn the allegedly irresponsible and self-indulgent character of therapy, as discussed in Chapter 3, but also excoriate modern people for their therapeutic promiscuity, their tendency to move or flit or drift from one type of counseling to another. The observation seems accurate; few modern people define themselves by their commitment to or reliance on any particular mode of therapy. Rather, they see therapy as an aid to self-fulfillment. Their choice of a therapist is dictated by their view that mental health is essential for fulfillment, but the different kinds of therapy they choose depend on the stage of their life path during which their problems arise. When the critics condemn this practice, they are holding modern people to the standards of premodern morality, in particular to the norms of Western religion, which demanded that belief be exclusive. In fact, modern people reverse the analogy. They shop for different sorts of therapists because their primary motivation is self-fulfillment, and then transfer this same approach to their religious practices.
Leisure activities—that is, activities that people carry out for pleasure at times when they are not working for a livelihood—may be a human universal,122 but here again, their nature and meaning change with changes in their culture, and their relationship to the prevailing concept of morality changes as well. One distinctive feature of modern self-fulfillment morality is that its emphasis on the self as a decision-making entity and on fulfillment as a means of guiding those decisions leads to a luxuriant florescence of hobbies as a leisure-time activity. A hobby can be defined as a sustained effort that is unrelated to a person’s career or social role and is thus an articulated component of a person’s life. In premodern Europe, hobbies were often the particular preserve of the elite and often associated with eccentricity. In the modern West, they represent an important source of meaning for many individuals and, taken together, a huge sector of the society and its economy.
In 1958, a team of climbers using hardware and stirrup ladders first climbed El Capitan, the nearly 3,000-foot rock wall that looms over California’s Yosemite Valley. (See Plate 19.) They followed a route named The Nose and completed the climb in 47 days over a 17-month period. In 1994, Lynn Hill free climbed The Nose (that is, climbed with only a safety rope) in 23 hours. The current record for the climb is 2 hours, 36 minutes, and 45 seconds. There are now more than a hundred established routes up El Capitan, and thousands of people have free climbed them, often as many as sixty a day during the high season. Some of them have also free soloed smaller rock walls in Yosemite, which means climbing with no rope at all. These climbers gather at Camp Four, the famous meeting place where people who are about to ascend one of these rock walls, or have already done so, can share information, experience, and legends about past efforts.123
During the three-month period from May to July 2011, there were at least 312 baseball card shows at various locations in the United States.124 (See Plate 20.) Although the cards are sold extensively on the Internet, these shows often attract hundreds of people who come armed with a list of the cards they need to complete various sets and a magnifying glass to examine the edges and corners of the cards on sale. To be sure, demand is fueled by the retaliatory profligacy of middle-aged men whose mothers threw out their childhood card collections, “the Oedipal tragedy of the 1980s,” according to novelist and baseball writer Luke Salisbury.125 But the market, in which rare or famous cards in good condition command more than $1,000, extends well beyond that particular source of trauma. Given their value, the cards can, of course, be resold, sometimes for a considerable profit, but the typical collector would sooner die than part with them.
The Klingons are an alien race of intrepid, relentless warriors, with ridged foreheads and leathery skin, who appeared as enemies of mankind in the Star Trek television series and early movies but in later movies became allies of the human-led United Federation of Planets in its battles with the Borg. Marc Okrand, a linguistics scholar hired by the show’s producers to develop a language for the Klingons, published The Klingon Dictionary in 1985.126 By some estimates, more than a thousand people in the Western World currently speak Klingon with varying levels of proficiency. Their efforts are supported by the Klingon Language Institute, which publishes a quarterly peer-edited journal, HolQeD, discussing various aspects of Klingon language and culture. The Institute’s seventeenth annual meeting (qep’a’ wa’maH SochDIch, in Klingon) was held in July 2010 outside Philadelphia, and Klingon speakers appear at the frequent and widely attended Star Trek conventions as well. (See Plate 21.) Of course, learning Klingon would not function as a hobby for the characters in the Star Trek movies, since it would serve a functional purpose, so Star Trek V: The Final Frontier demonstrates the modern sensibility of its characters by opening with a scene of Captain James Kirk spending his leave time free soloing El Capitan.127
Climbing El Capitan may be extreme, baseball card collecting may be neurotic, and learning Klingon may be idiosyncratic, but these particular activities are emblems of literally thousands of the sports, collections, and self-education efforts with which people in the Western World decorate their life paths. They embody the now prevailing ethos of self-fulfillment in their fusion of pleasure and planning. While properly classified as hobbies because they are articulated enterprises that people do for the pleasure they derive from the experience, not for some higher purpose such as saving their souls or serving the nation, or even for some intermediate purpose like making money or increasing their social standing, they are hardly momentary pleasures. To free climb a rock face requires a phenomenal amount of training and experience, to rebuild a baseball card collection on the ruins resulting from a mother’s negligence takes years of searching and organization, and to learn an imaginary language demands an impressive level of concerted effort. In other words, these hobbies reflect, in a unified, self-contained, and concentrated way, the attitude toward life in general that serves as the basis for the new morality.
To be sure, people in the Western World do not usually consider hobbies as a matter of morality. There is certainly no moral obligation to have a hobby, and while getting one may be something that therapists often recommend, it can also be something they need to counteract. Peer group organizations such as Disney Addicts Anonymous attempt to help people who feel that their desire to collect the multitudinous paraphernalia that the Disney Company produces is impairing their lives. But in both their positive and negative effects, hobbies not only arise from the same ethos that defines self-fulfillment morality but also reflect its essential non-judgmentalism. They are intrinsically pluralistic and nonexclusive. The very nature of a hobby is that different people will have different ones. Indeed, part of their pleasure, particularly in a mass society, is that they will be pursued by relatively small groups of people who can bond with each other around their common interest. Camp Four, the baseball card shows, and the qep’a’ wa’maH SochDIch are communal experiences that contradict the supposedly anonymous nature of the modern world that sociologists such as Robert Putnam have bemoaned.128 At the same time, none of these activities precludes similar activities. The people in Camp Four are in no way discomfited by marathon runners, baseball card collectors need not worry about numismatists, and the Klingon speakers can readily acknowledge that other people prefer to learn Italian.
Hobbies such as these can be regarded as products of the West’s material conditions, specifically the leisure time and surplus resources that advanced industrialism has provided. That is certainly a contributing factor, as is the remarkably well-developed system of nature preserves, spectator sports, and mass entertainment that provides the content for these hobbies and so many others. It would be overly hasty to assume, however, that the average person’s life in the premodern world consisted of unremitting drudgery and bare subsistence. There was leisure time, if not as much, and resources were devoted to it, although fewer. What is truly lacking is any record of how the time and resources were used. The meager iconographic and literary evidence we have suggests that people’s leisure and surplus, at least in the villages where the great bulk of the European population lived, were devoted to communal festivals. People celebrated social events such as weddings and births, agricultural events such as planting and harvesting, and religious holidays in nonreligious ways.129 It is difficult to imagine the inhabitants of a modern suburb all gathering in some communal space to dance. Some of them love dancing, of course, but they will seek out settings where people from many different communities gather for the specialized form of dance that appeals to them—disco, break dancing, ballroom dancing, folk dancing, and so forth.130 What distinguishes the modern world from its predecessor, then, is not the existence of leisure activities but the fact that these activities are matters of individual choice and distinctively developed skills.
So much for life; it is now necessary to consider death. This is a topic that is difficult to write about without at least attempting to be profound, but perhaps that inclination is itself a product of premodern morality. In the Early Middle Ages, it was important to die with honor; in the High Medieval period, it was important to die in a state of grace. Both attitudes suggest that death is a significant event and that it might be a good idea to be at one’s best when one confronts it. In the modern world, however, the preferable approach may be to reach the point where one has grown too old and sick to care. According to the morality of self-fulfillment, the circumstances of one’s death do not give meaning to life; the circumstances of one’s life achieve that goal.
As with the life path itself, considering the philosophic treatment of this subject can help locate self-fulfillment morality within modern thought and clarify its content. The branch of philosophy that has devoted the most sustained attention to the significance of death is existentialism, and Heidegger is probably the existential philosopher for whom it plays the largest role. According to Heidegger, the constant possibility and ultimate certainty of death demands that a human existence (Dasein) achieve the state of authenticity that was previously referenced. At first, this might seem allied to a premodern attitude toward death, but Heidegger does not view a person’s death as an opportunity to achieve either honor or salvation. Rather, it is an inherent feature of existence that determines the person’s attitude toward life, which is Heidegger’s primary concern. As he says, Dasein “does not have an end at which it just stops, but it exists finitely.”131 Since the pop aphorism that “life’s a journey, not a destination” has already been cited, it is perhaps not inappropriate to quote another here: “live every day as if it might be your last.” It has a more exalted origin, being attributed to Marcus Aurelius,132 but is also featured in a pop song, in this case by Nickelback.133 (The Klingon translation has yet to appear.) This common phrase captures the modern attitude that Heidegger reflects, and perhaps influenced, in suggesting that the possibility of one’s demise at any given time requires the individual to strike a balance between pleasure in the present and planning for the future.
Heidegger was a secular thinker, and several other leading existentialists, such as Nietzsche and Jean-Paul Sartre, were openly hostile to religion,134 but still others, such as Søren Kierkegaard, Gabriel Marcel, Paul Tillich, and Martin Buber, were deeply religious.135 The complex thoughts of these religious thinkers on the relationship between theology and death resist rapid summary. One theme that can be discerned, however, is their lack of interest in the vulgar rewards of paradise and sadistic punishments of damnation, whose images were emblazoned over the entrance of so many churches in premodern times. (See Plate 22.) Instead, they join Heidegger in treating death as defining human finitude. It is that finitude, not the possible rewards and punishments, that poses the challenge to us about how to live our lives. As Tillich says, it requires us to develop “the courage to be.” In some sense, therefore, despite the unquestionable intensity of their faith, these existential theologians are acceding to the modern view that religion, if it is to remain relevant, should focus on the way to live, not the way to sacrifice one’s life on the altar of the soul’s salvation. Like Heidegger, they reflect the transition from the previous morality’s belief that death can serve a higher purpose to the modern belief that it is simply the end of a life whose value depends on the way that it is lived.
With respect to the relationship between morality and governance, modern morality’s concept of a life path implies that each person’s life is all he or she definitively possesses and thus is of incalculable value to the individual. Its secondary principles of incommensurability and equality insist that each person’s life is as valuable as any other person’s. These ideas are co-causally related to the solicitude for life that characterizes modern administrative government, particularly when linked to representative democracy. All the West’s modern administrative states deny any private citizen the right to kill another. While this prohibition emerged gradually as part of the second millennium’s publification process, with its abolition of the bloodfeud and its criminalization of dueling, it was not definitively imposed on societal elites until the administrative era. In the antebellum American South, for example, one of the last bastions of higher purposes morality, murdering one’s slave was illegal in theory but accepted or ignored in practice.136 It was also during the administrative era that the West’s democratic regimes definitively implemented due process and other protections for the criminally accused, thereby protecting people against false convictions for crimes that were so frequently punishable by death.
In addition, the administrative era began with the abolition of public executions in Western Europe137 and later in the United States.138 As of the twenty-first century, every Western jurisdiction but one has abolished the death penalty.139 The United States is the notorious exception, of course,140 but it is more accurately described as an exception with widespread exceptions. Executions in the U.S. had virtually ceased by the late 1960s, and restrictions imposed by the Supreme Court’s 1972 decision in Furman v. Georgia presaged a general abolition.141 But the decision was effectively overruled in 1976 by Gregg v. Georgia,142 and the practice revived during the following decades as part of a highly touted “war on crime.” Nonetheless, some eighteen states have no death penalty, and another fourteen have executed fewer than ten persons since the Gregg decision.143
Most of the 1,340 executions in the United States during this period have been carried out in the Southern states, with one state, Texas, accounting for more than a third of the total.144 This is consistent with these states’ heritage, first of slavery and then of segregation, which led to a specific rejection of the new morality’s ethos of equality and a general opposition to the national authority, which was increasingly allied with administrative governance and the new morality. Moreover, several scholars have noted survivals of the even older morality of honor in Southern states,145 perhaps because slavery and its aftermath rendered government control ineffective, as in the Chicago housing projects. The Southern slave plantations were self-contained private regimes, administering their own law to a subordinate population like Early Medieval fiefs, while the efforts to oppose Reconstruction, as Franklin Zimring suggests,146 generated a pattern of vigilantism that resembled Early Medieval bloodfeuds. The current trend in the remainder of the United States, supported and encouraged by other Western nations’ widespread condemnation of American practices, points toward an ultimate abolition of the death penalty and can be regarded as a confirmation of the new morality’s predominance.147
The increasing solicitude for human life that characterizes administrative government is limited to people who are perceived as members or citizens of the polity, for reasons that will be discussed at greater length in Chapter 6. Modern governments are no kinder to people they regard as Other than were the premodern governments that were prepared to sacrifice human life to serve a higher purpose. At the same time that Belgium was treating its own citizens so solicitously, its king, Leopold II, was perpetrating a holocaust in the Congo Free State.148 American Southerners continued their cavalier attitude toward the lives of African Americans even after the Civil War, and the national government displayed the same attitude toward Native Americans for most of the nineteenth century.149 Even Nazi Germany, although it rejected the morality of self-fulfillment in favor of a modernized version of higher purposes, as discussed in Chapter 3, needed to redefine the Jews as Other before it slaughtered them.150
The primary importance that self-fulfillment morality confers on life, from the individual’s perspective, and the solicitude toward citizens that it urges from the government’s perspective, do not mean that modern administrative regimes are unable to defend themselves. Their citizens will not fight out of a sense of honor or even to serve the nation’s purposes, but they will fight to defend their way of life. Modern nations will be able to defend themselves as long as they retain the loyalty of their citizens, that is, as long as their political regimes function as effective systems of belief. In addition, their ability to obtain the loyalty of their citizens, combined with their ability to define noncitizens as Other, has enabled administrative nations, democratic as well as nondemocratic, to engage in the same military adventurism as their premodern predecessors. But their solicitude for life may help explain the observed fact that modern democracies, which increasingly need to answer to a populace committed to the morality of self-fulfillment, do not go to war against each other.151 More generally, self-fulfillment morality seems to be exercising a restraining effect on Western nations, counteracting their prior tendency to resolve international disputes through military force.152 The United States is the outlier among Western nations in this regard as well. Even there, the tremendous opposition to the Vietnam War may have persuaded the government to conduct its aggressive adventures with volunteer soldiers, so that the risk of death is redefined as the consequence of choosing a dangerous career, like coal mining, rather than an element of one’s national duty.
Although self-fulfillment morality establishes and demands respect for life as the totality of the individual’s existence, it also provides that people’s decisions about continuing their lives are theirs alone to make. Christian morality categorically prohibits suicide, and Dante placed suicides in the seventh circle of hell, where they turn into stunted trees producing poisoned fruit.153 This prohibition may have originally been motivated by the early Church’s desire to forestall mass suicide among followers to whom it had painted paradise in such luscious colors,154 but by the Middle Ages it represented a sincere application of the morality of higher purposes; one’s life was God’s, and perhaps the king’s, to dispense, but not one’s own. According to self-fulfillment morality, however, life is to be treasured as the totality of existence, provided it can be fulfilling, and, however tragically, it may be ended when fulfillment is no longer possible. Modern morality abjures suicide for the sake of honor, and self-sacrifice for higher purposes, but its ruling principle is ultimately self-fulfillment and not life per se.
These attitudes are clearly relevant to governmental policies, with the co-causal relationship between the two producing present controversy and presaging future change. Suicide was once illegal throughout the Western World, but modern morality has led to the repeal of these somewhat futile measures. Laws against assisted suicide remain widespread, however.155 There is certainly an intrinsic oddity about them. Some communities may forbid their residents to burn autumn leaves on their front lawns, others may permit them to do so, but it would be strange if a community permitted people to burn their leaves but forbade them to obtain assistance from a gardening service in doing so. The mundane analogy is not intended to demean the significance of death, but rather to suggest that the emotive freight it carries has produced the legally atypical result of forbidding people to obtain assistance in carrying out a legally permitted act.
Prohibitions against assisted suicide cannot be based on concerns about the competence of those providing the assistance. We know exactly which people are qualified to carry out this role, namely physicians, and every Western nation already has an elaborate licensing system in place to ensure their competence. Rather, the prohibition reflects a residual discomfort with suicide itself, a remnant of the previously dominant morality of higher purposes. At present, opponents of assisted suicide tend to base their arguments on pragmatic considerations, such as the complexity of the determination or the unreliability of family members, rather than on morality.156 This substitution of self-fulfillment-based arguments for previous arguments based on the higher purposes of life indicates the growing predominance of the new morality.157 It can be predicted that blanket prohibitions on assisted suicide will be repealed over time. The result will not be a complete lack of control, however, but rather more modulated rules based on the new morality. It is currently well established that a physician should not engage in any action, regardless of the patient’s wishes, that she deems unethical. Thus, the likely content of future legislation will be that a licensed physician can assist a person who wants to die if she determines, on her own, according to objective standards, that the person can no longer live a fulfilling life.158 As a practical matter, this will probably limit assisted suicide to the terminally ill.159
These considerations lead naturally to the closely related issue of euthanasia. The morality of self-fulfillment has already generated the principle that people have the right to refuse medical interventions.160 The issue of euthanasia arises for those who are no longer capable of expressing their own wishes. Because modern medicine has given us the capability to maintain certain people in some state of low-level physical survival for extended periods, hospitals must decide when such heroic efforts are pointless, cruel, or a combination of the two. The treating physician can ask authorized relatives but must ultimately make her own ethical determination. At present, the most commonly withheld treatment is cardiopulmonary resuscitation (CPR), and the most common reason for withholding it (the Do Not Resuscitate, or DNR order) is medical futility.161
To some extent, this sidesteps the ethical issue; if treatment is genuinely futile, then nothing of particular value is being withheld. More important, it probably does not cover even current practice, where treatment that can genuinely prolong a person’s life is withheld if there is no chance of recovery. It will certainly not cover all the future quandaries that will inevitably arise as medical technology advances. Self-fulfillment morality suggests a quality-of-life determination: do not intervene if the patient has no possibility of a fulfilling existence for any appreciable period of time.162 This is the same principle that suggests, as indicated earlier, that the new morality may imply an obligation to terminate a pregnancy that will inevitably produce a severely disabled child. Both suggestions are highly controversial, and their ultimate acceptance is unclear at present.163 But this is to be expected; the kaleidoscope of moral transformation turns prohibitions into accepted practices, as well as turning previously accepted practices, such as slavery, religious persecution, and public execution, into absolute, unquestioned prohibitions.