Before we return to specifics, let’s consider why we, humanity, came to think of the future as different from the past and the present at all. Even as we celebrate very old concepts of the future, and the stock characters who see into it, we can easily obscure how different our modern perspective on the future is from that of the ancients.
Ancient views of the future are still very much alive in our imagination today, represented in film and other media. Even when the film is set in the future, as with The Matrix (1999), visiting a traditional divinatory character—the Oracle—can be central to the plot. A prophetic vision of the future, seen at a physical location that inspires such visions, is also portrayed in the futuristic “long time ago” world of The Empire Strikes Back (1980), in which Luke Skywalker enters a cave that enhances his connection to the future, not unlike the Oracle of Delphi. “Old man” and “crone” characters with special knowledge of the future remain staples of storytelling, allowing possible surprise to be converted into suspense as the development of the plot is revealed to us. It shouldn’t be surprising, though, that these devices for good storytelling are not accurate historical representations of how the future was seen in the distant past. These characters and incidents fit with our worldview today; the ancients saw the future—and oracles, prophets, and divination—quite differently.
Oracles and prophets of early history and prehistory are discussed in a very interesting way in Julian Jaynes’s 1976 book, The Origin of Consciousness in the Breakdown of the Bicameral Mind. Jaynes posits that as recently as a few thousand years ago, people did not have consciousness as we experience it today. Instead, they simply acted, fully engaged with the world, in ways that did not involve the sorts of introspection that are usual now. Rather than thinking about new situations and events in the way we are accustomed to, and making decisions as we do today, they experienced auditory hallucinations from one side of their minds that advised the other side on courses of action—as if they heard the voice of a god. It was the breakdown of this previously divided mind, Jaynes argues, and the formation of the type of mind we have today, that gave rise to consciousness. The theory has some strong adherents today; others find the underlying concepts provocative and potentially useful even if some of the evidence and specifics are flawed; still others find the ideas to be of little use—opinions on this theory, we could say, are more than bicameral. However, the idea of bicamerality that Jaynes introduced is still quite present in popular culture; its influence is seen very strongly in the recent TV series Westworld (2016–), for instance.
One fascinating aspect of this theory of bicameralism is that, if accepted in some form, it can explain historical practices of divination, which according to Jaynes were undertaken to try to recover the god’s disappearing voice. This voice is said to have become silent during this period of breakdown, when our current sort of consciousness arose. Even if cognition was improved in many ways by the change, the loss of this voice was distressing, according to Jaynes, and people sought a return to the old mode of consciousness and the way in which the advice-giving mental voice would speak. In making the case that oracles and prophets result from this cognitive change, Jaynes provides a detailed and cross-cultural look at how such voices spoke in early history and how the future was spoken about or seen by the ancients.
Bicameral voices weren’t exclusively talking about the future. A prophet is often thought of as someone who foretells what is to come, but this is only one of the roles of prophecy and divination.1 Still, the ancient prophet would have been one of the most future-oriented members of society. Techniques for divination, although they could be used to locate water or to decide who has to do an unpleasant task, were also some of the practices most involved with the future as we now think of it, even at a time when the contemporary concept of the future, and certainly of future-making, had not taken hold.
Of the four types of divination that Jaynes describes—the others being omens, augury, and spontaneous divination from the surroundings—a particularly interesting way to learn of divine will, and perhaps see into the future, is casting lots. Jaynes explains that the meaning of this activity was quite different centuries ago. “We are so used to the huge variety of games of chance, of throwing dice, roulette wheels, etc., all of them vestiges of the ancient practice of divination by lots, that we find it difficult to really appreciate the significance of this practice historically,” Jaynes writes, explaining that what we understand as “chance” is a recent invention, unknown to those who undertook these techniques: “Because there was no chance, the result had to be caused by the gods whose intentions were being divined.”2
Determinism does have its adherents today, yet the concept that things happen by chance is quite prevalent in many cultures and a commonplace of everyday discussion. We might flip a coin to decide where to go eat or which movie to see—not because we expect the gods to intervene, but because we care more about getting on with it than about a specific outcome, and we’re willing to leave the choice to chance. Or, we might unskillfully throw darts at the newspaper’s stock listings and invest in whatever we hit. (Inspired by the book A Random Walk Down Wall Street, The Wall Street Journal did this for more than a decade, from 1988 until 2002, and their investments did rather well, although their returns didn’t exceed those of stocks picked by professional investors.) We also might try to intentionally randomize our actions if we’re playing rock paper scissors or are involved in some similar activity. In rock paper scissors against a capable opponent, the best one can do is make truly random moves—the Nash equilibrium of the game is for each player to independently choose among the three moves with equal probability on every turn, so that each player wins, loses, and ties a third of the time in expectation. Any deviation (shifting to slightly favor rock, for instance) leaves the player open to exploitation: the opponent could then play paper every time and would win more often than lose.
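To see the arithmetic behind that exploitation, consider a minimal sketch, meant only to illustrate the game-theoretic point rather than anything from the sources discussed here. The short Python program below computes expected outcomes for an opponent who always plays paper, first against uniformly random play and then against a hypothetical player who leans toward rock with a 40/30/30 mix; the always-paper strategy and that particular bias are assumptions chosen to make the effect visible.

```python
# A minimal sketch illustrating why any bias in rock paper scissors is exploitable.
# The 40/30/30 rock-leaning mix and the always-paper counter-strategy are
# hypothetical choices for illustration, not taken from the text.

from itertools import product

MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}  # key beats value

def expected_results(player_mix, opponent_mix):
    """Return (P(player wins), P(tie), P(opponent wins)) for two mixed strategies."""
    win = tie = lose = 0.0
    for p_move, o_move in product(MOVES, MOVES):
        prob = player_mix[p_move] * opponent_mix[o_move]
        if p_move == o_move:
            tie += prob
        elif BEATS[p_move] == o_move:
            win += prob
        else:
            lose += prob
    return win, tie, lose

uniform = {"rock": 1/3, "paper": 1/3, "scissors": 1/3}
rock_biased = {"rock": 0.4, "paper": 0.3, "scissors": 0.3}
always_paper = {"rock": 0.0, "paper": 1.0, "scissors": 0.0}

# Against uniform play, always choosing paper gains nothing: each side wins,
# loses, and ties a third of the time.
print(expected_results(uniform, always_paper))      # approximately (0.333, 0.333, 0.333)

# Against the rock-biased player, always choosing paper wins 40 percent of
# games and loses only 30 percent.
print(expected_results(rock_biased, always_paper))  # (0.3, 0.3, 0.4)
```

The same calculation applies to any deviation from the uniform mix: whichever move the biased player favors most, the move that beats it comes out ahead in expectation.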
If the concept of chance did not exist in the time of casting lots to make decisions and predict the future, however, the practice would have to be understood very differently, without any sense of arbitrariness. Every predictive token would be placed deterministically; it would just be placed for a reason that is far beyond us. Stéphane Mallarmé wrote that a throw of the dice will never abolish chance, but Jaynes believed that it did just this for the ancients, in the sense that the practice, and the culture around it, lacked the concept of chance. This helps to explain something recounted in the New Testament: what it means that the selection of Matthias as an apostle was done by casting lots. It wasn’t that the apostles were gambling or leaving things to chance, in the senses we now give those terms. Rather, they were allowing “the gods”—or in this monotheistic case, one god—to make the determination for them. Casting lots was the way to hand over the decision to divine agency.
This sort of attitude toward the future is, of course, not future-making; it isn’t even the weaker predictive and reactive attitude in which different possible scenarios are considered. The future is as good as set, and people go through the motions of casting lots to allow the gods to turn them about, to be directed. Perhaps the best way to understand why people undertook this practice, consulted oracles, and listened to prophets is not to assume that they wanted to think more effectively and incisively about the future. Rather, they did this instead of thinking about the future. In this way, this type of decision making might be a bit like having a “torn choice” (where one alternative is about as good as the other), flipping a coin to make a decision expediently, and getting on with it rather than thinking through one’s future more fully. But the process wasn’t exactly like that. Instead of taking the attitude that it doesn’t matter either way, the ancients using processes of divination were likely thinking that what action was undertaken did in fact matter, and that because of this, they didn’t want to decide what to do.
Why would this attitude, which seems fatalistic today, have prevailed in the past? Economist and historian Robert Heilbroner maintains that people of early times held a static view of their society, imagining that the future was an extension of the present, just as the past was. He argues that the idea of improving society, of a future that could be different—and indeed, of progress—was not developed until the middle of the eighteenth century. While a social organization that made change unlikely was part of the reason, aspects of the prevalent worldview also contributed to this perspective on the future.
Many centuries ago, Heilbroner explains, as history was being developed and before there was an idea of what prehistoric humans were like, societies generally imagined themselves to be of divine origin and to have always lived in a condition similar to their current state. People were most concerned not with invention, trade, discovery, or learning new things but with the natural forces that beset them, which could threaten their lives or provide for a good harvest. Nature, and these forces, might manifest differently at different times, of course. But it was thought that nature would stay more or less the same over the years. There was no reason the future would be different.
Later, when human nature rather than the natural world became central to people’s concerns, the belief in the static nature of societies persisted, with unchanging human nature taking the place of unchanging nature. Heilbroner cites Machiavelli, who wrote early in the sixteenth century: “Whoever wishes to foresee the future must consult the past; for human events ever resemble those of preceding times. This arises from the fact that they are produced by men who ever have been, and ever will be, animated by the same passions, and thus they necessarily have the same result.”3 This idea, of course, exhorts those seeking to foresee the future to look at history—but for very different reasons than we would imagine now. Reading history would not show the seeds of the current situation and help one think about how it might grow into the future. It would simply be an opportunity to look at some documentation of essentially the same stasis as that in which we currently reside, to understand people of the past who are the same as people today.
Despite our modern concept of the prophet as one who can look ahead, the ancients were not deeply concerned with even knowing the future, and certainly not with making it. As Heilbroner writes, “Resignation sums up the Distant Past’s vision of the future.”4 Even though “prophecy” is used again and again in movie titles, even though we name our commercial computing systems “Oracle” and “Delphi,” there is little evidence that ancient prophets and oracles were visionary leaders who helped to make the future of their societies. Whether we accept the idea of bicameral breakdown or not, such figures seem to be part of static, ancient societies, ritually handling concerns for the future without developing more profound ideas of foretelling, and certainly without developing ideas of future-making.
So, it seems that this was our early, resigned, deterministic stance. Before we can even take a predictive view of the future, and certainly before we can engage in future-making, we have to see how our concept of the future, as we know it today, actually came about.
To understand cultural concepts of the future, it’s worthwhile to begin by contrasting a belief in stasis—that over years and generations, human society remains essentially the same—with a belief in progress. These aren’t the only cultural worldviews. For instance, the early Greek poet Hesiod, departing from the view of resigned stasis, described the people of his time as being at the end of a period of decline, as being of the race of iron, the final, most base race of man. There is also a cyclical view, which in the long term is a static one but involves improvements and downfalls along the way. Still, the difference between a belief in a generally static social condition and a belief that things have improved, and will continue to improve, is an essential difference in views of the future. This shift is deeply involved in developing the idea of a future that can be co-constructed by members of society. I discuss the idea of progress not because I believe things will necessarily get better on their own, but because having any view that goes beyond stasis is necessary for getting to the idea of future-making.
In classical times, and even in times before, it was not really feasible to believe that one’s world was in absolute stasis. The seasons caused observable changes throughout the year almost everywhere, and a winter could be better or worse, a harvest more or less abundant. But many believed that the world was generally static, only perturbed by occasional natural disasters or conflicts. There was no cultural idea that society was radically improving—or undergoing a decline—or even that progress was possible.
Perhaps inspired by the nature of the seasons and natural disasters, both Plato and Aristotle described society as cyclical. In Plato’s Laws and in Aristotle’s Politics, human society is described as moving from the family unit through different forms into that of the city-state, which is the political realization that allows for human excellence. Of course, this is not an idea that allows for further progress, since it names the current form of government as the best. Furthermore, the city-state cannot endure, in the view of Plato and Aristotle, so society has not progressed to a stable form, but just happens to be at this point as it makes a circuit.
This seems like an odd view, given that Plato designed a society in his Republic, providing the prototype for utopian writing and the imagination of better societies. But Plato didn’t believe that any society, even his carefully crafted one, was immune to collapse or able to improve continually.
The first major ancient philosopher to reject the cyclical view of society and advocate for a linear one was Augustine of Hippo. A cyclical view would have been hard to reconcile with Christianity as we understand it today. Christ’s coming to earth and being resurrected, according to Augustine, were singular events. Some other Christian thinkers at the time held that they were part of an endlessly recurring cycle, but that belief ended up being condemned as heretical. Augustine’s view eventually prevailed. Although these events—unique events according to Christians—resulted in redemption and improvement of some sort, there was not a particular idea of earthly progress that went along with Augustine’s idea of spiritual betterment. The idea that our social conditions were improving came about during the Enlightenment, in the eighteenth century.
The concepts of the Enlightenment were developed in that century in the wake of incredible scientific achievements, beginning with Nicolaus Copernicus in astronomy. His discoveries, along with those of Galileo Galilei and Johannes Kepler, of course suggested a new relationship to the universe, one in which the Earth was not central, nor was humankind. But the work of these scientists and of Isaac Newton also showed that the efforts of many people could be brought together to produce powerful new models of the world and insights into how the world works. It became clear that the scientific method was a powerful means of learning about the world. Since this method had been recently developed and many new discoveries had been made, it made less sense to think of the world of the Enlightenment as being essentially the same as the world had been several centuries before.
Various Enlightenment authors began to develop more specific ideas of progress. Among them was the economist Adam Smith. While he did not see a completely clear path to progress, he did describe how economies, if guided by the “invisible hand” of the free-market system, would be able to improve themselves. In the nineteenth century, two important thinkers saw history as a process of working through opposition toward betterment. Georg Hegel, with his idea of ideological development through the conflict of states, was one. The other was Karl Marx, who described the rise of capitalism and predicted the emergence of communism through historical conflict.
Much more can be said about the idea of progress from the Enlightenment up through today. Certainly, Smith and Marx did not agree about how social and historical progress was to happen or even about what it meant to improve society. Still, even a brief review of these Enlightenment thinkers is enough to show that the idea of stasis, that a society could be expected to remain the same over time, had been supplanted by their time, and by Hegel’s.
Looking beyond the nineteenth century, some major currents of thought oppose the idea that society has truly made progress. Against the many improvements, even if not universal ones, in daily life, health care, and availability of goods, we should set the fact that the twentieth century saw genocide and war on an unprecedented scale. Today, in the twenty-first century, the human population of the Earth is the largest ever, and all of us inhabiting the planet face a series of massive environmental catastrophes that rational people understand we ourselves have precipitated. Those who think deeply about the future can no longer assume the optimistic position of Enlightenment thinkers; at least, optimism is hardly the automatic conclusion.
While our challenges today may include very grim ones, the shift in outlook does not affect the basic question of how we form an idea of the future. Is our society static, as if we are offshoots of the gods placed in an unchanging world? Or is it possible for things to change—for the better, hopefully, but really in any way at all? Let’s allow that we may discard the philosophical optimism of the past—that scientific discoveries will automatically lead to improvement, that the free-market economy will improve itself, and that conflicts of ideas and classes will lead to revolutionary, better societies. Nevertheless, we can believe that the future will not be only more of the same, and that it will not be only that given to us by divine powers. Once we see that the future can be different, as those Enlightenment thinkers did, we can begin to think about shaping and making the future. Yes, even if we don’t buy every element of the centuries-old idea of progress, the concept can help us see that change, and improvement, can be possible.5 That is a starting point, at least, from which the more powerful idea of future-making can develop.