Questioning Customs

Ignorance and a narrow education lay the foundation of vice, and imitation and custom rear it up.

—MARY ASTELL

The despotism of custom is everywhere the standing hindrance to human advancement.

—JOHN STUART MILL

Normal is what you’re used to.

It seems to be a part of human nature that customs and institutions come to seem somehow inevitable and preordained. This sense, even when it is illusory, gives a stubborn staying power to habits and systems that have been around a while—even after it’s become clear that they’re no longer working very well. This is certainly the case with the educational system that most of us have known. It’s so big that it’s hard to see around it. It’s so complexly integrated with other aspects of our culture that it’s daunting to imagine a world without it.

If we are to muster the vision and the will to meaningfully change education—to bring teaching and learning into closer alignment with the contemporary world as it really is—one of the leaps we need to make is to understand that the currently dominant educational model was not, in fact, inevitable. It is a human construct. It evolved along a certain pathway; other pathways were also possible. Parts of the system we now hold sacred—for example, the length of the class period or the number of years assigned to “elementary” or “high” school—are in fact rather arbitrary, even accidental. Things that are now considered orthodox were at various points regarded as controversial and radical.

Still, changing a system of such vast inertia and long tenure is clearly not easy. It’s not just that tradition tends to cramp imagination; it’s also that our educational system is intertwined with many other customs and institutions. Changing education would therefore lead to changes in other aspects of our society as well. It is my firm belief that over time this would be a very good thing; in the near term, however, such a prospect necessarily suggests disruptions and anxieties.

Let me offer an analogy that I hope will drive home the scale of the challenge we face. Consider something as basic as the habit of eating three meals a day.

Is there some biological imperative dictating that we eat breakfast, lunch, and dinner rather than two or four or five meals a day? Some Buddhist monks eat a single meal, at midday. And there is some recent evidence suggesting that alternate-day fasting might also be a healthy option.1

Why, then, do most of us cling to the habit of breakfast, lunch, and dinner, even though most of us today do much less manual labor than our ancestors who started this custom? The answer is simply this: It’s what we’ve always done, just as we’ve always sent our kids to certain kinds of schools that operate in certain kinds of ways. It’s a cultural habit that we take for granted.

Moreover, since we are social creatures and since our interwoven lives consist of many interconnected facets, the custom of three meals a day has become part of a matrix of many other activities. The workday allows for a lunch hour. Local economies depend on restaurants serving dinner, employing staff, collecting sales tax, and so forth. Insofar as families still sit down together, it is these shared mealtimes that most often bring them together.

For all these reasons, it would be exceedingly difficult to change the culture of breakfast, lunch, and dinner. The implications of such a change would be seismic. The whole rhythm of the workplace world would be altered. Entire industries would be challenged to adjust. Even the television schedule would need to shift.

As with our eating habits, so with our teaching habits.

Entire industries and some of our very largest professions depend on the persistence of our current system. Other social institutions—like giant publishers and test-prep companies—are synched to its workings. A certain teaching method implies certain goals and certain tests. The tests, in turn, have a serious impact on hiring practices and career advancement. Human nature being what it is, those who prosper under a given system tend to become supporters of that system. Thus the powerful tend to have a bias toward the status quo; our educational customs tend to perpetuate themselves, and because they interconnect with so many other aspects of our culture, they are extraordinarily difficult to change.

Difficult, but not impossible. What’s needed, in my view, is a perspective that allows us a fresh look at our most basic assumptions about teaching and learning, a perspective that takes nothing for granted and focuses on the simple but crucial questions of what works, what doesn’t work, and why. To gain that perspective, it’s helpful to look at the basics of our standard Western classroom model, to blow the dust off and to remind ourselves how the system came to be the way it is. It’s also useful—and humbling—to realize that the debates and controversies currently surrounding education tend not to be new arguments at all; similar conflicts have been raging among people of passion and goodwill since teaching and learning began.


The basics of the standard educational model are remarkably stubborn and uniform: Go to a school building at seven or eight in the morning; sit through a succession of class periods of forty to sixty minutes, in which the teachers mainly talk and the students mainly listen; build in some time for lunch and physical exercise; go home to do homework. In the standard curriculum, vast and beautiful areas of human thought are artificially chopped into manageable chunks called “subjects.” Concepts that should flow into one another like ocean currents are dammed up into “units.” Students are “tracked” in a manner that creepily recalls Aldous Huxley’s Brave New World and completely ignores the wonderful variety and nuance that distinguish human intelligence, imagination, and talent.

Such is the basic model—schematically simple in ways that mask or even deny the endless complexities of teaching and learning. For all its flaws, however, the standard model has one huge advantage over all other possible education methods: It’s there. It’s in place. It has tenure. The tendency is to believe that it has to be there.

Yet even the briefest survey of the history of education reveals that there is nothing inevitable or preordained about our currently dominant classroom model. Like every other system put in place by human beings, education is an invention, a work in progress. It has reflected, at various periods, the political, economic, and technological realities of its times, as well as the braking power of vested interests. In short, education has evolved, though not always in a timely manner, and often not before some unfortunate cohort of young people—a decade’s worth? a generation’s worth?—has been subjected to obsolete teachings that failed to prepare them for productive and successful futures.

It is time—past time—for education to evolve again. But if we hope to gain a clearer idea of where we need to go, it’s useful to have at least a rudimentary awareness of where we’ve been.

Let’s begin at the beginning. How did teaching start?

As it was succinctly put in a recent article by an educator named Erin Murphy in the Wharton School’s online journal, the Beacon, the earliest forms of teaching and learning were essentially a case of “monkey see, monkey do.” In preliterate hunter-gatherer societies, parents taught their children the basic survival skills by practicing them themselves and, whenever possible, inserting an element of play into the process. This form of teaching was simply an extension of the way other animals also taught their young. Lion cubs, for example, learn to hunt by mimicking the stalking postures and strategies of their parents, and turning the exercise into a game. In the case of both lions and early humans, the stakes in education were of the highest order. The offspring who learned their lessons well went on to prosper and reproduce. In the unforgiving environment of the savanna, the kids who didn’t pay attention or never quite caught on were not around very long. To flunk was to perish.

As human language developed—language itself being a technology that radically changed and expanded our ways of sharing information—societies grew more complex and more specialized, and there came to be areas of desirable skills and knowledge that were beyond the abilities of parents alone to teach. This gave rise, at various times and in various forms, to the apprentice system. Significantly, the apprentice system marked the first time in human history that the main responsibility for education shifted away from the family; this, of course, opened a debate that has never died down about the respective roles of parents versus outside authorities in the education of children. Absent the bonds of family affection, the apprentice system also introduced the first clear, hierarchical distinction between the master/teacher and the apprentice/student. The master taught and ruled; the student submitted and learned.

Still, the manner of learning was a long way from the passive absorption of the more recent classroom model. Apprenticeship was based on active learning—learning by doing. The apprentice observed and mimicked the techniques and strategies of the master; in this regard, the apprentice system was a logical extension of learning by imitating a parent.

The apprentice system was also the world’s first version of vocational school. It was a place to learn a trade—though in certain instances the trade in question could be extremely highbrow. Many associate the apprentice system with artisans like blacksmiths or carpenters, but it has also historically been the primary mode of education for future scholars and artists. In fact, even today’s doctoral programs are really apprenticeships, in which a junior researcher (the PhD candidate) learns by doing research under and alongside a professor. Medical residencies are apprenticeships as well.

Be that as it may, the apprentice system represented one side of a schism that has existed for thousands of years and exists still: the belief that education should, above all, be practical, aimed at giving students the skills and information they need to make a living. On the other side are those who feel that seeking knowledge is an ennobling process worth pursuing for its own sake.

The preeminent representatives of this latter point of view were of course the Athenian Greeks of classical times. Plato, in the dialogue Gorgias, ascribes to Socrates, his alter ego and ideal man, the following statement: “Renouncing the honors at which the world aims, I desire only to know the truth.” Clearly, there’s a feisty and even defiant value judgment being made here, a slap at mere practicality. Aristotle, in the very first line of his Metaphysics, asserts that “all men naturally desire knowledge.” He doesn’t say marketable skills. He doesn’t say the right credentials to get a job. He’s talking about learning for the sake of learning, and he’s positing that impulse as the very definition of what it means to be human. This is a long way from the model of apprenticeship as a way of learning to tan hides or carve stones or even treat patients.

There is much that is appealing in Plato’s and Aristotle’s pure approach to learning as a deep search for truth; this is, in fact, the mind-set that I hope to bring to students through my videos. However, there are a couple of serious problems with the model of the classic Greek academy. The first is that it was elitist—far more so than even today’s most exclusive prep schools. The young men who could afford to hang around discussing the good and the true were oligarchs. Their families owned slaves. None of these students really needed to care about how to harvest crops or weave textiles. Real work, even work that was intellectual, was beneath them.

This led to a second, more destructive problem that still exists today. Once the pure search for truth was posited as the highest good, it followed that anything merely useful would be regarded as less good. Practical learning—learning that might actually help one do a job—was regarded as somehow dirty. And this prejudice extended even to practical subjects, such as finance or statistics, that are intellectually very rich and challenging.

As a classical legacy, this perceived separation between the truly intellectual and the merely useful was perpetuated by the European universities during the Renaissance, then passed along to the early American colleges. The same set of biases persisted more or less intact well into the nineteenth century. Throughout this period, universities were generally something of an intellectual retreat for those who did not need to work in the traditional sense—future clergy, sons of the wealthy, and those devoting their lives to arts and letters (often enabled by the patronage of a wealthy family). Careers in even very intellectual professions, like law and medicine, were developed primarily outside the universities, through apprenticeships (although a few degree programs did begin to emerge in the eighteenth and nineteenth centuries). A law degree didn’t become a mainstream credential in the United States until the late 1800s, when the completion of postgraduate instruction became a requirement for admission to the bar.2 The idea that a college degree is a prerequisite to any professional career is quite new, only about a hundred years old. The idea that everyone needs college in order to be a productive member of society is only a few decades old.

Let me be clear as to why I raise this point. I’m not suggesting that people shouldn’t go to college. My contention, rather, is that universities and their career-seeking students have a deep-seated contradiction to resolve: On the one hand, our society now views a college education as a gateway to employment; on the other hand, academia has tended to maintain a bias against the vocational.

Clearly, our universities are still wrestling with an ancient but false dichotomy between the abstract and the practical, between wisdom and skill. Why should it prove so difficult to design a school that would teach both skill and wisdom, or even better, wisdom through skill? That’s the challenge and the opportunity we face today.


But let’s get back to some history.

In terms of making knowledge available to the many, the most important technology since spoken language has been written text. Writing allowed knowledge to exist and accumulate outside of any one human mind, which meant that information could persist unchanged over generations and that large amounts of it could be standardized and distributed (without the distributor having to memorize it).

Writing was a huge step forward, but it did come with unintended consequences. Any hugely empowering new technology can increase inequity between the haves who have access to it and the have-nots who don’t. Early writings—whether on papyrus scrolls in ancient Egypt or in the parchment books of the early Catholic Church—were wonderful for those who had access to them and knew how to read, but most people had neither the access nor the literacy. So the availability of written sources, far from eliminating the elitism and class distinctions that had gone before, actually exacerbated them for a time. The privileged now had greater stores of special knowledge and therefore greater power.

And to make it clear how privileged a thing books were in their early days, think about how they had to be produced. Each copy was written out by hand by highly trained scribes with good penmanship. Consider how much it would cost to have one of the most educated people in your town spend a few years copying, say, the Bible, and you’ll have a good sense of how expensive early books were—something on the order of a decent house in today’s terms. So you can imagine that few people ever got to touch one, much less read it.

Then came primitive block printing. Now a skilled artisan could carve text and images into the surface of a wooden block, dip it in ink, and press it onto a piece of paper. This was an advance, but books were still expensive; for small print runs, carving the blocks could actually be more labor-intensive than hand-copying the text. It is hard to inflation-adjust the price of something over seven or eight centuries, but judging roughly by the amount of labor involved, a single book would have been comparable in cost to a nice luxury car—so well-off families might own a few, but books were by no means commonplace.

Then something epic happened around 1450. Johannes Gutenberg, a goldsmith in his fifties who had spent years experimenting in Strasbourg, a German-speaking town that is now part of France, before returning to his native Mainz, decided that he could simplify the creation of the blocks for printing text. Instead of each block being separately hand-carved, he realized that the individual letter blocks, or “type,” could be made in metal once and assembled into a block for a given page, then rearranged for the next page. Instead of requiring multiple weeks of a skilled artisan’s time to make the block for one page, the job could now be done by a typesetter moving type around in a matter of hours—reducing labor costs by a factor of ten to one hundred. Also, because the type was reused, more effort could be put into making the letters precise and uniform (thus the emergence of fonts). And because the type pieces were metal instead of wood, they were more durable and the printing presses could be run faster. Now great works of writing would be accessible to many, many more people (although the first and only major work that Gutenberg printed at scale—the Gutenberg Bible—was still quite expensive for its time). What’s more, it now became practical to print and distribute writings that weren’t holy books or great works of classical literature—it is no coincidence that the first modern newspaper emerged in Strasbourg roughly 150 years after the printing press.

In the spirit of not being Eurocentric, credit for the first movable type must go to the Chinese, who invented it a few hundred years before Gutenberg. Gutenberg, however, was the first to create his type pieces out of materials similar to those used today. It also appears that movable type sparked more of a revolution in fifteenth-century Europe than it had in eleventh-century China or thirteenth-century Korea.

By the eighteenth century, movable type and the printing press had been perfected to the point that books were reasonably affordable. By the nineteenth century, what we now call textbooks had come to be viewed as a mainstay of mainstream education.

Pedagogically as well as politically, the broad distribution of textbooks raised new questions and difficulties—questions and difficulties that remain at the forefront of educational arguments today.

Before books were widely distributed, teaching was incredibly nonuniform. Teachers taught what they knew, in the manner that seemed best to them. Each teacher was therefore different, and when a teacher acquired a reputation for wisdom or originality or thrilling oratory—not necessarily for accurate information—pupils flocked to him. Like a beloved village rabbi or priest, he was deemed to have something that could be gotten nowhere else. His students, in turn, came away with an education—and sometimes with misinformation—unique to that particular classroom.

The mass production of books changed all that—and this is an aspect of education history to which too little attention has been paid. No longer was the teacher the sole source of information and the ultimate authority on a subject. Now there was an expert behind the expert, sharing in the teacher’s prestige as the source of knowledge. The teacher ruled in the classroom but the textbook had standing in the world beyond. What if the teacher and the text disagreed? The legitimizing power of print seemed to give the last word to the book. On the other hand, textbooks empowered teachers to expose their students to the latest thinking from the broader world. They gave students the ability to study at their own pace and come to class ready to be engaged at a deeper level by a master teacher.

What is clear, however, is that it was the wide availability of books that ushered in the age of educational standardization. Suddenly students in distant places were reading the same poems and proverbs, learning the same historic dates and names of kings and generals, working out the same problems in arithmetic.

And standardization itself was not a bad thing. In a world growing more complex and increasingly interconnected, standardization was a means to inclusion; it promised a leveling of the playing field and at least the potential for a true meritocracy. It also mitigated the impact of bad instruction that would otherwise have gone unchecked. Now students were less likely to be misled by a one-off viewpoint or an inaccurate explanation.

The challenge, however—the same challenge in the early days of textbooks as now in the wider world of Internet-based learning—was this: How can we most effectively deploy standardized learning tools without undermining the unique gifts of teachers?