Armed with all these new insights, I could now get a theoretical handle on some of the worrying cognitive trends I was observing in my classes. Why were the two articles or book chapters I usually assigned in upper-level classes—which, until a decade ago, had seemed more or less manageable—beginning to look like an overwhelming burden? Why were so many students losing interest in the social and political world that I, and the authors I assigned, found so compelling? Why were the majority of students having difficulty accumulating the background knowledge and developing the conceptual frameworks needed to grasp new issues and ideas and store them in long-term memory? Why were they finding it even harder to “transfer” concepts and knowledge acquired previously to new courses and contexts? Why was the development of cumulative knowledge and holistic understanding of larger issues—which is the raison d’être of liberal arts education—beginning to look less realistic? Why were so many students failing to develop an intuitive sense of English grammar, syntax, word formation, and pronunciation after four years of reading and writing in English, and—in many cases—time spent working or studying in the United States? I thought I had an elegant theory explaining it all.
A new study published in early 2011 under the ominous title, Academically Adrift, seemed to corroborate all these observations.1 It decried in strong terms the “limited learning” allegedly taking place in American higher education. Its authors, sociologists and education experts Richard Arum and Josipa Roksa, based that conclusion on an analysis of results from the Collegiate Learning Assessment test taken by 2,300 students at a cross-section of American colleges and universities.
The centerpiece of that test is a “performance task component” in which students have 90 minutes to draft a solution to a practical task (for example, what would be the best way to reduce pollution in a particular area?) on the basis of several documents providing relevant information. The idea behind this task is to assess some higher-order cognitive abilities, like critical thinking, analytical and integrative reasoning, and written argumentation. As Arum and Roksa crunched the test results, they reached some grim conclusions. They found that 45 percent of the students they tested did not demonstrate significant improvement in learning during the first two years of college; and 36 percent showed no such progress over four years of college education. Moreover, the improvement the majority of students did show tended to be very modest.
As the main explanation for these disturbing results, Arum and Roksa singled out the insufficient rigor of most college courses and the limited amount of time students spent studying. They cited survey results indicating that each semester 32 percent of students did not take a single course requiring more than 40 pages of reading per week, and roughly half did not take a course in which they needed to complete more than 20 pages of writing over the whole semester. As they took such undemanding courses, students spent on average only 12 to 14 hours a week studying. Much of that time they studied in groups, which Arum and Roksa did not find very productive. Meanwhile, students invested much time in extracurricular activities which did not seem to produce measurable educational returns. The authors found statistically significant correlations between improvements in learning, on the one hand, and the rigor of the courses students took and the time they spent studying alone, on the other.2
When Academically Adrift came out, it (and the swirl of articles and press releases generated by the accompanying publicity campaign) created a firestorm. More upbeat commentators dismissed the study as “statistically adrift” or questioned its reliance on a supplemental test which few students had much incentive to take seriously.3 But there was also a broad sense that even if the numbers weren’t exactly right, the study had indeed pointed to declining academic standards. Meanwhile, other observers had already offered an explanation similar to the one Arum and Roksa put forward. For example, the 2005 PBS documentary Declining by Degrees included interviews with apathetic students and frustrated professors and college administrators.
Leave Them Kids Alone!
As a typical example of reduced course requirements, Declining by Degrees introduced an economics professor at a large state university who had made the textbook for an introductory course optional. Instead, he required students to read breezy articles from The Economist. He acknowledged that he would be given or denied tenure mostly on the basis of his published research, not his teaching.4 The documentary, however, also profiled students who lacked any curiosity or desire for learning and were not particularly embarrassed to admit this on camera. College administrators interviewed for the documentary alleged that students and faculty at research universities had entered into a mutual non-aggression pact, agreeing not to be overly demanding of each other.
The sense that the majority of both professors and students would rather be “elsewhere” is highlighted in a sermonizing article by Mark Edmundson.5 His poignant message is addressed to the class of students entering college in 2011, or maybe to the very few among them who can realistically be expected to read anything like the text he wrote. Such observations regarding the priorities of most faculty members, though, may be less relevant to many lower-tier universities and colleges which place a stronger emphasis on teaching. They definitely do not apply to my place of work, AUBG, where expectations for research are modest and teaching is clearly established as the main institutional priority.
Such accounts seem to corroborate, and to offer a plausible explanation for, the findings of many surveys showing that American students do indeed have shaky knowledge about almost any social or political topic. One of the largest collections of such data appears in English professor Mark Bauerlein’s book, The Dumbest Generation.6 Here are just some of the most striking findings he quotes. In a 2004 study, only 28 percent of 18-to-26-year-olds could identify William H. Rehnquist as the chief justice of the United States; 26 percent “could name Condoleezza Rice as Secretary of State, and … 15 percent knew that Vladimir Putin was the President of Russia.”7 Meanwhile, students who lacked this kind of knowledge, or even had trouble listing the three branches of government, recited with ease (and, presumably, greater enthusiasm) the names of the Three Stooges and of the latest American Idol.
Calling the “Millennials” in toto “the dumbest generation” may be a bit harsh and insensitive. This offensive label is calculated to evoke an unflattering comparison to what news anchor Tom Brokaw once dubbed “the greatest generation.” Still, the massive amount of data Bauerlein has collected does seem to demonstrate that even the Millennials who have gone on to pursue a college degree have learned much less than expected about anything unrelated to the popular youth culture of the day (or their own narrower sub-culture).
Bauerlein has attributed the dearth of political knowledge among college students and recent graduates largely to the influence of information technology. In his view, electronic communication allows the majority of the Millennials, the proverbial “digital natives,” to encase themselves in a generational cocoon. Within it, they remain cut off from adult frames of reference and concerns. Engrossed in incessant peer-to-peer contact and a world of immediate realities, they can hardly be bothered to grasp and retain knowledge about larger social issues. These all seem to them distant, irrelevant, and instantly boring.
Bauerlein’s theory does seem intuitively plausible. Yet the trend he describes seems to predate the spread of the internet and constant access to “social media.” For example, in a study conducted over two decades ago, college students gave 10 million as a mean estimate of the number of Muslims in the world, and close to 70 percent of them thought there were more Jews than Muslims in the world. Incidentally, 40 percent of participants in this study were enrolled at Berkeley.8 According to another study, published by a conservative think tank a decade ago, students at some prestigious universities seemed to have lost part of the knowledge they had acquired in civics and history classes in high school.9
Bauerlein’s focus on the content of the peer-to-peer communication facilitated by the internet may, in fact, be slightly misleading. Marshall McLuhan taught us as much over five decades ago. Proclaiming that “the medium is the message,” he argued that it is the nature of the channel of communication itself that largely shapes the reception of any message.10 And this happens at a broader societal level, not just at the level of individual perception and understanding. McLuhan defined the media broadly to include “any technology whatever that creates extensions of the human body and senses, from clothing to the computer.”11 Long before the rise of the internet, he concluded that “electromagnetic technology” was giving rise to a completely mediated social environment. It was thus causing an information overload which numbed the senses, inducing exhaustion of the central nervous system and general bewilderment.
McLuhan was particularly fascinated with the explosive spread of television and the effects of the new medium on children and teenagers. In fact, he attributed the generation gap which suddenly opened up in the 1960s largely to the effects of the new but already pervasive medium. To illustrate this often overlooked point, he commissioned an iconic mural called “Pied Pipers All.” The painting depicted stylized human figures writhing in a psychedelic dance against the background of a bright TV screen. McLuhan wanted that image to represent vividly the way in which television was mesmerizing the young and leading them away from adults—as the proverbial “pied piper” had once done with the children of a medieval German town. He attributed this cultural upheaval and the related “generation gap” to the influence of television on the human brain.12
McLuhan is often dismissed as a flamboyant, media-savvy guru who had perfected the art of the sound bite—but whose cryptic pronouncements held little analytical value. As we shall see later, some of his conclusions regarding the impact of television on society were, if anything, overly optimistic. But his overall idea that massive daily exposure to a new screen-based communication technology must have a profound impact on the nervous system, perception, and social functioning seems truly prophetic. In the 1970s, writer and birdwatcher Marie Winn raised even stronger concerns, calling television “the plug-in drug.”13 These early warnings later received support from some findings in neuroscience. On the basis of these, psychologist Mihaly Csikszentmihalyi and media researcher Robert Kubey eventually concluded that television was, in fact, addictive—in the most literal sense of the word.14 TV addiction has not yet been recognized as a clinical diagnosis, but the basic idea that it causes unhealthy changes in brain wiring and activation is probably sound. Needless to say, these changes have been tremendously intensified by the spread of the internet and video games, the rise of “social media,” and the proliferation of hand-held “platforms” allowing constant access to these (and loaded with multiple apps and games).
The extent of this digital immersion, particularly among adolescents whose brains are more easily affected by all kinds of influences, now defies the imagination. A study completed in 2009, before smartphones had become ubiquitous, found that American children and teenagers aged 8 to 18 were spending on average 7.5 hours a day interacting with one or more electronic screens. That number did not take into account multitasking, an hour and a half spent texting, and a half hour of talking over a cell phone. The title of The New York Times article which reported those findings in 2010 said it all: “If Your Kids Are Awake, They’re Probably Online.”15 Two similar studies had previously been done in 1999 and 2004. The second one had found a significant increase, and its authors “had concluded … that use could not possibly grow further.” They were in for a surprise. According to the 2009 data, “young people’s media consumption grew far more in the last five years than from 1999 to 2004, as sophisticated mobile technology like iPods and smartphones brought media access into teenagers’ pockets and beds.”16
Digital Divide 2.0
According to a study published in 2010, children whose parents lacked a college degree (which is taken as shorthand for lower socioeconomic status) were spending on average 90 minutes more per day exposed to screen-based media than children with at least one college-educated parent. Since 1999, that exposure had increased by four hours and 40 minutes for the first group and by three hours and 30 minutes for the second (the study double-counted hours spent multitasking). On the basis of those numbers, it seemed that efforts to bridge the proverbial “digital divide” had the paradoxical effect of creating a new gap with even more troubling implications. The New York Times article which reported these findings focused on the time children and teenagers were wasting online, but the direct impact of all that time spent staring at screens of various sizes may be a lot more profound.17
We can safely assume that over the past few years this trend has continued unabated, and may even have picked up speed with the explosive proliferation of smartphones and tablets. Another study, conducted back in 2010, found that 94 percent of American high school and college students had Facebook profiles. They spent on average 11.4 hours per week logged into the website, and 78 percent of them accessed it using their mobile phones.18 It has become increasingly common for schoolchildren and teenagers to spend even recess glued to hand-held screens, watching video clips or interacting through Facebook. Once at home, they commonly multitask while doing their homework: chatting with friends, watching videos, checking out newly posted pictures, scrolling through Facebook, listening to music, and so on.
With all these data in mind, it can only be expected that the problems troubling the allegedly “dumbest generation” of “academically adrift” college students start earlier. According to a 2006 study, just over a quarter of American teenagers graduating from high school were “proficient” in civics. One third, on the other hand, did not have even “basic” knowledge in that area.19 Results in U.S. history were even worse. Only 13 percent of students were proficient, and over half did not have even basic knowledge.20 Though the reports emphasized some tiny improvements as compared to the previous studies done in 1998 and 1994 respectively (and the way “proficiency” is “operationalized” can be questioned), they still drew a fairly bleak picture.
It might be tempting to see such results as a uniquely American problem. American students, however, are not outliers: “on average, American youth perform fairly well in international comparisons of civic and political knowledge.”21 A study of the political knowledge and attitudes of young Europeans published in 2002 revealed a similarly disconcerting picture. Most of them “believed that citizens should vote and obey the law.” They also expressed support for “social movement activities.” But they showed little enthusiasm for “conventional political activities” like the “discussion of political issues.”22 Curiously, a recent study has found a striking degree of ignorance among Belgian teacher trainees. For example, “when asked which political ideology stood for the redistribution of wealth, higher taxes and state involvement, only one in two answered socialism.” Also, “among final year teaching students involved in the study, one in three could not identify the United States on a map and almost half did not know where the Pacific Ocean was.”23 The problem of disengagement or intellectual and existential distancing from larger issues—despite recent protests and other forms of youth mobilization in many countries—thus seems to be part of a much broader syndrome.
The failure to develop a strong interest and to accumulate sufficient knowledge in such issue areas has gone hand in hand with another significant trend—lagging reading proficiency. A 2003 study already found an “inexplicable” decline in the reading proficiency of college graduates over the preceding decade. Only 31 percent could “read a complex book and extrapolate from it,” as compared to 40 percent in 1992. Among graduate students, only 41 percent were rated as “proficient” in reading—a decline of 10 percentage points.24 The same study also revealed another unexpected development—more college students demonstrated intermediate reading abilities. That result led the authors “to question whether most college instruction is offered at the intermediate level because students face reading challenges.”25 That suspicion was shared by Dolores Perin, a reading expert at Columbia University Teachers College. After sitting in on many high school classes, she had become convinced that there was a seldom discussed but “tremendous literacy problem among high school graduates.” As a result, “the colleges are left holding the bag, trying to teach students who have challenges.”26
In recent years, 40 percent of incoming first-year students at American four-year colleges have needed to take at least one remedial class in reading, writing, or math.27 It should hardly be surprising, then, that, as philosophy professor Carlin Romano has noted, “for too many of today’s undergraduates, reading a whole book, from A to Z, feels like a marathon unfairly imposed on a jogger”—an attitude which has predictably brought about “the disappearance of ‘whole’ books as assigned reading” in college courses.28 The proverbial “crisis in education” can thus be seen, at a more basic level, as reflecting primarily a crisis in reading.
Many studies suggest that over several decades reading for pleasure has declined considerably in the United States. As writer Caleb Crain has summarized the data, “we are reading less as we age, and we are reading less than people who were our age ten or twenty years ago.”29 He has also pointed to “indications that Americans are losing not just the will to read but even the ability.” According to one study, between 1992 and 2003 the proportion of adults who qualified as proficient readers (who could, for example, compare the viewpoints expressed in two editorials) declined from 15 to 13 percent.30 These are again not exclusively American problems. A study found that in the Netherlands reading for pleasure had similarly declined since the 1950s as TV viewing had increased. In the mid-1990s, college graduates born after 1969 were reading less than people without a college degree born before 1950.31
There must be many contributing factors to such a steady decline in reading, reading ability, and social knowledge. An increasing immersion in a virtual environment, however, is probably among the most important. The original “virtual reality” was created decades ago as television became truly pervasive and a ubiquitous babysitter. More recently, constant access to the web, video games, and numerous apps has provided the last straw (for now). Once serious reading turns into a boring obligation, the pull of virtual distractions is likely to become even stronger, creating a vicious circle that is all but impossible to break. This compulsion is likely to persist after teenagers enter college and beyond, contributing to pervasive learning problems.