MY HUSBAND, KEN, REMEMBERS STUDYING TRIGONOMETRY IN middle school around the time that the first solid-state, pocket-sized electronic calculators came on the market. They were pricey, about $250. All the smart kids were thrilled by the amazing things you could do with them.
Calculators were banned from his math class.
The students were required to pretend these exciting new devices didn’t exist—at least, not in the classroom. Their well-meaning teachers were sure that if they allowed the students to use calculators, the students would become dependent upon them and their math skills would suffer ever after. Instead, Ken was taught to do trigonometry using a slide rule.
In retrospect, that seems ridiculous. Why would calculators hurt your ability to do trigonometry, but slide rules would not? The answer can be summed up as “technophobia,” a fear of the new and a fear of change as embodied in new technology, especially technology that the young seem to master with ease but that makes their elders feel clumsy, out-of-date, and yearning for the good ole days. Ken’s teachers weren’t focused on their students’ ability to do mental calculations (a good, foundational math habit) but were worried about them relying on a new device instead of the old one. It is common, in the history of technology, for adults to be sure that the latest device—the one that wasn’t around when they were growing up—will somehow do irreparable damage to the younger generation. For teachers, a common technophobic response is to ban from the classroom devices that are becoming ubiquitous everywhere else.
We are in this situation once again. It’s not electronic calculators that are being banned; many schools find those quite acceptable. And of course it would be difficult to find a slide rule in any classroom these days. But many educators and pundits have spent the last two decades banning (or calling for bans of) laptops, tablets, and mobile phones from classrooms. They draw from and are abetted by researchers whose studies—some more convincing and thoughtful in experimental design than others—“prove” that college students who have an electronic tablet in the lecture hall don’t pay as much attention to the lecture. (Is that really surprising?) Or they show that taking notes on a laptop makes for lower grades on final exams.
The results of these studies have been popularized in best-selling books that portend doom for the device-obsessed younger generation. These books warn that “Google makes us stupid” and predict that social media will make our children “alone together” (“if you don’t learn how to be alone, you’ll always be lonely”). Whether in the lab or in advice books, technophobia always starts from a baseline of nostalgia. What is new is measured against some glimmering memory of a golden past before the Internet, when everyone was smart and self-sufficient and no one felt lonesome.
I’m skeptical of these technophobic arguments. My personal history plays a part in my skepticism because I spent a decade of my life researching the impact of and reaction to the last information age in human history—the industrial age that ushered in mass printing and that, in the United States, coincided with the drafting and passage of the US Constitution. With the advent of steam-powered presses and machine-made paper and ink, books became cheap to print and for the first time in history middle- and working-class people could own (or borrow from the new lending libraries) popular books. Before that, you might own a family Bible, a Psalter, and a primer, but suddenly reading was a “thing,” and the young couldn’t get enough books, especially popular novels featuring dashing heroes or victimized heroines whose everyday work and lives resembled their own. Thomas Jefferson and John Adams were only the most prominent people who insisted that wasting your time on such sensational tales made you stupid and, yes, lonely and isolated from the “real world.” Pundits in 1790 wrote articles like “Novel Reading, a Cause of Female Depravity.” Novels were thought to “mesmerize,” “capture,” and “tyrannize” a reader’s attention and volition. Many of the technophobic arguments one hears against modern digital technologies and their cultural offshoots—tablets or video games, Google or Wikipedia—sound to my ears like the Founding Fathers railing against the rapt young novel readers of their era.
In our day, some professors have not only banned devices in lecture classes but have written about how and why everyone else should, too, with all the earnestness and good intentions of Ken’s middle school trig teacher. For the sake of argument, let’s say taking notes longhand is “better” for students than taking notes on a laptop. Even if that is true, what good is forbidding students from using laptops in class if everywhere else in their lives—in their days outside of school and in their future work—they are using some computing device to take notes? Doesn’t it thus make good, common sense to come up with ways of teaching the best, most efficient, and smartest ways of using their devices, rather than issuing a blanket rule to “ban” them from class?
Our students carry smartphones more powerful than the IBM 360 mainframe computers that NASA used to put men on the moon. It’s no surprise that a standard-issue lecturer loses the competition for students’ attention. Sometimes, of course, we should tell students to put their phones away, but we also need to be rethinking the lecture, which, if I’m remembering my undergraduate days correctly, often couldn’t compete successfully with the student newspaper for my attention. The real lesson for the new education is that we need more active, creative ways of teaching that put some of that computing power to good pedagogical use. It’s odd and even irresponsible that formal education is the one place where we’re not using the devices on which we do our learning all the rest of the time.
If you want students to succeed not only on a final exam but in everything else in life and work, it makes little sense to ban the devices essential to life and work outside of school. It seems sensible, instead, to teach the skillful, critical use of these tools. If the purpose of formal education is to prepare students for what comes after graduation, we should not be forbidding learning of the kind that they will experience beyond school.
It seems so logical, and yet to make this switch requires a reexamination of our deepest assumptions about the role and function of formal education. As we have seen, especially in the modern college and research university, every structure and infrastructure of academe puts the institution at the center—not the students, not the professors. All of those inputs and outputs, selection processes and graduation requirements, entrance exams and certifications that were designed between about 1860 and 1925 are now veritable fortifications against professors dedicated to progressive, student-centered, active learning. This is less the case in community colleges, but everywhere educators are skittish about innovation that makes them feel insecure about their skills. We all (including parents) came up through and were trained by a system of formal higher education in which students were rewarded for mastering and emulating the expertise of their professor. What happens when a young person is more adept at a shiny new learning device than are her parents—and her teachers?
Well-meaning educators have created a lot of anxiety, in students and even more in their parents, around the Internet. After all, they learned the good old way, before computing devices—and they are in control, the authorities at the front of the classroom calling the shots, knowing what’s best.
My students were typically born after 1993, the year the Internet exploded, and they find this stance as silly as Ken and his fellow trig students found the edicts against calculators. Ken and his classmates would have been delighted to learn that, when the first slide rules were invented in England in the seventeenth century, most educators responded with alarm. The math profs who invented and used them typically did so covertly. They were afraid powerful dons or magistrates or religious leaders would find the slide rules sinister, even profane. Many Christians of the time believed the use of mechanical devices that allowed humans to exceed their God-given abilities was heretical. As Galileo and others could testify, life didn’t go well for scientists who were thought to be of Satan’s party or who dared defy the authority of the Church. Even Sir Isaac Newton feared public reaction. He and his students used their slide rules in secret, behind closed doors.
The impulse toward technophobia dies hard. And we’re all complicit, not just educators. Every older generation defends cherished practices. This is a problem for those of us invested in educational reform because, for thousands of years, the role of the educator has been to institutionalize and authorize traditional ways of knowing as a bolster against the incursions of the faddish, the distracting, the shiny, and the new. For all our talk of innovation, on some level, most parents want educators to maintain standards, which often means being the gatekeepers, testers, and sorters of the worthwhile from the dreck, protecting tradition and authority, saving us from our flightier impulses. The citizens of Athens forced Socrates to drink hemlock as punishment for poisoning the minds of youth. That lesson haunts most profs.
Yet innovation happens, and adventurous students and professors always seem to come up with a way to use it. Ken’s middle school friends did as Newton had done: they practiced math the accepted way in school and, in private, away from school, they played all kinds of trig tricks on their calculators.
Here’s the paradox: Ken’s teachers were positive they were helping to arm students for their future by not letting them use the device of the future.
Here’s a second paradox: Ken’s slide rule–versus–calculator tale isn’t his only allegory of education and technological change. In high school, he was given a choice in his driver’s ed class between learning to drive a car with a manual transmission and learning with an automatic transmission. Following logic similar to what he followed in trig, he went with an automatic. To this day, he grumbles about that decision, especially when we find ourselves in a foreign country having to pay double to rent an “American-style” car to accommodate his need for an automatic shift. Slide rules no longer exist. Stick-shift cars still prevail just about everywhere but in the United States. That’s the trouble with technological change. It’s never easy to tell what will or won’t last. This is a reminder of why we invest formal education with the authority to be a bulwark against technological fads. How do you decide what is the bathwater and what is the baby?
If the goal of higher education is to prepare young people for the world beyond, it’s clear that technology must have a role. At the same time, we must be more thoughtful about what that role should be. Transforming higher education for the twenty-first century is not about throwing a lot of tech into classrooms. What we need instead is to rethink higher education so that our students are digitally literate—so that they understand, gain insight into, and maybe even exert more control over the technologies that have changed and sometimes dominated our lives and will do so even more in the future.
TODAY’S MATH CLASS FOCUSES ON RELATIVELY PRIME (OR “COPRIME”) numbers. Professor Derek Bruff of Vanderbilt University explains that these are numbers that share no common prime factors: 34 and 45 are relatively prime because 34 = 2 × 17 and 45 = 3 × 3 × 5, so they have no prime factor in common.
Follow? If you do, you fall into the category Bruff calls the “mathematically gifted 10 percent” who take his class on cryptography. He teaches for those students and also for the rest of us.
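For those who like to see the idea in code, here is a minimal sketch in Python (my own illustration, not anything from Bruff’s class): two numbers are relatively prime exactly when their greatest common divisor is 1, and Euclid’s algorithm, shipped in Python’s standard library as `math.gcd`, computes that divisor quickly.

```python
from math import gcd

def coprime(a: int, b: int) -> bool:
    """Two integers are relatively prime (coprime) iff their
    greatest common divisor is 1 -- no shared prime factors."""
    return gcd(a, b) == 1

print(coprime(34, 45))  # True: 34 = 2 x 17, 45 = 3 x 3 x 5
print(coprime(34, 51))  # False: both share the factor 17
```

The check is a one-liner precisely because the gcd captures “no common prime factors” without ever factoring either number.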
Trim, clean-shaven, and affable, with unfussy steel-rim glasses and an enthusiastic, engaged speaking manner, Derek Bruff is a born teacher. In addition to being a professor of mathematics, he directs the Center for Teaching at Vanderbilt University. Improving the student learning experience is what gets him out of bed in the morning. He’s on a mission. He loved college, and especially math, but knows that “nine out of ten students have math horror stories.”
Math is a subject in which each thing you learn builds on what you already know. “One bad math teacher,” he says, can ruin someone’s interest in math for life, so his aim is to do the opposite. He wants to ensure that students leave his class with the same good experience he had as a student. His teaching methods derive from his mathematical training as well as his reading in the work of the renowned educator Benjamin S. Bloom, who reversed the old idea that students had innate abilities or inabilities and replaced it with the idea of “mastery learning.” In mastery learning, you take each student from where she is and build her up to mastery of the next most complex concept, then build on that. Failure is not a permanent state but a sign that the instructional method has failed to grasp what the student knows or doesn’t know and so cannot help her find the path to greater and greater mastery. A student doesn’t only learn math. She learns how to learn, how to take what she knows and extrapolate answers to other questions. It’s a skill that serves her in every subsequent challenge she faces.
In a traditional math lecture on relatively prime numbers, the prof would stand at the white board scrawling equations. It’s Bruff’s estimate that 10 percent of his students understand such equations already and they’re bored to death. Ten percent struggle to learn. Eighty percent are so lost they give up the struggle and tune out. Since Math 101 is typically a flunk-out course for students hoping for a STEM major and a career in science, there’s often a feeling of despair in the class. The ones who want to succeed badly enough, perhaps so they can go on to medical school, hire a tutor to get them through the final exam. The rest stare into their smartphones, wishing they were anywhere but there.
That’s not what happens in Professor Bruff’s math class. He often teaches with clickers, a relatively simple technology that collects student responses to his questions and projects their answers for everyone to see and analyze together. Most lecturers scoff at clickers, dismissing them as the stuff of game shows, not higher learning. Yet they are gaining currency and respectability, thanks in part to Harvard physics professor Eric Mazur’s advocacy of them. Clickers are often used in “flipped” classrooms, where students read the material beforehand and then, instead of sitting passively in a lecture, respond to questions or problem sets posed by the professor. The professor can then see how many came up with the right or wrong answers, respond, give feedback, and then pose a similar problem so students can try again. This turns a large lecture class into a two-way dialogue, with real-time feedback and students working in pairs or teams to find solutions to increasingly complex problems.
Professor Bruff sets them a task: “Generate pairs of three-digit relatively prime numbers.” He has them submit their answers on the clicker polling system and he submits an answer too—a deliberately wrong one. All of the answers are then projected on a screen overhead.
Everyone examines the rows of answers, looking for their own. Professor Bruff next informs the students that at least one pair of numbers on the screen is incorrect and has them work in small groups to identify it. The room is suddenly abuzz with activity as each student huddles together with a few classmates. As it turns out, today there’s a second wrong answer, submitted by one of the students. Even better! Everyone is talking, calculating, testing, totally absorbed in the exercise. Cell phones sit ignored on desks.
Keep in mind that Professor Bruff has never lectured this class on how to determine relatively prime numbers. The students are exchanging information, teaching one another, giving one another hints and advice. Within a few minutes, he asks for a volunteer and calls on a student to explain how she confirmed 493 and 611 shared no prime factors.
“How did your group figure it out?” Bruff asks her.
She says that 493 = 17 × 29 and 611 = 13 × 47, so the two numbers are relatively prime. She explains that her group divided each number by larger and larger primes. The method they used gives him an opportunity to extrapolate to a larger principle that he wants the students to understand: “Multiplying numbers is easier than factoring numbers.”
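The group’s approach, dividing by successively larger numbers, is classic trial division. A rough Python sketch of it (my reconstruction for illustration, not code from the class) recovers exactly the factorizations the student reported:

```python
def factor(n: int) -> list[int]:
    """Factor n by trial division with successively larger divisors."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is itself prime
    return factors

print(factor(493))  # [17, 29]
print(factor(611))  # [13, 47]
# The two lists share no entries, so 493 and 611 are relatively prime.
```

The `d * d <= n` bound is what makes Bruff’s larger principle visible: verifying a product takes one multiplication, while undoing it means testing divisors all the way up to the square root.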
Bruff is pleased with how this class session is going. He next asks a few more students to explain their methods for arriving at correct answers. Educators call this “metacognition” or “reflection”: pausing to think about how you learned, understanding different ways of solving the same problem, and extending your method to other outcomes and applications beyond the immediate problem you are solving. Some theorists believe the most important learning of all happens through reflection because this is where the student begins to grasp her own principles and best practices for future learning. These are key ingredients to learning beyond the test, to learning that can be applied to the rest of your life.
It turns out that, in their group work on relatively prime numbers, Bruff’s students hit upon a method that, structurally, is important to a much larger mathematical problem with vast practical implications. “It’s the main idea behind public key cryptography,” he says. They discovered how to use prime numbers to generate codes, one of the most important security systems of the open Internet, used for digital signatures and for legal and other sensitive documents, among other things.
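The connection can be made concrete with a toy sketch of RSA-style key generation, again my own illustration rather than material from Bruff’s course, reusing the primes 17 and 29 from the classroom example. Real keys use primes hundreds of digits long; the security rests on exactly the principle the students found, that multiplying two primes is easy but factoring their product is hard.

```python
# Toy RSA key generation -- illustrative only; the numbers are
# far too small for real security.
from math import gcd

p, q = 17, 29            # two secret primes
n = p * q                # public modulus: easy to compute from p and q...
phi = (p - 1) * (q - 1)  # ...but recovering p and q from n is hard at scale
e = 3                    # public exponent, chosen coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)      # private exponent: modular inverse of e (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
plaintext = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert plaintext == message
```

Anyone can encrypt with the public pair (e, n), but decrypting requires d, which in turn requires knowing the prime factors of n.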
Bruff will build on this foundation of knowledge, trusting collaboration, and growing confidence all semester. He is quick to point out that the clickers are only a tool. They aren’t magical devices that do learning for you—or evil ones that prevent you from learning. The key is to use what advantages technology offers to make the most effective learning possible. Peer instruction is his principal pedagogical method. Everyone is engaged, everyone participates, and the devices aid the interactive learning process. If they didn’t have clickers, they could come up with a workaround using paper and pencil. The point is a method of interaction that requires students to actively engage with the material and try to figure it out for themselves. The opposite method—using a fancy digital tool, but in a way that does not require active student participation—is not effective. Without a well-designed new pedagogical method, technology alone does not help students learn. It is no different from lecturing and scrawling equations across the blackboard. Slide rules or calculators? The lesson here isn’t what tool you use but how you use it.
Derek Bruff writes about his teaching methods on his blog Agile Learning. The term comes from principles of agile software development. These principles date back to the late 1950s, when developers writing computer code elaborated a method by which they could “iterate” together. They would self-organize into teams composed of people with different skill sets who could add to one another’s code, allowing for continuous feedback and improvements. The ideal was not to labor alone in attempts to write perfect code. That never happens anyway, because no one person can see all of his own mistakes. Coding requires other “eyeballs,” in programmers’ parlance. The ideal was to publish code quickly and then to rely on one another to see any bugs and correct them.
Bruff uses agile software development as the basis for how he teaches math. “Learning is social,” Bruff notes. “You can spot one another’s mistakes and help one another to learn much more easily together than you can on your own. It takes away some of the anxieties around failure and math.”
Bruff applies agile learning methods even in the more daunting courses he teaches, like linear algebra. “No one’s there because they want to learn about matrix algebra, vector spaces, and eigenvalues,” he laughs. “They’re only there because it’s required. My objective is for them to understand why linear algebra is useful, why it is relevant, why they are being required to learn what they are learning.”
For this course, he doesn’t start with clickers but with photocopies of the Monopoly board. He points out that photocopies are a technology, too.
“Monopoly is a terrible board game, really. There’s no strategy to speak of. Basically, the only decision you have to make in Monopoly is whether or not to buy a property. Everything else is just rolling dice. That means you need to know if a property is going to make you money. You have to know how frequently a given property will be landed on by other players—and how much they will pay you in income if they land on your real estate. You have to model how often people land on property relative to how much you make from them.” As he explains, Jail has a huge impact on probability because people spend a lot of time in jail. The squares right after Jail get hit most frequently.
As Bruff tells me, “You can model all of this with Markov chains but, rather than begin with a lecture on what those are, I have the students work in pairs and strategize. I tell them to ignore Go To Jail and Community Chest, to simplify the model, and come up with odds for how often you land on any square.” He notes that they tend to easily figure out that, when there are no contingencies, all the spaces are landed on equally over time. Even the most anxious and math-averse students get it right. “That’s a confidence builder,” he smiles. “Then I have them pair up, throw in Jail, Chance, and Community Chest as factors and add a few other rules. We talk about how that changes our model. I give them problems and prompts along the way and pretty soon they are figuring out probabilities for every square, for every roll of the dice.”
One hundred students, annoyed they have to take a required math course, are now excited to learn something that has obvious practical implications, not only for winning their next Monopoly game but also in other real-life situations they know they will encounter. They’ve seen the tip of the iceberg and can’t wait to see the rest. Statistics is a puzzle, a problem, a mystery, and, instead of being baffled, they now are seeing statistics as something they can use.
“After they begin modeling the ways different rules change various probabilities, I then introduce the terminology of a matrix,” Bruff says. “They’ve just seen one. They’ve helped build one themselves. They didn’t even know that was happening. They are making models that help them understand interesting problems. They’re even making predictions. Next I have them Google ‘Markov chains’ and find out all the more complex models that are based on this one very simple formulation. You can model populations with what they are learning from Monopoly. In fact, you can understand Google itself since the page rank algorithm uses the same method.”
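The simplified model Bruff describes can be sketched in a few lines. The following is a hypothetical reconstruction, not his actual classroom code: build the transition matrix for a 40-square circular board with two dice, then add a single Go To Jail rule and watch the long-run probabilities shift.

```python
import numpy as np

N = 40  # squares on a Monopoly-style circular board
JAIL, GO_TO_JAIL = 10, 30

def transition_matrix(with_jail: bool) -> np.ndarray:
    """P[i, j] = probability of moving from square i to square j."""
    # Probability of each two-dice total, 2 through 12.
    roll = {s: sum(1 for a in range(1, 7) for b in range(1, 7)
                   if a + b == s) / 36 for s in range(2, 13)}
    P = np.zeros((N, N))
    for i in range(N):
        for s, p in roll.items():
            j = (i + s) % N
            if with_jail and j == GO_TO_JAIL:
                j = JAIL  # landing on Go To Jail sends you to Jail
            P[i, j] += p
    return P

def steady_state(P: np.ndarray, steps: int = 1000) -> np.ndarray:
    """Long-run occupancy: start at Go and apply P many times."""
    v = np.zeros(N)
    v[0] = 1.0
    for _ in range(steps):
        v = v @ P
    return v

# With no contingencies, every square is landed on equally in the long run.
print(steady_state(transition_matrix(with_jail=False)).round(3))
# Add the jail rule and the distribution skews: Jail is visited most,
# and Go To Jail is never occupied in this simplified model.
v = steady_state(transition_matrix(with_jail=True))
print(v[JAIL], v[GO_TO_JAIL])
```

Without the jail rule the matrix is shift-invariant, so the students’ intuition holds and every square converges to probability 1/40; one extra rule breaks the symmetry, which is the moment the matrix starts doing real modeling work.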
Vanderbilt is far removed from the world of community college, and yet Derek Bruff’s goal is consonant with the inclusiveness of community college learning. He wants everyone to learn and comprehend regardless of their prior training or ability, and his ambition is to show students how classroom learning can be applied significantly in the world beyond. He doesn’t care whether he breaks the curve with his classes; he’s not worried about “grade inflation.” His goal is to provide an atmosphere in which students learn and feel confident speaking up about what and how they’ve learned.
Vanderbilt is a highly selective research university, and he admits that some of his mathematically gifted students don’t need his flipped class to succeed. Some even resent it at first. “Quite frankly,” he says, “some of these students are very good at ‘doing school.’ For some of my students, the math comes really quickly. I could teach directly to those students, but I find that 90 percent of students do not know what they are doing. So, I figure, why not teach so everyone learns more, the ones who are good at math and the ones who struggled with math in high school and think they got into Vanderbilt as some fluke.”
As he tells it, some students “just want to come hear me lecture so they can take notes, pass the class, and never think about linear algebra again. College is traditionally supposed to be about teaching—it’s all about the prof, the prof’s status, the prof’s ideas—not about what you, as a student, are learning.” Learning requires more engagement from the student; it also requires more imagination from the faculty than simply standing at the front of a lecture hall for a few hours and reciting what you already know. “There’s what some educators call a ‘mutual nonaggression pact’ between faculty and students: you don’t ask too much of me, I won’t ask too much of you. But even the students who begin resenting having to work in class, to talk, change about halfway through. I can feel they are eager to be there—they realize they are actually learning, they can see how mathematics is useful, they’re finding examples in everyday life.”
Ample research supports his pedagogical method. In a 2014 meta-analysis of 225 studies of STEM teaching and learning comparing the efficacy of lectures (“continuous exposition by the teacher”) to active learning (“the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert”), active learning won hands down, yielding higher success rates, completion rates, and exam grades than traditional lecturing methods. It also took less time for students to master the material and methods.
Great professors who use technology intelligently in the classroom do so to spark new connections and knowledge. Middle school teachers might use a calculator as an active learning tool to allow their students to find sine, cosine, and tangent trigonometric functions. This becomes a starting point, from which they go on to learn more complex trigonometry and to apply it to areas they are passionate about: astronomy, programming, acoustics, optics, biology, chemistry, computer graphics, and other subjects far beyond the syllabus of eighth-grade math. In other words, technology is a tool—and so is trigonometry.
Derek Bruff smiles with pride. “About 99 percent of students leave the class convinced this is a better way to learn. The fifty minutes we’ve spent together every Monday, Wednesday, and Friday has been worth their time.”
AS IMPRESSIVE AS DEREK BRUFF’S PEDAGOGY IS, IT’S NOT LIKELY to convince more traditionally minded teachers that new technologies can contribute to learning. Technophobia is not only a fear of what technologies might do to our brains, or our social lives, or our ability to learn. It’s also a fear of losing a useful skill. This is a perfectly rational fear, because that is indeed often a consequence when we embrace new technologies.
As a case in point, in 1837 when a geometry professor at Yale introduced the radical new technology of the blackboard, insisting his students use it to draw conical sections and write out equations, his students felt this demeaned their vaunted abilities at rote memorization and mental calculation. They rioted. Yale stuck by the prof. Thirty students were suspended and threatened with expulsion if they did not apologize. Contrite, they eventually returned, and were reinstated to good standing. The chalkboards stayed.
The Yale students weren’t entirely wrong in their anxiety about those chalkboards. What they portended was a new relationship between intellect and industrialization that was about to turn higher education on its head. Memorization and declamation could carry one a long way in a ministerial world where the elite was small and homogeneous, but it certainly could not do so in a world increasingly shaped by steam engines. For that was what was implicit in the chalk marks. Not only was science becoming too complex for simple memory but also the value of memorization and oration was declining as books, newspapers, journals, and magazines packed with information became ubiquitous. The Yale students feared a loss of status. Recitation, rote memorization, and Latin grammar were no longer sources of power in a world where the wit and wisdom of the popular author or the professional journalist held greater sway than the minister at the pulpit.
What features of the twenty-first-century world outstrip our pre-Internet capacities? Professionalization and specialization were higher education’s responses to industrialization in Eliot’s day. For our era, the ability to search and research—sorting, evaluating, verifying, analyzing, and synthesizing abundant information—is an incredibly valuable skill. With the advent of Twitter and fake news, as well as the digitization of vast archives made accessible for the first time, these active learning skills should have a far larger role in higher education today.
What is often seen as a pernicious development—our dependence on our devices—has an upside. Research confirms that we are, in fact, tied to our devices now, but it also reveals that our devices serve as “transactive” memory aids, enabling us to encode, store, search for, and retrieve information in a way that helps us find answers more efficiently than we could without them. We now rely on our devices for information we used to commit to memory, but we also, because of our devices, have incomparably more information at our fingertips than ever before. Some traditionalists insist, as did those Yale students in 1837, that memorization should still be important. Yet memorization is clearly no longer as significant to our daily lives as it once was, and that is a good thing: we are plainly no longer memorizing all the facts that, a few decades ago, it was thought necessary to commit to memory.
Research also reveals that memorizing less does not damage our cognitive abilities, as we once thought it did. Needless to say, myriad things still need to be memorized and become habitual to be effective in one’s life (language learning, basic math principles, how to drive a car, dance steps, game rules, etc.). However, many things we once memorized we now just Google—and we’re not necessarily the worse off for doing so. We are now in the second decade of testing for effects of the Internet on cognition, and the results are looking much more positive than they did in the earlier, more transitional stage. To summarize dozens of recent studies: the Internet does a poor job helping us do the things we did before the Internet existed in the same way that we did them before the Internet existed. Which is almost a tautology. The converse is also true: the Internet does an excellent job helping us do the things we do with the Internet and other forms of new, interactive technology.
As with just about anything else—learning to walk or play tennis, write C++ computer code, or perform brain surgery—practice matters. Not practicing tennis means your tennis game declines. Practicing tennis, though, doesn’t improve your ability to write C++ computer code. Ken’s eighth-grade teacher was perfectly correct that calculators did not help him improve his ability to use his slide rule. However—and here is what educators so often miss—Ken and his pals would have been able to apply trigonometry in so many more exciting ways had they been able to rely on their calculators for certain functions that they could then build upon. In metaphoric terms, the calculators would have freed up their brain space to conceive of more challenging, complex trigonometric applications based on their calculations.
In short, we should allow devices in classrooms more frequently than we now do because sustained, careful, critical practice with devices helps us use them better—and that’s a good thing for us and for society. Finding the most creative, engaged ways to employ technology in the classroom helps bring Charles Eliot’s university into the twenty-first century—but only if the technology introduced is part of a classroom and pedagogical redesign that takes maximum advantage of the transactive capacity suddenly at the students’ disposal. The professor can’t just opt out by introducing technology but must think deeply about what the technology can do, what the students can learn with it and about it, and how devices can help students think together, remix one another’s ideas, iterate, respond, and contribute to an evolving whole.
In my classes, I allow my students not only to take notes on their laptops but also to take collaborative notes on a web tool such as Google Docs or the open-source platform Hypothes.is, where they all contribute notes, annotate them, drop in links for further reading, and basically set up a back channel for questions they want answers to during class. Whether I’m teaching Survey of American Literature, This Is Your Brain on the Internet, or The History and Future of Higher Education to beginning students or advanced doctoral students, everyone learns more when everyone is learning and contributing together. One cannot overstate the social component of learning. I find my students learn more and dig deeper when they all contribute. As in Professor Bruff’s classes, they also learn better methods for finding and mastering content, including web literacies that will serve them in their future.
To the horror of technophobes, I like to say that if we profs can be replaced by a computer screen, we should be. I mean that as a challenge, not as an admission of impending defeat. Every class should be an opportunity to do that which no screen can do—including offering students the opportunity to understand, create with, and also critique all that is going on in the background of our screens.
You teach students to be literate in a digital age by doing, by interacting, by evaluating technology—not by banning it in formal education, the one place where they should be learning the wisest ways to use it.
STANFORD UNIVERSITY PROFESSOR EMERITA ANDREA LUNSFORD is everyone’s ideal English professor. Graceful, with snowy upswept hair and reading glasses perched on her nose, she has a face both reserved and kind, with the wise half-smile one associates with Victorian portraiture. Don’t let her look fool you. She is one of the most innovative researchers in the country. She studies the impact of the Internet on student literacy and all the ways this generation’s particular transactive and digital literacy skills can be maximized to help them lead more successful lives, in school and beyond.
Lunsford has studied student literacy all across the nation, leading numerous research projects with different methodologies. The most famous and extensive is the five-year Stanford Study of Writing. It was designed to address this question, often raised by worried alumni and parents: Are digital devices destroying the literacy of students at Stanford? If the digital age is harming students at one of the most selective universities in America, the rest of us are sunk.
The Stanford Study of Writing, begun in September 2001, is one of the most extensive studies of this generation’s actual knowledge and how it influences what they think and write. Two hundred forty-three first-year students were invited to participate by submitting their writing, whether expository classroom writing or personal writing. One hundred eighty-nine students committed to the project. They also agreed to participate in an annual survey, and a fifth of the pool was interviewed in depth annually. In 2006, once all of the data were collected digitally, assessment of the massive archive of over fifteen thousand pieces of writing plus the interview and survey data began.
That students were willing to grant access to such a quantity of academic and personal writing is extraordinary. They submitted their research papers as well as emails, blogs, social media writing, journals, creative writing, and even scripts for videos. Lunsford was curious about not just how well these high-performing students wrote in their classes but also how they felt about assigned writing. She also wanted to investigate how often, and in what styles, students wrote outside of class.
Lunsford found that, contrary to all the hand-wringing, students today, including those who reported spending significant amounts of time online, are good writers. However, she found their writing had a distinguishing feature that makes it different from the writing of previous generations she has studied, a difference that she and her colleagues attribute to their time online: this generation is unusually adept at kairos, a rhetorical term meaning the ability to assess your audience and shape your style, tone, language, and technique for that audience.
“On social media, audiences are everywhere,” Professor Lunsford wrote in a paper analyzing the data. “Online, it is hard to tell who is the writer and who is the audience because response is instantaneous. This generation of students understands that well, and we saw it in the way they adjusted the tone, vocabulary, forms of address, even the humor depending on to whom they were writing. That’s an exceptionally sophisticated kind of literacy, to be able to control for effective communication to a specific audience.”
Interestingly, Lunsford found few traces of the traits that other pundits assume characterize student writing today—shallowness, stupidity, distraction, or loneliness. She and her team simply did not find, in any of the qualitative or quantitative methods they used, any evidence for an erosion of literacy due to the time students were spending online. The only clear negative she came across is that students enter Stanford with a higher level of confidence in their writing ability than they possess when they graduate, possibly, Lunsford conjectures, a consequence of their professors reminding them repeatedly of how social media use is destroying their intellect. She surmises it may also be a consequence of all the individual feedback their writing receives. At Stanford, students are given an enormous amount of individual attention. Some 35 percent of classes have fewer than ten students. The students write a lot and receive expert, individual, and often critical feedback. They become stronger writers in the process. They hear and absorb the critique but don’t seem to believe that their writing ability has improved because of it.
Another of the study’s findings is that students do not do particularly well in writing papers just for the sake of writing papers. Rather, students value writing that “makes something happen in the world.” From a traditional writing teacher’s perspective, this is a serious criticism of this generation. In the typical five-paragraph essay, for example, the writer employs a prescribed method, almost a formula, to shape each section of the essay, and does not deviate from that structure even if the audience changes. Nor is there any need to, because in the traditional five-paragraph essay the audience is unchanging: it’s the professor. Students learn to write essays that only they and their professor will be reading, in a form and format that are rarely used beyond the classroom.
For anyone dedicated to higher education designed for what comes next, students’ desire to write to make something happen in the world is a cause for excitement. That’s how Professor Lunsford views this finding. She encourages her students to discover the best ways to use their writing for a purpose. Perhaps, instead of a conventional term paper, they might conduct a research study about colon cancer and turn it into a white paper and a poster for a community public health event. Or maybe they write an op-ed for the local newspaper on how Stanford might change its cafeteria purchasing options to increase relationships with neighboring farming communities. She seeks to bolster her students’ already strong sense of kairos by encouraging them to write on important topics and then define and address an audience beyond her, beyond her classroom.
Virtually every serious study of Millennial reading and writing habits confirms what Professor Lunsford has learned in her quarter century of studying digital age student literacy. It turns out, for example, that Millennials read more than any generation since World War II. But, again, you would miss the depths of their literacy if you used traditional assessment methods. What they read most voluminously during their teen years is young adult literature, a publishing category that barely existed before the Internet. Many of those youngsters waiting at midnight at the local bookstore for the latest Harry Potter volume had video games in their hands and didn’t see that as a contradiction. On the other hand, those avid young digital readers have also grown up to read distinguished, contemporary prize-winning books, including graphic novels, works by young immigrant and minority writers, books by writers from other parts of the world, and even poetry, especially by new authors; their annual book reading, as adolescents and as adults, outpaces that of preceding generations. Like that first generation of novel readers in the information age, which began around the time Thomas Jefferson was president, young people clamor to read literature relevant to their lives, reflective of those lives, and relative to the social and technological challenges they face.
What Andrea Lunsford sees in her classes is that students’ constant interactive lives online, rather than making them dumber or lonely, make them more connected and engaged with one another, with the larger culture, and even with printed, old-fashioned books. In her field of rhetoric, this manifests in a kind of literary urgency. This generation wants to know what is important and wants to communicate what they know. Writing well is important to them, and they judge themselves more and more strictly the longer they are in college. Writing a term paper for the sake of writing a term paper, one that will be read only by the instructor, seems pointless, even ludicrous. Why write just for the sake of a grade on how you write? That is antithetical to the purpose of writing: communicating, connecting, persuading, interacting. Kairos.
Working from similar assumptions about the writing and reading commitments of students today, Professor Juana Maria Rodriguez has found ways to restructure her classes so that her students engage in original research and writing with an active social purpose. In the Department of Women’s and Gender Studies at the University of California at Berkeley, Professor Rodriguez teaches LGBT 146, “Cultural Representations of Sexuality.” In this course, students not only read a number of complex, dense theoretical texts but also conduct original archival and ethnographic research. However, where formerly they wrote up their findings in research papers, now they use results to identify and correct weak, erroneous, or missing Wikipedia articles on major lesbian, gay, bisexual, and transgender (LGBT) scholars, theorists, and performance artists.
As recently as 2007, some departments, colleges, and universities were still “banning” the use of Wikipedia (not that the ban was enforceable). Technophobes assumed Wikipedia was bad, just as they were suspicious of clickers: it had to be bad, they reasoned, because it was produced in a way antithetical to top-down, scholarly, expert-driven, peer-reviewed scholarship. Wikipedia is crowdsourced, and you do not have to be a certified expert with a PhD to contribute to it. Many, therefore, opposed its use.
Professor Rodriguez reasoned otherwise. If it is bad but open to editing, why not have students use their research to contribute to the world’s most widely used encyclopedia? She challenged her students to make improvements in entries. She enlisted the help of the editors at Wiki Education Foundation, a nonprofit that bridges Wikipedia and academia and that is dedicated to helping professors craft classroom research and writing assignments to improve Wikipedia’s scholarly accuracy and coverage. WikiEdu also works with professional librarians to expand the public’s access to library resources and with academic professional associations to find ways to ensure that fields are being covered as thoroughly as possible.
In LGBT 146, Professor Rodriguez teaches students professional research skills, including how to authenticate accurate and reliable information sources, how to address controversial subject matter with professionalism and accuracy, and how to cite sources properly, skills our society as a whole needs desperately. She is often surprised by the results even though she has high expectations of her students. For instance, her students discovered that the influential cultural critic José Esteban Muñoz, whose tragic death at the age of forty-six rocked the LGBT academic community in 2013, had been addressed by only two scant paragraphs on Wikipedia. “José Muñoz finally got a page that might begin to approach his significance,” Rodriguez says. “My students also created pages for Essex Hemphill, Justin Chin, Martin Wong, Gil Cuadros, and some local Bay Area queer luminaries: Adela Cuba, Chili Felix, Cecilia Chung, and a beautiful page for tatiana de la tierra. They also added to a host of other pages. They are a start towards making Wikipedia a more queer, colored, inclusive, and accurate space.”
Most people, looking from outside, might think that a gender studies course, one on queer performance art, would have no relevance to the students’ future jobs. Yet, Professor Rodriguez has taken a subject area about which her students are passionate, with a commitment that grows out of a deep sense of personal or political identity, and has designed a course that, like Bruff’s, is based on mastery learning that will influence how they learn everywhere—in other classes and in their future work lives. The research, reading, writing, and online-editing skills they gain give them an advantage in competing for entry-level positions. Because Wikipedia provides analytics that track every change to every entry, her students explore the methods and power of data analysis. In the course of creating their entries, they compile annotated bibliographies of all of the work they do, and they use an open-source tool called Zotero to collect their citations and references and to make them available online to fellow researchers beyond their classroom. Many entry-level positions now require web and digital skills that many students at elite institutions simply do not possess. Rodriguez’s students aren’t just mastering a subject matter about which they care deeply but are also defining and managing a complex project and working through, individually and collectively, the best way to realize it, from inception to completion. They are learning how to write for a real-world audience and how to contribute meaningfully within a set of institutional rules and practices.
Uniting the projects of Derek Bruff, Andrea Lunsford, and Juana Maria Rodriguez is a deep understanding of how the traditional classroom can be redesigned to help prepare students for the challenges their generation faces. What also unites them is a conviction that a professor should use the technology best suited to the needs of students. It is the ideal way to empower the next generation to use the avalanche of information at their fingertips in a purposive, responsive way to make possible their own future success and, ideally, their contribution to a better society.
EVAN MISSHULA IS A PROFESSIONAL PROGRAMMER WHO IS ALSO earning his doctorate in computer science. An open source and open access advocate, Misshula is passionate about coding, and especially about the need for more women and minorities to enter this field that is changing all of our lives. His goal is to train the next generation of programmers and technology designers to care not just about innovation but also about equity. He has finished his coursework at The Graduate Center (CUNY) and, while he writes his dissertation, is teaching two courses. One, “Databases and Datamining,” is offered at the John Jay College of Criminal Justice (CUNY).
John Jay is 40 percent Latino and 21 percent black. Nearly half of the students are first-generation college students. Misshula uses open-source content in his classes and to develop new software, but he hadn’t applied the Internet’s open-source, agile methods to his teaching before. He decided to give it a try. Instead of a traditional final exam, he challenged his undergraduates at John Jay to come up with an app that made a “public contribution.” He left the nature and scope of that app up to them.
Two of his students, Nyvia DeJesus and Marta Orlowska, asked if they could develop an app for recently released prisoners. They developed Jailbreak-my-life, a mobile resource guide that provides interactive information to help former prisoners schedule their lives. After years of confinement, where everything is scheduled and choices are made for them, released prisoners can find the challenges of managing time and scheduling meetings formidable. Some wind up back in prison for missing meetings with parole officers or other required appointments. DeJesus and Orlowska decided to incorporate a Google Maps application programming interface so users could find the nearest essential resources, such as food, jobs, free tutoring, counseling, and health care. They built the app on a modern web stack: JavaScript running on Node.js, HTML5, and the React framework.
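For readers curious what the “find the nearest resources” feature involves under the hood, here is a minimal, hypothetical sketch. In the students’ actual app, the Google Maps API handles geocoding and mapping; this sketch (with made-up function names and sample data, not code from Jailbreak-my-life) approximates the core idea with a plain haversine distance so it runs on its own:

```javascript
// Hypothetical sketch of nearest-resource lookup, the kind of logic an app
// like Jailbreak-my-life builds on. Distances here use the haversine formula
// rather than a mapping API, so the example needs no API key.

// Distance in kilometers between two (latitude, longitude) points.
function haversineKm(lat1, lon1, lat2, lon2) {
  const toRad = (deg) => (deg * Math.PI) / 180;
  const R = 6371; // mean Earth radius in km
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Return resources of the requested category, sorted nearest-first.
function nearestResources(user, resources, category) {
  return resources
    .filter((r) => r.category === category)
    .map((r) => ({ ...r, km: haversineKm(user.lat, user.lon, r.lat, r.lon) }))
    .sort((a, b) => a.km - b.km);
}

// Illustrative (invented) data: a user in lower Manhattan and a few services.
const user = { lat: 40.7128, lon: -74.006 };
const resources = [
  { name: "Food pantry A", category: "food", lat: 40.73, lon: -73.99 },
  { name: "Job center B", category: "jobs", lat: 40.75, lon: -73.98 },
  { name: "Food pantry C", category: "food", lat: 40.71, lon: -74.0 },
];

const nearestFood = nearestResources(user, resources, "food");
console.log(nearestFood[0].name); // nearest food resource comes first
```

In a production app, the hard-coded list would be replaced by live queries to a mapping service, but the filter-compute-sort pattern is the same.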
Taking the kairos principle to the next level, Misshula registered his students for the Women in Technology Hackathon for Good. A hackathon is a largely unscheduled free-for-all. Hackathons are invention marathons where participants, gathering in spontaneous teams, sometimes work feverishly all night, everyone learning from everyone else, as they proceed together toward a goal. Some people come to pitch projects. Others walk around, find something they are interested in, and sign on to work on that project with others for the day. There aren’t directors or referees of any kind. It’s all a bit haphazard and works more because of the goodwill and energy of the participants than from any careful planning by the organizers. That fluidity is what you sign up for. There’s usually a purse and a prize for the winning teams, but the process, the new skills learned on the fly, and the networking are the real reasons so many programmers flock to onsite and online hackathons.
Thousands or even tens of thousands of hackathons happen around the world every year in education, in the nonprofit world, and in industry. New York City even maintains a website devoted to all of the hackathons going on at any given moment—civic hackathons, environmental hackathons, technology hackathons, music hackathons, educational hackathons, community activism hackathons, diversity hackathons—all in one weekend in one city.
When you go to a hackathon with a project, you typically set a goal for the concrete product you want to “ship” at the end of the day; on the hackathon website you post specs about your project, your team, and your ideal collaborators. People drift by and tune in to your conversations and decide on the spot whether they want to work with you. You post the open-source code as you are iterating throughout the day and, after the meet-up, a virtual hackathon continues online, with others pitching in to improve and finish a project you all care about. If you are used to conventional business models and project development, this process sounds problematic, to say the least. Amazingly, it works.
The open-source hackathon reverse-engineers just about everything in Charles Eliot’s university. There’s no hierarchy; no credentials are required. Hackathon participants don’t ask and don’t care whether you have a PhD or work for Microsoft. They want to see whether you are good at working with others, soliciting contributions and participation from everyone, to arrive at a goal. At a hackathon, the goal is to do as much as you can in a limited amount of time. Because the project is also open source and online, you can return to it later, after the hackathon is over, to improve it through iterating, modding, remixing, and morphing until it reaches the right stage of excellence.
The purse for the Women in Technology Hackathon is small, but finding potential partners, employees, and other talent for future collaborative projects is the main reason so many people show up. Professor Misshula’s students, DeJesus and Orlowska, were seeking partners with whom they could work to finish their Jailbreak-my-life app. They ended up pairing with two professional developers, Sara Morsi and Igor Politov, one of whom was a recent alumnus of another CUNY school.
And they won. They won the best overall award for a female-led team and the prize for the best use of the Harmon API (application programming interface).
Winning wasn’t the end of the matter. After the hackathon, all have continued to work together at the intersection of computer science and law enforcement. The project has turned out to have an indelible impact on all of their lives. The undergraduates reached out to Jeffrey Coots, director of the From Punishment to Public Health initiative at John Jay College of Criminal Justice, to try to raise money to produce the app and make it free for recently released prisoners. DeJesus, who moved to Texas with her family a year after the hackathon, determined she would continue her career as a programmer working with the criminal justice system. Orlowska accepted an internship at the Defense Intelligence Agency and was offered a permanent position in DC. Evan Misshula now teaches computer science to undergraduates at John Jay and has also begun teaching the computer programming language Python to detainees at Rikers Island, as well as data management and descriptive statistics to students in the New York Police Department’s Executive Master of Public Administration program. It is hard to imagine a more edifying conclusion to a trip to a hackathon by a professor and two of his star students.
The steam engine and other industrial age inventions harnessed the strength, speed, and power of the machine to do jobs that no one human could perform alone. The Internet and all of the computational technologies developed since its invention harness the interactivity, connection, participation, and access of massive numbers of humans and all the data that we produce, to accomplish work at a scale and speed almost unimaginable. It is hard to think of any aspect of our lives that has not been changed by this technology.
The new education must not only recognize this reality but also reimagine a higher education that takes advantage of the digital skills our students bring to college while also training them to be full, critical, creative, and even skeptical participants in this technology-driven age. Technophobia is no longer acceptable—in the classroom, in the structures of higher education, in the curriculum, in the pedagogy. Technophobia hamstrings our youth instead of preparing them. It limits them instead of arming them to deal with the complexities of a world, a workplace, and a future that most of us cannot begin to grasp or predict.