4 AGAINST TECHNOPHILIA

IT IS JANUARY 2013. THE NEW YORK TIMES HAS PROCLAIMED 2012 to be “The Year of the MOOC”—the massive open online course. Best-selling author and Times columnist Thomas L. Friedman hails “a budding revolution in global online higher education.” He insists MOOCs will transform all of higher education and slash the skyrocketing cost of tuition. Why stop there? “Nothing has more potential to lift people out of poverty,” Friedman declares, than the MOOC.

Three elite universities—Stanford, MIT, and Harvard—are leading the MOOC charge, spinning out for-profit (Coursera and Udacity) and nonprofit (edX) companies that will host and stream video lectures delivered by eminent professors from a consortium of a dozen or so top universities.

My university—Duke—is one. We’ve partnered with Coursera, a company started by two former Stanford computer science professors, Daphne Koller and Andrew Ng, with an undisclosed amount of venture capital. MOOCs deliver “courses” in the form of a sequence of digitized video lectures posted to an interactive learning management system hosted by Amazon Web Services. These courses are free to audit or, if you wish, you can pay a $100 fee for the verified “Signature Track,” which offers a certificate to those who watch all of the videos and pass the multiple-choice, machine-graded tests. The MOOCs are open admission, available to anyone over eighteen anywhere in the world, not just college students. I have been invited to teach a MOOC on the history and future of American higher education.

I feel a bit like Ken’s middle school trig teacher. I am skeptical of this newfangled technology whose advocates promise, in the business jargon of our day, to “disrupt” the status quo in higher education. The MOOC pedagogical model of experts pontificating in half-hour videos feels retrograde to me, a digitization of the tired, passive, broadcast model of education, the one-way transmission of information straight from the expert’s mouth to the student’s ear. Ample research shows that this model has very little impact beyond the test. I want much more for my students: I want them to ask hard questions instead of merely memorizing answers, and I want them to leave college with confidence in their ability to tackle the most complex problems they encounter. The multiple-choice tests and the idea of a certificate of completion are offensive. This is not my idea of college. I’m suspicious of the MOOC business model, too. Coursera pays me a modest fee to make the six half-hour videos, an add-on to my regular Duke job. I don’t see how this arrangement with a for-profit company could lower Duke’s operating expenses or reduce students’ tuition. Will students and faculty profit intellectually as the investors accrue monetary gains? And the idea that MOOCs will lift the masses out of poverty is ridiculous—technophilia at its worst.

If technophobia means a fear of slide rules, blackboards, calculators, clickers, or laptops in the college classroom, technophilia can lead to giddy and unrealistic notions about what technology can do, cure, and offer. Technophilia can make you lose your critical marbles, so to speak, and cede your rights, your data, your privacy, and just about everything else to Google, Apple, Microsoft, or any other company that powers your ebooks, your GPS, your children’s toys, your appliances, your transportation, even the pacemaker that keeps you alive. Ownership activists Aaron Perzanowski and Jason Schultz discovered just how smitten people are with technology in an experiment they concocted using a dummy website that promised free software downloads to customers who signed a terms of service agreement. Ninety-eight percent of users assented to terms whose first sentence ceded all rights to their “first-born child.”

Our complacency in the face of the technologies that run our lives should make us even more determined to ensure that our students have the computational literacies they need to prepare for their future. Technology—now more than ever—should be the starting point for deep, thorough, critical analysis. Videos of famous professors from elite universities packaged by other famous professors who formerly taught at elite universities will not “transform” higher education except perhaps in an unfortunate way, reinforcing and spreading the nineteenth-century passive, hierarchical models of teaching and learning.

Seymour Papert, a pioneer in the field of artificial intelligence, was an influential learning theorist and a passionate advocate of student-centered learning. He dubbed his particular computer-inspired variation on student-centered, active learning “constructionism.” He championed the idea that the best way to learn—especially in the post-Internet world of interactive communication—is literally by constructing something: making, doing, exploring, experimenting, trying, failing, analyzing the failures, and trying again. For K–12 learning, he wanted to put younger children and older children with a range of skills and backgrounds in the same classroom so that they could learn together. He wanted kids with physical or cognitive impairments to be part of the “constructing,” too; if you grow up with a disability, your whole life is a creative workaround. As cofounder of the MIT Media Lab in 1985, Papert mixed experts and nonexperts, specialists and novices, computer scientists and artists. The Media Lab was a place where students and faculty, together, could explore, reflect, build, design, and improvise new tools inspired by new ideas. He believed that a teacher’s role, especially in the computer age, was to provide “the conditions for invention.”

Watching video lectures is the opposite of constructionist learning. I dig in my heels with Coursera. I’ll make the course videos, I say, if they let me try an experiment. I want to see whether I can work with a group of actual, face-to-face onsite students to, together, turn the conventional MOOC form into something student-centered, not just for the onsite students but also for the thousands taking the MOOC online. Instead of conventional term papers, I will challenge the onsite students to design creative and challenging ways to wrangle the eighteen thousand online registrants of the MOOC into an interactive learning community. Papert warned that “nothing could be more absurd than an experiment in which computers are placed in a classroom where nothing else is changed.” If MOOCs simply repeat industrial age hierarchies, curriculums, and pedagogy, they will only replicate the current absurdity of those traditions on a massive scale.

To their credit, Daphne Koller and Andrew Ng do not oppose my ambition to include thousands of online students in an interactive seminar. As engineers, they are interested in what we might learn from the project. They put me in touch with others who are experimenting with MOOCs. At the University of Pennsylvania, Professor Al Filreis is running “ModPo,” a MOOC in modern poetry that doesn’t use videos. Filreis gathers a dozen students together each week. One class member “assigns” a poem of her choosing to all Coursera participants. Once a week, they videocast their live discussions on the Coursera platform and invite anyone in the world to call 215-746-POEM to contribute to the conversation. Difficult, arcane contemporary poets who previously had a small audience are suddenly selling chapbooks and gaining notice worldwide. International participants nominate poems by their favorite poets, too, contributing to a rich syllabus of world poetry.

That format won’t work for my history-based course, but Professor Filreis’s deviation from the MOOC model is a useful precedent. Before the onsite seminar students arrive for the semester, I assemble a small team to create the videos. Some graduate student assistants, a few technical staff members, a first-time producer, and I set out to make a MOOC. Instead of hiring professionals, Papert-like, we learn together, writing, filming, editing, recording, and uploading the six half-hour videos to the Coursera platform. I’m the one on camera for these DIY videos, so, at the end of each one, we have an “extras” feature where we turn the camera around to show viewers everyone who helped make the week’s lesson. We explain the various tools we used, from video cameras and editing software to cue cards and websites like Lynda.com, which offers instruction in all of these things. In addition, we invite viewers to construct their own videos in response to ours and upload the link to our course website.

Our first four videos provide some Anglo-European context for what is mostly a history of US higher education, with each half-hour video concentrating on a key moment in that history: the founding of the land-grant universities, the redesign of the Puritan college into the modern research university, the GI Bill and the Great Society’s expansion of higher education, and the defunding of higher education over the past forty years. The final two videos focus on the future, proposing new ideas for transforming higher education. They offer techniques for active, student-centered learning designed to prepare students for the tumultuous, interactive time in which we all now live, and they invite members of our worldwide MOOC community to contribute their own alternative pedagogies.

In addition to the proprietary Coursera platform for interaction, we build an open-source online web space on which the onsite seminar students will host their forums, blogs, wikis, and whatever other kinds of online interaction they come up with. We make that content available publicly and for free to anyone who wishes to access or contribute to it. I hope that both the content of the videos and our student-centered methods will transform the static MOOC format and influence the methods used in the lecture halls and seminar classrooms where many of our eighteen thousand MOOC students themselves teach, all over the world. We’re evangelists for this new education. We decide to give our course a jocular name to signal our grand ambitions beyond the standard MOOC: “The History and Future of (Mostly) Higher Education: Or, How We Can Unlearn Our Old Patterns and Relearn for a More Successful, Fruitful, Satisfying, Productive, Humane, Happy, Beautiful, and Socially Engaged Future.” We give it a Twitter hashtag: #FutureEd.

After the MOOC videos are finished, I turn my attention to the actual face-to-face seminar and the students who are charged with leading this grand experiment. Like Papert, I believe the nineteenth-century model of “age-graded” and discipline-restricted education limits and homogenizes the learning experience, so I open my seminar class to undergraduate, graduate, and professional school students in any discipline and extend invitations to students at all the universities in the area. My course description is intentionally vague and provocative. I am looking for twelve to fifteen truly daring and technologically adept students who will take this brand-new experimental MOOC form “to the next level.” I warn that this will be the most demanding, creative, and time-consuming course they’ve ever been part of.

Students sign up from Duke, the University of North Carolina, and North Carolina State University. From the registration permission list, I select fourteen gifted and vastly different students for this challenge. Among them are a nineteen-year-old psychology major, a twenty-year-old aspiring educational reformer, a thirty-something Iraq War vet working toward a master’s degree while rehabbing combat injuries, doctoral students in fields ranging from Caribbean history to computer science, a former philosophy professor returning to earn a degree at the Sanford School of Public Policy, and two students (a photographer and a graphic artist) working toward master of fine arts (MFA) degrees.

When the Chronicle of Higher Education finds out what we’re up to with #FutureEd, the editors invite me to write a column documenting each week’s activities, both in the onsite seminar and in the eighteen-thousand-person MOOC. I suggest instead that, because we’re promoting student-centered learning, the seminar students should write the columns, not me. Although this is unprecedented, the Chronicle editors agree to experiment, too. Contrary to Daphne Koller’s frequent pronouncement that higher education hasn’t changed in two thousand years, everywhere we go, we are finding people who want to be “disrupted,” including in ways more conceptually and pedagogically innovative than with MOOCs.

“Let the revolution begin!” Thomas Friedman intoned. We’re ready.

PROUD AND A LITTLE NERVOUS, FOURTEEN STUDENTS ENTER OUR classroom for the first session of #FutureEd. They don’t know much about what the semester holds except that they have been selected to lead a revolution in the university as we know it.

I don’t say a word as they take their seats; they haven’t seen a syllabus. Yet they are primed, eager to disrupt the disrupters, laptops at the ready.

I hand out pencils. And index cards.

“Pop quiz,” I proclaim, tapping my pencil imperiously against a blank index card.

“Who invented the printing press?” I ask, in my most conventional, professorial lecture voice. This is the old one-way transmission model of education, where the professor poses the questions and the students supply the right answers.

I produce an analog timer from my computer bag. “Put your pencils down when you finish. You have ninety seconds,” I say in the same officious tone.

They exchange puzzled glances, but once the kitchen timer starts ticking loudly, they turn to their index cards and write. These would-be disrupters of higher education, these eager revolutionaries, snap into Good Student form.

This is how we are training youth today. It’s the pattern we most need to break. If the school bell was the symbol of nineteenth-century public, compulsory education—training all those farmers to be machine-like factory workers—the timer is the symbol of twentieth-century, high-stakes summative testing. With the advent of Taylorism, it was no longer enough to think. You had to be able to think on time, to pick the “right” answer among specified choices fast, to supply as many right answers as you could before the timer went off.

The timer epitomizes the outmoded production model of learning that confuses standardization with high standards and reduces success to the answering of questions simple enough to be graded by a Scantron system—when, in fact, our era demands dexterous, innovative, and interactive thinking. This is a ludicrous and irresponsible way to prepare students for a world in which online attackers can shut down a website and totally disrupt our lives by bringing down central servers. Before the October 21, 2016, cyberattack on the New Hampshire–based Internet performance management company Dyn, most of us weren’t even aware that the much-touted Internet of Things had transformed our toasters, other household goods, and even our kids’ toys into unwitting collaborators of malicious hackers. What else is our technology up to that we don’t know about? The complexity of the world we live in is far beyond the imaginings of most of us old enough to remember a time before the Internet, and our educational system is doing little to change that for the younger generation. We subject average American schoolchildren to approximately 110 timed, high-stakes, standardized tests on their way to high school graduation; most of these tests follow a format that has not changed much since 1914. The misalignment between how we are training our youth and the world they must contend with could not be more glaring.

All the students have put down their pencils. They didn’t need ninety seconds to write “Gutenberg.”

They have no idea where I’m going with this. They are more than a bit disappointed. None of this feels exciting, experimental, or revolutionary.

#FutureEd is #sad.

ONLY A VERY FEW YEARS LATER, WE’VE FORGOTTEN THE MOOC madness that took over higher education in 2012. Technophilia does that: you become so dazzled by all the hype about the future that you forget the past and lose your way in the present. Even when technology dashes your hopes, you are willing to buy the next shiny toy, and the next, and the next. You forget the consequences of the last failed promise. Surely this technology will solve all our problems. Isn’t the problem with higher education that professors are stodgy and resistant to change? Isn’t it better to hire CEOs and programmers and for-profit companies to disrupt higher education using the best new tools? Don’t we all really need to step aside and let Silicon Valley make a real difference in our colleges and universities?

We now know that MOOCs did not do what they promised—slash tuition costs, offer first-rate education to anyone in the world, end global poverty. In fact, they have in some cases had pernicious effects. At one major institution, the University of Virginia, MOOC mania so engulfed the Board of Visitors that it precipitated the unexpected resignation of a respected and effective new president, Teresa Sullivan, allegedly because she didn’t jump onto the MOOC bandwagon during the summer of 2012. It took Eliot and his colleagues sixty years to redesign the research university; the board seemed to reason that, in the Internet age, with MOOCs, surely UVA could do it in sixty days! Although faculty and student protests led to the reinstatement of President Sullivan, the notion that higher education was on the “brink of a transformation” because of MOOCs was an opinion held far beyond Virginia.

MOOCs aren’t the shiny new thing anymore, but large and powerful companies will dangle other technologies, promising innovations. Technophilia and technophobia do harm in both directions. Jumpy administrators and faculty can make hasty, poor decisions to adopt the latest technological gadget so as not to seem old-fashioned or passive. Or they might resist all new ideas—even good ones—because the previous technological miracle they invested in flopped.

That’s why I began my first meeting with the #FutureEd seminar with this archaic routine of index cards, pencils, and timer. It wasn’t sadism but a classic pedagogical technique called “defamiliarization,” a method that puts students in a situation where suddenly they question what they thought they knew and examine what they take for granted. Before they took on a leadership role with thousands of international MOOC registrants, I wanted them to think critically about several key characteristics of the higher education we’ve inherited: technology, the timed test, pat answers like “Gutenberg,” the teacher’s authority, the broadcast model of pedagogy—all components of higher education that the so-called revolutionary MOOC reproduces on a massive scale. Although these were curious, risk-taking students—they would not have enrolled in this seminar otherwise—they were also excellent students, accustomed to A and A+ grades. They were well trained in the assumptions and practices of formal education.

Sir Francis Bacon insisted that knowledge is power. The French social theorist Michel Foucault admonished us to think about who controls knowledge and who gets to make it, enforce it, and rebel against it. Following them and other thinkers, I wanted my #FutureEd students to be asking: Whose knowledge? Power for whom? What does it mean when venture capitalists want to transform higher education by reproducing at scale the one-way model of teaching? Though such a “solution” might have been appropriate in the age of the emerging broadcast technologies of the telegraph, radio, and film, a defining quality of the Internet is that your audience can speak back to you, regardless of your status and power. Why aren’t we educating for an era in which the skills of connecting and constructing are more important than ever? Is the MOOC really a revolution in learning or, like another era’s horseless carriage, a mechanized version of what already exists? Can canned lectures delivered by elite professors help students think and learn for themselves? Are we even wise to be teaching online, making content—our information, our secrets, our consumer habits, our bank accounts—available not only to Google and Coursera but also, possibly, to government surveillance and malicious hackers? Are we ceding to Coursera and other for-profit technology companies the educational equivalent of our “first-born child”?

“I have a proposition for you,” I tell the fourteen students in #FutureEd. “If you believe what you wrote on that index card, you can hand it in now and you have a guaranteed A for the course.”

“But what happens if our answer is wrong?” Claire asks.

“Well, you’ll flunk, of course,” I say.

There is some uncertain laughter.

“Is there an alternative?” Max asks.

“Sure! You can turn over the card, take another ninety seconds to answer the question using any devices you brought with you. Do some research. Go beyond Wikipedia. Use any method you want—just make sure to verify your findings. If you don’t believe in your current answer, see if you can come up with a better one! Who really invented the printing press? Ninety seconds.”

Before the timer starts ticking, the students are talking to one another, strategizing who will do what. I hear one student say she can create a Google doc where they can put any information that seems relevant. Another suggests that they use the open-source Zotero reference management tool to track their citations.

I’ve done a version of this student-centered research exercise probably twenty times now in different classes and workshops, and the same thing happens every time. If students know it is permissible, they turn to one another even before they turn on their laptops. Millennials have recently been dubbed the “We Generation” (a revision of the earlier “Me Generation”). I think the collective term is more accurate. In my long teaching experience, they are by far the most accomplished generation at productive collaboration. Whenever I hear commentators say that smartphones and laptops have made students isolated and narcissistic, I know they haven’t sat in on a class designed to encourage students to work together creatively.

“So, did Gutenberg invent the printing press?” I ask after the minute-and-a-half burst of energetic, engaged research and dialogue.

Everyone laughs. Over the next ten minutes, they fill me in on what they’ve learned. They found that Bi Sheng developed the basics of movable type around AD 1040, during the Song Dynasty, using porcelain tablets that could be rearranged to produce and then reproduce over a hundred thousand Chinese characters. This technology was exported to Korea, where metal typefaces were later invented and, in 1377, used to print the Jikji, the world’s oldest extant book printed with movable metal type. Chinese and Korean prints and printed books, along with several versions of the printing technology itself, traveled between East and West on the Silk Road with spices, fabrics, ideas, art, math, and science. Around 1450, Johannes Gutenberg created his printing press for the limited number of alphabetic characters of European languages. The Gutenberg Bible was published in 1455, and the rest is history—Western history.

“Did you know any of this before?” I ask.

No.

“Have you ever been challenged in a classroom to use all the information you have available to you on your phones and laptops to supplement or even question the authority of the knowledge you’re presented with?”

No, again.

I note that they would not even have had access to most of what they discovered in ninety seconds before Wikipedia offered us all a platform on which anyone in the world can contribute just about any kind of knowledge, as long as it can be verified by external sources. Even our most extensive research libraries are dominated by Western accounts of history that often (even in scholarly sources) fail to acknowledge non-Western innovation. Wikipedia and other open sources are vast, global stores of crowdsourced knowledge that provide alternate versions of the history we’ve come to accept.

Unfortunately, this hasn’t changed the syllabus of formal education very much. Technology is ideology. Technology is epistemology. How we know shapes what we know, but we have not yet fully grasped the different ways that our new access to global and local stores of knowledge should be revolutionizing higher education. We can be doing so much better.

“What if I were one of those professors who ban laptops and other devices—?” I ask.

“Gutenberg would have invented the printing press,” Claire quips, before I can even formulate the whole question.

“And if you are asked about the inventor of movable type on the Graduate Record Exam?”

“Gutenberg,” they say almost in unison. The tone is angry now. You don’t have to tell students today that a lot of what they learn in college, despite the high cost of tuition, is antiquated.

“If we do it right, everything in our MOOC on the history and future of higher education will address these concerns,” I insist.

“Do you know who invented the index card?” Barry, a computer science doctoral student, asks. It’s a question an open-source programmer would raise, because it’s about the tools we’re using. The index card, he informs us, has quite a lineage, and one that is relevant to our experimental class. It was invented around 1760 by none other than Carl Linnaeus, the father of modern taxonomy. The Swedish physician aspired to categorize and arrange in hierarchies all the world’s plants and animals—including humans—assigning everything a kingdom, a class, an order, a genus, and so forth for his Systema Naturae. He cut heavy paper into standard-sized cards and stored a discrete bit of information on each, which enabled him to reorder his data while keeping each datum distinct.

In Systema Naturae, Linnaeus also divided humanity into four races based on continent of origin and skin pigmentation. He specified “temperaments” for each race, and, because of his hierarchical assumptions, those who were not white northern European Christians were assigned demeaning and lesser characteristics. He called Asians “luridus” (yellow), intending all the etymological meanings of lurid: sallow, ghastly, wan, horrifying. He had a fifth category of humanity beyond these four racial categories: “monstrosus.” Into this category went people with all imaginable disabilities, as we would now call them, including genetic differences, cognitive differences, even cultural differences.

Barry tells us how Linnaeus’s index cards and his hierarchical classification system were taken up in the 1870s by Melvil Dewey, later the librarian of Columbia University and creator of the card catalog and the Dewey decimal system. In the 1890s, edge-notched cards were invented that could be sorted and hung from a long, needle-like rod. In the 1960s and 1970s, these notched index cards became the basis of early computational databases. Some of the culturally biased classification hierarchies, trailing all the way back to Linnaeus, persist deep in the binaries—1s and 0s—of modern computing. Technology can often seem objective, as if it operates without or beyond prejudice. That’s not true. If humans are biased, so are human-created algorithms. It takes systematic analysis to understand the biases technology retains, automates, and replicates.

“How many index cards did it take to make Wikipedia?” Barry asks provocatively.

After a silence indicating that the other students aren’t exactly sure, Barry reminds us that Wikipedia was created without a taxonomy. He’s been an editor on Wikipedia and knows that one of its distinguishing features is its lack of preestablished hierarchies and taxonomies. There are over five million separate entries in English Wikipedia alone. You can create an entry on the history of the VCR, Marvel comic characters, queer performance artists, or just about anything, and it will “stick” as long as you can cite an existing secondary source. There are plenty of value judgments and lots of contested spaces on Wikipedia, but there aren’t rules that make some entries count as “knowledge” and other entries “superstition.” That’s how an entry on movable type can include Bi Sheng’s contributions—it didn’t have to fight against a taxonomy that categorized Asians as “luridus” and therefore incapable of having contributed to Western “inventions.”

“Wikipedia is a free online encyclopedia that aims to allow anyone to edit and create articles” is the first line in the “Wikipedia” entry on Wikipedia. Any topic is eligible as long as it has had mainstream media or academic journal coverage.

Barry recites some basic (and rigorous) rules that Wikipedia enforces before it will allow information to be published as “verified.” Some criticize Wikipedia for having too many regulations for reliable content; as of 2014, the complicated governing policies ran to more than 150,000 words. Barry initiates a discussion of the differences between peer-reviewed and crowdsourced scholarship and between reliability and regulation.

Leslie, one of the two undergraduate students in the class, raises her hand. She’s been conducting research on her iPhone as we speak. “Only 8 percent of Wikipedia editors are women—over 80 percent of American librarians are women.”

“So ‘open’ doesn’t necessarily mean egalitarian,” I summarize, on the first day of our massive open online course. Epistemological lightbulbs are going off everywhere.

The index card gambit worked. I don’t need to lecture them. That’s the point of active, student-centered learning. You set the conditions for invention, pose challenges, and let students go from there. MOOCs structure knowledge in one way and broadcast from one source to many participants, reiterating a very old model of knowledge and power. It is our task to figure out how to set the conditions for invention for the thousands of registrants of #FutureEd.

When I challenged the onsite students to find the best ways to turn this MOOC into an active learning experience for the Coursera students, they decided to coordinate online office hours so that, day or night, a MOOC participant anywhere on the globe could have a real-time interaction with an actual onsite #FutureEd student. Sleep deprivation became a key issue with the course because just as a student’s three or four office hours were ending in the wee hours of our morning, some other country was waking up. New participants would see the dialogue that had unfolded in another time zone and would jump in. My students often came to our actual class meetings exhilarated and exhausted from a global twenty-four-hour workday.

Together, the onsite students devised many different ways to counter blind technophilia and one-way pedagogy, turning Coursera’s for-profit MOOC into a DIY peer-to-peer learning experience. They joked that we were leading a Meaningful, Ornery, and Outrageous Course. Using social media and an open educational blog, we put out a worldwide call for “co-teachers,” and seventy professors and academic administrators volunteered to host their own local seminars each week on some aspect of the MOOC’s subject matter. Each was like a fast-food franchise, except no uniformity was required; on the contrary, we urged these distributed, worldwide seminar leaders to augment the US history with their own supplementary or alternative national, regional, and local histories, using the Coursera platform, Facebook, Twitter, and the open-source blogging websites they developed to remix the content and supply a counterhistory.

In Dunedin, New Zealand, a group of educators watched the videos each week at a McDonald’s because that’s where the bandwidth was reliable and free. They ran Twitter chats after each video, adding content about the impact of New Zealand’s colonial history on its forms of higher education as well as information about Maori language inclusion and resistance. Other groups convened in Bangkok, Cairo, London, Rome, Oslo, Cape Town, and Lima. Often they posted their own forum topics in response to the Coursera videos, usually several a week, and sometimes dozens of MOOC participants would pile on, adding facts and opinions on subjects such as alternative grading systems, language reform movements, and the role of missionary schools in indigenous communities. Two different discussion groups began meeting independently in the online virtual world Second Life. In Ecuador, a participant named Vahid Masrour made beautiful mind maps each week that charted the video lesson and its global evolution.

In addition to the array of other social media they used, the #FutureEd students orchestrated two digital hackathons on an open website, each lasting approximately twenty-four hours. The first was an extension of an assignment I use to begin almost all of my classes: students use an online tool to collectively write a “class constitution,” essentially the terms of service for our semester together. I’ve done this with up to fifty students at a time, but inviting eighteen thousand to participate was something new. In the end, nearly four hundred people contributed ideas and language to the #FutureEd constitution, resulting in dozens of surprising “index card” moments.

One of the most provocative discussions came after an American participant wrote: “Knowledge is a public resource that should be open and accessible to all people, allowing for freedom of expression, dissent, and critique.” That sounds logical, but a student at the National University of Singapore (NUS) pointed out that “critique” is a Western style of rhetorical argument and isn’t valued in the same way in Asian cultures. She reported on a joint program between NUS and Yale University in which the Singapore students balked at the term “critical thinking.” The Asian students found what Americans thought of as critical thinking to be rude and naive. They believed it presumptuous for students in an introductory course to critique eminent philosophers or world historical figures. There ensued an eye-opening side forum on the relationship between higher education and the European Enlightenment, or Age of Reason. A Japanese participant added that his country had proudly rejected the idiosyncratic Western divisions of “mind,” “body,” “emotion,” and “intuition.” There is no Japanese equivalent of the word or concept of rational. In Japanese, you say rashonaru. One epistemological consequence is a body of Japanese knowledge that Westerners were unable to see as anything but “superstition” until relatively recently. Acupuncture and meditation are two of many examples. In the #FutureEd Constitution, “critical thinking” was amended with a communitarian alternative: “collective, evolutionary thinking.”

The second #FutureEd hackathon was a “World Wide Timeline of Educational Innovation.” After scouring the Library of Congress catalog, the onsite students determined that there was no such resource currently available. Even Wikipedia was sparse in this area. Using the collaborative online tool, one participant kicked off with: “Sometime between 3500 BC and 3000 BC, some unknown Sumerian geniuses invented a new system for storing and processing information outside the brain.… This data processing system that the Sumerians invented is of course writing.” Someone else added an entry for 2370 BCE: “The scholar Ptahhotep completes the writing of the ‘Instruction of Ptahhotep’ as a guide to living and reflections for the instruction of others.”

Like “Who invented the printing press?” the question “What was the first university?” turned out to have a history different from what we believed. We learned about universities throughout the ancient world, in places like Taxila, Jixia, Beirut, Odisha, Constantinople, Bihar, Fes, the Balkans, and Cairo, all of which had been going strong before 1088 and the establishment of the University of Bologna, typically called “the first university.” At one point, someone corrected an entry submitted to the time line to read: “1088, Foundation of the University of Bologna, the first European university.”

In the course of this second hackathon, one of the contributors, a student from Indira Gandhi National Open University, noted that there were over three million students at her university, the world’s largest. Having an opportunity to discuss education and pedagogy with only a half dozen students at one time—even if remotely—was priceless. Two other participants—one at Bangladesh University and another at Anadolu University in Turkey—noted that approximately two million students attended their universities. None of them had ever experienced such a degree of intellectual intimacy as they did during the online office hours or the hackathons.

I went into #FutureEd a MOOC skeptic, and I remain one today. Yet I also saw how an intensely interactive, student-centered online community could empower students, both those in the room and those joining us from thousands of miles away. What the #FutureEd class experienced was both similar to and different from study abroad. You can go to another country your junior year, live in a dorm with other Americans, have a great experience, but never really come to understand how people from another culture interpret the world. In #FutureEd, we never left North Carolina, but we traveled an enormous intellectual distance by engaging with others internationally in ways that were deep and challenging.

I cannot prove it, but I have a hunch that, because the interaction was virtual, it was easier for participants both to speak up and to disagree, and that the format perhaps even allowed more dialogue—even conflict—than if students had inhabited the same physical space. The real-time conversations were so intense that our syllabus kept changing, with different kinds of questions and ideas constantly intruding into the neat lesson plan presumed by the six initial videos. This dialogic way of knowing is foundational to the new education, a key to preparation for better lives and careers beyond school, and a way to make the university into something that is not, as the saying goes, “merely academic.”

In the series of articles the onsite students wrote for the Chronicle of Higher Education, they ended up striking a balance between technophobia and technophilia. They were more clear-eyed, analytical, and perceptive than many journalists or professors of the day who had either abetted or attacked the MOOC craze. They took seriously their responsibility as public intellectuals with firsthand experience of a commercial enterprise that had been developed outside academe and had shaken the foundations of higher education. They didn’t shy away from controversial topics. One article addressed the problems in Coursera’s changing and unclear terms of service agreements and the especially problematic conditions for use of student data. Another addressed the labor issue implicit in automating college teaching. Wouldn’t MOOCs make it even more likely that doctoral students pursuing a career teaching in higher education would be consigned to adjunct, part-time positions for their entire careers? If the point of MOOCs is to cut costs by cutting faculty positions, that doesn’t bode well for the future of higher education. Yet another article addressed past educational technologies that had not lived up to their hype and predicted that, once MOOCs lost their luster, Silicon Valley venture capitalists would invest in the next new technology touted as cheaper, faster, and better than actual professors.

The fourteen students had succeeded, in ways none of us expected, in turning an overhyped, costly, and poorly conceived idea—that watching videos was the same as learning—into what was, in fact, an active, student-centered experience that would serve them well the rest of their lives. All around the world, there were formal partings as the MOOC came to an end, sad farewells among the students who had joined us for six weeks from Australia to Alaska, Cape Town to Cardiff, Dunedin to Daejeon, and who had had intense interactions in the all-night office hours that my students had orchestrated. Against the odds, they had turned a MOOC into a community.

“Here’s what I see when I look outside my window,” one of the MOOC students in Thailand posted to our class blog one night. He sent a cell phone image of his computer with the #FutureEd website on the screen. In the photo, you see the fingers of his left hand on the keyboard and, beyond that, his room, and, beyond that, a window. The lights of Bangkok glitter in the darkness outside.

“Here’s what I see,” one of the students in our onsite seminar responded, posting a photograph too: screenshot, keyboard, fingers, room, window, the world of Durham, North Carolina, beyond. Dozens of similar photos began to appear, virtual postcards from elsewhere, insisting: “I was here.”

ONLINE LEARNING WILL NEVER FULLY REPLACE BRICK-AND-MORTAR institutions, but it is also certain to get better over time. Right now, over 95 percent of colleges and universities that enroll more than five thousand students also offer for-credit courses online. Unlike for-profit MOOCs, the vast majority of these courses are taught by the home institution and are fully accredited, fulfilling the same requirements as face-to-face courses. Some involve partnerships with corporations that have a beneficial effect. When Starbucks surveyed its workers to determine possible new benefits, over 80 percent listed help with finishing a college degree as their top choice, especially if it could be accomplished with the flexibility of online instruction. Nearly two thousand Starbucks employees registered for an online program designed by Arizona State University, and Starbucks committed to pay tuition for tens of thousands more by 2025.

Yet there is a significant downside even to online learning that originates within the institution: cost. In 2015, colleges and universities spent over $6.6 billion on technology, about 40 percent of it on institutional systems and 60 percent on systems with instructional and research capacity. More to the point, aggregated technology costs (from security measures to licensing agreements to hardware updates) increase every year, despite the technophiliac delusion that technology is cheap or even free. Demand and expectations have grown so great that funds tend to be assigned to technology needs before any other category, even before investment in faculty. The hiring of underpaid, part-time adjunct instructors instead of full-time professors is now a crisis of epic proportions in higher education. As funding sources continue to shrink, universities and colleges find they must keep up their technology infrastructure even while cutting faculty and course offerings. That’s a terrible trade-off.

And it’s a trade-off that is crucial to consider as we design the new education. Charles Eliot understood it wasn’t possible to redesign just one aspect of Harvard. Everything had to be remade at the same time. Now, we need to include technology in higher education because, simply, there is more and more technology in the world we send graduates into. Because of the importance of computational technologies, we also must design programs that train students to develop and use them wisely. We need to think critically and systematically about what works, both pedagogically and financially, and what does not.

MOOCs are an example not only of the false promises of technophilia but also of how initial hype can obscure both the promises and the limitations of technology. The MOOC hysteria ended in September 2013, when the bubble burst as quickly as it had inflated. California governor Jerry Brown had hired the for-profit company Udacity for an undisclosed fee to design three online courses that would “solve” the seemingly intractable problem of too many students failing introductory and remedial math, algebra, and statistics courses in the California State University system. Unfortunately, Sebastian Thrun, the computer science wizard who pioneered the self-driving car and who started the MOOC craze by opening his Stanford artificial intelligence course to 160,000 members of the public, did not do better than the professors at San Jose State who had been trying to raise the pass rate for years. In fact, the National Science Foundation assessed the three shiny new Udacity courses and found that the overall student pass rate was 33 percent—half the success rate for comparable courses taught in the conventional way by San Jose State University faculty.

The media reaction was swift and brutal. Professors all over academe were gleeful. The great, flashy, high-tech corporate MOOC miracle—the proclaimed disrupter of higher education as we know it—had fizzled. When you overpromise, you have to outperform. Thrun was notably candid in admitting defeat: “We have a lousy product,” he said in a postmortem with Fast Company. “These were students from difficult neighborhoods, without good access to computers, and with all kinds of challenges in their lives.” He allowed that, for this group, the online platform was “not a good fit.” Technology had not made learning easy and automatic; it had not revolutionized how we teach the students facing the greatest challenges. The San Jose State experiment exposed the fantasy that technology will “make” learning effortless. It won’t. That’s just not how learning works.

But the profs were wrong, too. MOOCs aren’t over. They have not gone away. And there is still a role and a place for good online learning. MOOCs are no longer “hot”—both Thrun and Daphne Koller of Coursera have moved on to other for-profit ventures outside of higher education. MOOCs rarely make the front (or even the middle) pages of the New York Times anymore. Yet, today, more students are enrolled in MOOCs than were taking them at the height of the hype. In January 2016, Coursera alone was offering more than fifteen hundred courses across twenty-eight countries to over fifteen million users, with twenty-five million students having at least started a Coursera course—that’s four million more students total than are enrolled in all colleges and universities in the United States today.

Completion rates remain dismal, however. On Coursera’s Signature Tracks, which offer certificates to paying students, less than 4 percent of those who watch the first video actually earn a certificate. However, 4 percent of millions is a lot of students, and, of those who complete certificates, a substantial number report that they did, indeed, realize some career or professional enhancement for their effort. MOOCs clearly offer benefits to those who have no other options and who have the tenacity to complete a full course on their own. As one commentator notes, “Online education isn’t succeeding because it’s better than Oberlin, it’s succeeding because it’s better than nothing, and nothing is what’s currently on offer for millions of people.”

MOOCs have also turned out to offer tangible benefits for brick-and-mortar education institutions. Recently, admissions recruiters have been looking at the students who succeed at MOOCs, especially those from low-income backgrounds. If you have the persistence to complete a self-paced MOOC, you are likely to do well in a traditional college or university, no matter how impoverished your background. Another benefit has been to smaller liberal arts colleges and universities that simply do not have the funds to staff advanced courses in all fields, such as the lesser-taught languages. MOOCs allow them to expand and extend their course offerings. Finally, millions of lifelong learners around the world have also been enriched by taking free courses online, not for college credit but for pleasure or self-improvement.

George Siemens, the Canadian online learning innovator who cocreated the first MOOC in 2008, well before Coursera and edX and Udacity got into the virtual ed tech business, sees the fortunes of the MOOC as an example of the natural process of higher education reform. With the hype past, the next stage is to find more creative ways to use online learning to offer revolutionary opportunities to more and more students. He argues that MOOCs “woke us up” and, even with their faults, made college professors see more clearly the limits of our inherited, traditional forms of one-way, transmission-style teaching. “Now that we have the technology to teach a hundred thousand students online,” he says, “the next challenge will be scaling creativity, and finding a way that even in a class of a hundred thousand, adaptive learning can give each student a personal experience.”

By building a student-centered experience together, all of us in #FutureEd learned more about the actual world we live in, a world of technological and human connection. We gained a better understanding and a greater sense of realism, if not actual control. That’s crucial to my students’ future, whatever careers they end up in. They mastered a complex set of insights and communication skills that empower them in the workplace and in life, as they raise their own children and contend with whichever technology comes next.

Who invented the printing press? I began the #FutureEd MOOC with this exercise so that my students would be cautious and thorough whenever they faced new technologies, whether encountered in the classroom or in their future careers. Skepticism was a first step. But we didn’t end there. I challenged them to take charge, to make change, to turn their ideas about the best forms of learning into actual practices online. This meant they had to understand the back end of the technology, including data analytics, enough to remix and mod it. It meant they had to take the lead in using the raw material of the MOOC form to design a new kind of learning that went beyond what they were given, to mold and shape something innovative, interactive, personal, and meaningful. They were not bedazzled by that which is new and shiny. Nor were they technophobic ostriches, heads in the sand. They engaged the technology, evaluated it, and found ways to use it creatively, to fulfill the objectives they designed for it, and not simply be awed by its hype. That’s not a bad start for the new education.

IN THE SUMMER OF 2016, I DROVE TO FREDERICKSBURG, VIRGINIA, to take part in a five-day workshop on critical technology. It was boot camp for academics interested in learning how to use the best educational technologies in the most responsible and exciting ways. More than a hundred faculty, administrators, IT professionals, librarians, technology designers, computer programmers, entrepreneurs, learning center directors, and graduate students (and one undergrad) gathered at the University of Mary Washington, a public liberal arts college in Virginia’s state system. We were there to learn from researchers, technology designers, activists, learning theorists, artists, computer scientists, and a few bona fide hackers. We filed into the Convergence Center, a stunning, state-of-the-art building with cherry-red chairs that spin like tops and an airy, light-filled breakfast room that somehow morphs, like a gigantic Transformer toy, into a high-tech multimedia auditorium with stadium seating and every imaginable tech gizmo.

The participants were ready for transformation, too. Because we were in the midst of the first presidential campaign waged on Twitter and Facebook, shaped by “fake news,” they believed their job, as educators, was to prepare students to be wiser than the often gullible traditional media. There were workshops on topics such as search engine optimization and how seemingly “objective” Google searches could be and had been rigged. At the time of the workshop, the first ten entries that came up when one searched for “Jew” were all produced by anti-Semitic and neo-Nazi sites pretending to be objective and informative. As the election unfolded, these educators had become increasingly alarmed that students, educators, the general public, and, shockingly, the traditional media didn’t know the basics of authenticating information. Almost daily, reputable newspapers and television news shows were repeating lies that had been deliberately and carefully planted by trolls. Many of the educators in Fredericksburg were in a state of distress, feeling as if they had failed in their obligation to educate the public at large as well as professional journalists, whose responsibility it is to inform and alert a gullible public. “I’m not sure the Internet was worth it,” one participant said sardonically in one of our small group meetings. This was the director of instructional technology at a major university. Someone else mentioned Orwell’s novel 1984. We were all shocked by the dystopian technology allegory unfolding in our democracy in real time.

The conferees were not technophobes. Nor were they prone to technophilia. Rather, they wanted to learn as much as possible about the technologies available, both the upside and the down, so that they could begin the necessary reimagining of higher education. Participants came from all over—Yale, Middlebury College, and Smith College, as well as the University of Southern California and Stanford. Major public and private universities were represented, as were regional state institutions, small liberal arts colleges, and community colleges. Change was in the air. The University of Colorado Denver sent six faculty and technology administrators as a cohort. They even had a dean along. They meant business, determined to make institutional change when they returned to Denver.

The keynote speaker was Dr. Tressie McMillan Cottom, a professor of sociology at Virginia Commonwealth University, the fastest-growing and most diverse university in the Virginia system. With a quick smile and mirth in her eyes, Dr. Cottom exudes an incisive wit and authority. She has thirty-five thousand Twitter followers and a new group of graduate students, and she was there to tell us how tweeting and teaching go together. She is one of the most persuasive activists on what is sometimes called #BlackTwitter, a loose network of public intellectuals who use Twitter to give issues important to African Americans more visibility and intellectual and political heft than they receive in traditional media. For example, when Georgetown University professor Marcia Chatelain was frustrated that the children of Ferguson, Missouri, were missing school because of the protests there in 2014, she used social media to develop “The Ferguson Syllabus,” a free, crowdsourced reading list on race, civil rights, policy, and African American history. #BlackTwitter took up the challenge, and people from all over began contributing, rapidly creating an extensive and diverse reading list, using provocative hashtags, including #IfTheyGunnedMeDown. The contributions came from every field. One chemist offered essays detailing the long-term effects of tear gas.

Known online as @tressiemcphd, Cottom is fearless in the face of white supremacist trolls and hackers who frequently assail her with racist and misogynistic insults. She maintains her professionalism and systematically critiques or exposes them, one by one. Few of us have such courage in such a vicious public forum. She sees her role on Twitter as educational and, conversely, believes education needs to be far more “woke” (in the parlance of #BlackTwitter) to the troubling political agendas of many on social media. It is for this reason that she keynoted the Digital Pedagogy workshop and why she cofounded the nation’s first program in Digital Sociology, an online master’s program that is part of Virginia Commonwealth University’s Department of Sociology. She and her colleagues are starting small, with just nine students in what is being described as a “rigorous degree program [that] will prepare graduates to shape emerging local, national and global conversations about big data, privacy, algorithms, inequality and social movements.” The program invites students not just to study the digital world but to help shape it.

Professor Cottom is as much a star in academe as she is on social media. She is an impressively prolific scholar. Like her Twitter commentary, her new book, Lower Ed: The Troubling Rise of For-Profit Colleges in the New Economy, pulls no punches. It is a devastating analysis of the ways for-profit universities exploit students, especially impoverished African American women who turn to them for one last chance to make a better life.

“There is much to critique,” Cottom tells me over coffee, “but I learned two things from for-profits that we have put into our Digital Sociology master’s degree. One, to keep choices limited so that students know where they are at all times and are going to a clearly delineated and attainable goal with a cohort of others on the same path at the same time. The other is to make sure students always understand why they are learning what they are learning and how it connects to a future career that they can attain.”

Cottom believes her pragmatic bent dates back to her undergraduate years at North Carolina Central University, a historically black university (HBCU). She calls her education there the “information Underground Railroad,” a world of knowledge that sometimes circulates out of sight of mainstream America, including on Twitter, on blogs by black activists, and on various websites. Understanding those different realms and how they interact (or fail to) is what she does now as a professional sociologist. “Inequality is what sociologists study,” she says. “Disparity is sociology’s bread and butter. What is unfair online mirrors what is unfair in the real world, and vice versa.”

Although she was invited to VCU to create a program in digital sociology, she is convinced that, in a few years, it won’t even be necessary to use that adjective. “What in the world doesn’t have some online component now? Globalization of resources, outsourcing jobs, algorithms to identify who is or isn’t a criminal. To sign up for the Affordable Care Act, you had to go online. Try to apply for a job at McDonald’s without filing an online application. I’ve started an online sociology program and all of the first students are full-time professionals—in journalism, in health care, in law, in education, in computer science. They all do something with ‘data’ in their jobs and they are seeing disparities, but they are more and more suspicious about what those inequalities mean. That’s sociology.”

Widely recruited even in today’s tough academic job market, she accepted an offer from Virginia Commonwealth University, which made clear that it wasn’t interested only in her teaching about the sociology of African Americans or women. The university wanted Cottom to create a transformative new program for the Department of Sociology and, in doing so, a new field. The American Sociological Association didn’t have a Digital Sociology section. Now, because of her program, it does.

Cottom’s research on for-profit universities concentrates on the shifting of tax dollars from public institutions to for-profits and on the fact that the highest burden of debt is carried by the nation’s poorest students at its worst institutions. “The for-profits are designed to feed the loan industry, not student advancement and learning.” She is interested in the kind of person who takes classes online, especially at for-profit institutions, as well as in those grasping at the so-called end-of-college credentialing offered by pay-for-play services such as Udacity, which promise jobs, but not necessarily jobs with benefits, full-time hours, or security. Apple funds Udacity to train programmers in Swift, its programming language. Students pay $299 a month to take the courses and are guaranteed jobs as Apple programmers. Is this a great system that helps lift people out of poverty, or is it yet another formerly middle-class occupation demoted to casual labor, a source of what Karl Marx called “surplus value,” in which an employer extracts more value from a worker’s labor than it pays for because the employer controls the market (and the training for that market)?

“These are key sociological questions, and it is astonishing that we haven’t had a single program that was designed to frame and ask and do research to answer those questions before,” Cottom says. “Don’t you think it’s overdue?” Her work is dedicated to understanding how the for-profit technology sector exploits ignorance about computational tools, the soaring cost of higher education, naiveté about privacy and personal data, and the perennial fear that higher education isn’t keeping up and that the next generation will be left behind. “If you believe technology is the answer to everything that plagues higher education, you probably don’t understand technology or higher education,” she insists.

Cottom understands both. Intimately. She—not some Silicon Valley ed tech booster—is the perfect person to be starting the nation’s first degree program in digital sociology. “We wouldn’t have made the program if we couldn’t be sure that our students would find jobs, good jobs, better ones than they came with,” she says. The students who come to the master’s program bring skills, experience, and training from fields that have been hurt or altered dramatically by technology, typically through outsourcing, offshoring, or automation: journalism, publishing, radio, law, and accounting. “We know that there is a greater need for people who understand technology, who understand data, who can use it and make it—and who know enough to be critical,” Cottom explains. The students learn to use theory and analysis as deeply and thoughtfully as they wield and interpret statistics, data analytics, and social network analysis. “The old qualitative-versus-quantitative binary used to rule my profession, with one kind of sociologist dismissing and demeaning the other,” she says. “None of that matters in the real world, and, in our program, we want students to be proficient in both.”

All of the students in the Digital Sociology master’s program will start together and take the same courses together, in a cohort model very similar to that of ASAP (Accelerated Study in Associate Programs), the highly successful community college program pioneered at the City University of New York. The VCU program’s creators want 100 percent of the students to complete the program, and that means using the sociology and sociality of cohort learning to motivate and inspire students to help one another stay in school. We know from multiple studies—whether of Harvard students or of teenagers from the most impoverished parts of inner-city Chicago—that learning with a cohort, in study groups with peers progressing along the same path at the same time, is one of the best ways to ensure completion of a degree.

As Cottom explains, “When they are reading the classical sociologists—Émile Durkheim, Max Weber, Georg Simmel, W. E. B. Du Bois—we want them to be talking to one another, excited, collaborating, even though they may never meet face to face. Everything is geared to be as constructivist as possible, where the sociological theory they read connects to their own life and work experiences and vice versa. We want these students to understand what it means to be a sociologist, and that requires a cohort. This is not about unmaking our field. It is about ensuring that what is most relevant and important about our field is used to understand one of the most important issues in modern life.”

Students will also read theories of race, gender, and intersectionality, including work from nontraditional sources such as black newspapers, magazines, and websites. Cottom is convinced these sources, often out of sight of mainstream media, offer practical insights into how members of marginalized communities gain access to digital resources. From HBCUs, #BlackTwitter, community colleges, tribal colleges for Native Americans, vocational schools, guilds, nonprofit educational organizations, and other institutions where inclusion is the mission, students will learn more deeply about access and the digital divide than they could if “inequality” were studied only theoretically. Research coming out of the best-funded universities often misses, for example, connections across and within institutions in black communities, both online and off.

“How do we create space for critical thinking and engagement in the modern university? How do we create space for critical engagement online?” Cottom asks, then answers: “Finding the best ways is my responsibility in my role as a conductor on the information Underground Railroad.”

Dr. Cottom is passionate about what her distance learning program will offer. She is adamant that this is not a MOOC, not a for-profit endeavor that will disappear once the profit does. It is an online program deeply embedded in a stable institution that has existed since 1838. VCU won’t be acquired tomorrow by some conglomerate whose mission isn’t education but minority recapitalization of profitable companies.

Cottom believes our students’ lives are too precious to be squandered on an institution that exists for someone else’s profit. For a for-profit technology purveyor, the bottom line is the bottom line. For higher education, the bottom line must be preparing students for the challenges they will encounter in the rest of their lives.

Tressie McMillan Cottom, @tressiemcphd, has a higher mission. She invokes a famous quotation from Frederick Douglass: “It is easier to build strong children than to repair broken men.” For Dr. Cottom, that goal could not be more urgent or more crucial for higher education. Being a principled, informed public intellectual on Twitter is one kind of teaching. She puts herself out there, knowing that attacks will come but also aware that what she is doing is necessary for a better future. Teaching wisely, well, and defensively—against misinformation and exploitation—is, for her, a moral imperative. “Not only can you create spaces for critical engagement in higher education,” she insists. “You are morally obligated to do so.”

The new education must include programs like Digital Sociology, not just for a few students and at one campus but everywhere, on every campus. Digital or web literacy is no longer something we can consign to computer scientists. The power and influence of online interaction are rapidly becoming the most important factors in every aspect of our political, personal, and economic lives. Professor Cottom is positive that the students in the new Digital Sociology program will find jobs. They will possess skills the world needs—including how to discern and combat the ever-proliferating abuses of technology that permeate contemporary life.

The workplace demands far more graduates from programs such as Digital Sociology. We need educators who can chart a course between the twin pitfalls of technophobia and technophilia, educators who can wisely navigate our technology-obsessed era—and prepare our students to do the same.

As it turns out, the jobs of the future require deep understanding of the technologies changing our workplace and our society. That understanding requires experimental and experiential training and the kind of grounded academic thinking that lets big ideas soar. The programs, the faculty, and the universities that understand that challenge will lead us all.