Conflicting Results in Educational Technology
India has pole-vaulted onto the global stage as an IT superpower, but only a thin stratum of the country’s educated elite is a part of that phenomenon. The rest – as many as 800 million people who live on less than two dollars a day – are lucky if they can work as servants to the rising middle class. There is a cavernous skills gap. Within the glass pyramids and shiny domes of Bangalore’s tech acropolises, recruiters struggle to find qualified engineers in a country with four times the population of the United States. Large IT firms like Infosys are so desperate for technical talent that they hire history majors on the basis of IQ tests and then put them through five-month courses in computer programming. Every year, more than 20 million Indians turn twenty years old, yet too few receive the foundational education required to fill the several hundred thousand technical jobs that corporations post each year.
In a country brimming with information technology but lacking in basic education, it seemed natural to investigate how personal computers could support learning. So, that was one of the things I focused on when I moved to India in 2004. I hired a team of designers, engineers, and social scientists. We began projects in education, agriculture, health care, governance, microfinance, and so on. For education, we started by spending time in rural India’s government schools. They were blighted by absent teachers, broken toilets, and unquestioning parents.
Desperate administrators often turned to technology as a solution. A startling number of rural schools had computer labs. Because of small budgets, though, the labs were limited to a handful of PCs. Joyojeet Pal, one of our first interns, visited twelve schools across four states and returned with photo after photo of students piled on like rugby players around a single PC.1 There were never enough terminals for all the children. One dominant child – often an upper-caste boy – tended to monopolize the mouse and keyboard while others crowded around, hoping to have a chance to interact.
It was a perfect opportunity for innovation: What if we plugged in multiple mice per computer, each with a corresponding cursor on screen? As with a video game console, many children could engage simultaneously. Udai Singh Pawar, a smart young researcher in my group with a boyish sense of fun, ran with the idea. He quickly prototyped what we called MultiPoint, along with its own educational software.
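For readers who want the mechanics made concrete, the core of the idea can be sketched in a few lines of code. This is purely illustrative, not the actual MultiPoint implementation (which shipped as a Windows SDK): a shared screen keeps one cursor per attached mouse, and each input event is routed by the device it came from.

```python
class MultiPointScreen:
    """Toy model of a shared display where each mouse drives its own cursor."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.cursors = {}  # device_id -> (x, y)

    def attach_mouse(self, device_id):
        # Each newly plugged-in mouse gets a cursor at the screen's center.
        self.cursors[device_id] = (self.width // 2, self.height // 2)

    def handle_move(self, device_id, dx, dy):
        # Update only the cursor belonging to the mouse that moved,
        # clamping it to the screen bounds.
        x, y = self.cursors[device_id]
        self.cursors[device_id] = (
            max(0, min(self.width - 1, x + dx)),
            max(0, min(self.height - 1, y + dy)),
        )

    def handle_click(self, device_id, targets):
        # Hit-test this mouse's cursor against on-screen targets
        # (e.g., answer buttons in a vocabulary drill). Returns the
        # name of the target hit, or None.
        x, y = self.cursors[device_id]
        for name, (left, top, w, h) in targets.items():
            if left <= x < left + w and top <= y < top + h:
                return name
        return None
```

Because clicks carry a device identifier, the educational software can score each child's answers separately even though everyone shares one screen, which is what let many students participate at once.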
Students loved it, and formal experiments confirmed its effectiveness. Pawar verified that for activities like vocabulary drills, students learned just as much with MultiPoint as with a single PC all to themselves.2 One child, enthralled with the prototype, asked, “Why doesn’t every computer come with multiple mice?” We filed a patent, convinced Microsoft to release a free software development kit, and imagined that schools around the world would benefit. We declared victory, and temporarily forgot about the lack of toilets, the silent parents, and the absent teachers.
Projects such as MultiPoint won us awards and recognition. Children inevitably smiled in front of new technology, and politicians loved photo-ops where they handed out new gadgets. I often found myself in teak-paneled boardrooms discussing technology strategy with government ministers, World Bank officials, and nonprofit luminaries. Our research seemed to offer proof that there were technological solutions to developing-world education.
We were not alone. An even bigger splash was made by a nonprofit with an ambitious name that reflected its ambitious plan: One Laptop Per Child. Led by Massachusetts Institute of Technology (MIT) Media Lab founder Nicholas Negroponte, the organization sought to design a $100 laptop that could be sold to developing countries a million units at a time. At the 2005 World Summit on the Information Society, Negroponte shared the stage with United Nations Secretary-General Kofi Annan. They unveiled what looked like a green-and-white Fisher-Price toy that boasted a fully operational PC with kid-friendly software. Annan gave an unabashed endorsement: “These robust and versatile machines will enable kids to become more active in their own learning.”3
Negroponte summed up our credo as technologists: “It’s not a laptop project. It’s an education project.”4 By inventing and disseminating new, low-cost devices for learning, we believed we were improving education for the world’s less privileged children. But were we?
Geeks Bearing Gifts
The success of our MultiPoint trials encouraged us to expand its use, and we went looking for schools that could benefit from it. At private schools funded by wealthier parents or philanthropic donors, principals would lead us to sparkling classrooms with rows of well-kept computers. But those weren’t the schools that needed a boost. Their students would do well with or without MultiPoint. In the schools where help was most needed – where administrators were apathetic or underfunded, where teachers were absent or overloaded, or where students learned little and rarely graduated – it was impossible for MultiPoint to gain a foothold.
One visit I made to a government primary school just outside of Bangalore illustrates why. The headmaster unlocked a large metal cabinet to show me where he kept the school’s personal computers. Inside, desktop PCs, monitors, and keyboards were piled shoulder high, somehow caked in dust even though they weren’t out in the open. He explained that the PCs had been allocated to each school in the district two years before. The equipment had been received with excitement. The headmaster had cleared a room in his spartan cement-block building for a computer lab. Classes visited the lab one after the other, and students, crowding five or six to a PC, found games to play. The teachers, however, complained that the games didn’t follow the curriculum, and in any case, they didn’t know how to incorporate digital tools for teaching. Then, within weeks, the equipment began to fail. Power surges were probably to blame.5 The school had no IT staff, and there was no budget for technical support. Soon after, the machines were locked away, and the computer lab was repurposed. The PCs were just taking up storage space, but the headmaster couldn’t get rid of them. As state assets, they might be subject to inspection.
The situation wasn’t unusual. Many schools had neither staff nor finances for ongoing technical support. Computer budgets in education tend to pay for hardware, software, and infrastructure, but they neglect the ongoing costs of storage, upgrades, troubleshooting, maintenance, and repair. And PCs need a lot of care in the hot, dusty, humid conditions of rural Indian schools.
Meanwhile, teachers who had PCs dumped into their classrooms felt like seafaring captains suddenly asked to pilot a jumbo jet, all while the unruly passengers were given free access to the controls. For teachers already struggling to keep their students engaged, a computer was less help, more hindrance.
In the course of five years, I oversaw at least ten different technology-for-education projects. We explored video-recorded lessons by master teachers; presentation tools that minimized prep time; learning games customizable through simple text editing; inexpensive clickers to poll and track student understanding; software to convert PowerPoint slides into discs for commonly available DVD players; split screens to allow students to work side by side; and on and on.6 Each time, we thought we were addressing a real problem. But while the designs varied, in the end it didn’t matter – technology never made up for a lack of good teachers or good principals. Indifferent administrators didn’t suddenly care more because their schools gained clever gadgets; undertrained teachers didn’t improve just because they could use digital content; and school budgets didn’t expand no matter how many “cost-saving” machines the schools purchased. If anything, these problems were exacerbated by the technology, which brought its own burdens.
These revelations were hard to take. I was a computer scientist, a Microsoft employee, and the head of a group that aimed to find digital solutions for the developing world. I wanted nothing more than to see innovation triumph, just as it always did in the engineering papers I was immersed in. But exactly where the need was greatest, technology seemed unable to make a difference.
Textbooks of the Air
We were hardly the first to think our inventions would transform education. Larry Cuban, a veteran inner-city teacher and an emeritus professor at Stanford, has chronicled the technology fads of the past century. As his examples show, the idea that technology can cure the ills of society is nothing new. As early as 1913, Thomas Edison believed that “the motion picture is destined to revolutionize our educational system.”7 Edison estimated that we only learned 2 percent of the material we read in books, but that we could absorb 100 percent of what we saw on film. He was certain that textbooks were becoming obsolete.
In 1932, Benjamin Darrow, founder of the Ohio School of the Air, made a similar claim for radio. He said that the medium would “bring the world to the classroom . . . [and] make universally available the services of the finest teachers, the inspiration of the greatest leaders.” Radio would be “a vibrant and challenging textbook of the air.”8
In the 1950s and 1960s, it was television. President John F. Kennedy convinced Congress to authorize $32 million for classroom television programs. For a few years, American Samoa based its entire school system on televised instruction. President Lyndon B. Johnson approved. “The world has only a fraction of the teachers it needs,” he said. “Samoa has met this problem through educational television.”9
All of these predictions sound achingly similar to today’s claims for digital technology. If history is a guide, new technologies will be absorbed by schools but will do little in the end to advance education. Audiovisual teaching aids are common in modern classrooms, to be sure, but they have hardly revolutionized learning. It now seems quaint, even silly, to think that a generation hung its educational hopes on the boob tube. Television was supposed to uplift millions. Instead, millions sit in thrall to the Kardashians.
Maybe, though, digital is different. After all, real education involves two-way interaction, while broadcast media is only one-way. Don’t computers, the Internet, and social media offer something that television doesn’t?
Rigorous studies say no. The economist Ana Santiago and her colleagues at the Inter-American Development Bank found no educational advantage in a One Laptop Per Child program in Peru. Three months after an enthusiastic nationwide rollout, the novelty factor had worn off, and each week saw less use of the laptops. Even after fifteen months, students gained nothing in academic achievement.10 Another team of researchers found similar results in Uruguay: “Our findings,” they said, “confirm that the technology alone cannot impact learning.”11
Economist Leigh Linden at the University of Texas at Austin conducted experimental trials in India and Colombia. He found that, on average, students exposed to computer-based instruction learned no more than control groups without computers.12 His conclusion? While PCs can supplement good instruction, they don’t substitute for time with real teachers.
One of our research partners in India was the Azim Premji Foundation, a nonprofit organization that at the time ran the world’s largest program involving computers in education. In 2010, its CEO, Anurag Behar, published a brave article in an Indian affiliate of the Wall Street Journal. Casting doubt on his own group’s work with computer labs in over 15,000 schools, he wrote, “At its best, the fascination with [information and communication technology] as a solution distracts from the real issues. At its worst, ICT is suggested as substitute to solving the real problems.”13
The lessons of a place with circuit-frying electrical supply and no running water might seem irrelevant for those of us who live in the developed world. American schools, though, suffer similar fates with technology. In 2001 and 2002, Mark Warschauer, a professor at the University of California, Irvine, and one of the world’s experts on technology in the classroom, led a study of eight schools in California that spanned rich and poor socioeconomic groups.14 Foreshadowing my experience in India, he found that US schools also had problems maintaining technology and using it meaningfully.
Warschauer heard many teachers complain that computers in the classroom had doubled their workload. Not only did teachers have to design lesson plans involving computers, they also had to write low-tech backup plans in case of technology failures, which were frequent.
Even when it worked, technology wasn’t necessarily being used well. In one class, Warschauer witnessed students typing names of countries into a search engine, clicking on whatever webpages came up, and aimlessly copying snippets of text into word-processing software. He wrote, “Although the students could be said to be performing the task of searching for material on the Web, they were not developing any of the cognitive or information literacy skills that such a task would normally involve.”
Warschauer also found that poorer districts had more difficulty with the equipment. What mattered wasn’t the technology – all of the schools had about the same number of computers per student and similar access to the Internet. But, he wrote, “placing computers and Internet connections in low-[income] schools, in and of itself, does little to address the serious educational challenges faced by these schools. To the extent that an emphasis on provision of equipment draws attention away from other important resources and interventions, such an emphasis can in fact be counterproductive.”15
Other scholars, journalists, and educators have taken a hard look at electronics in the American classroom and found them wanting. In The Dumbest Generation, Emory University professor Mark Bauerlein cites statistic after statistic showing that “digital natives” – millennial children who have never known a life without the Internet – aren’t doing any better in school than their parents did. He rails against the fetish we make of technology: “It superpowers [students’] social impulses, but it blocks intellectual gains.”16 Todd Oppenheimer, in The Flickering Mind, grieves over his visits to computerized schools across the country. All too often, he finds digital education to be about cutting and pasting graphics into PowerPoint.17 The school board president of the Liverpool Central district near Syracuse, New York, Mark Lawson, canceled a disappointing school laptop program after a run of seven years. There “was literally no evidence it had any impact on student achievement – none. . . . The teachers were telling us when there’s a one-to-one relationship between the student and the laptop, the box gets in the way. It’s a distraction to the educational process.”18
In other words, even in America, where infrastructure is reliable and technology is plentiful, computers don’t fix struggling schools.
Nevertheless, when I returned to the United States in 2010, I came home to a country that was on a whole new kick about technologies for education. Secretary of Education Arne Duncan gave talk after talk urging more technology in the classroom. One keynote he delivered in 2012 was indistinguishable from a tech-company sales pitch. In it he mentioned technology forty-three times: “technology is the new platform for learning”; “technology is a powerful force for educational equity”; “technology-driven learning empowers students and gives them control of the content”; “technology . . . provides access to more information through a cell phone than I could find as a child in an entire library.” (In the same speech, he mentioned teachers only twenty-five times.)19
Marc Prensky is the consultant who coined the term “digital natives.” He claimed that “today’s students think and process information fundamentally differently from their predecessors.” Immersed in devices from birth, they are growing up in a new world that their digital-immigrant parents don’t fully understand. His recommendation? We should teach digital natives in the language they were born in: “My own preference for teaching Digital Natives,” he wrote, “is to invent computer games to do the job, even for the most serious content.”20
Egged on by the chorus of support, America is in an orgy of educational technologies despite scarce evidence that they improve learning. In 2013, the Los Angeles Unified School District announced a $1 billion program to distribute iPads to all of its students.21 Donors flock to support the online Khan Academy, where the disembodied voice of Salman Khan accompanies video-recorded blackboard instruction. And MOOCs – massive open online courses – from Harvard, MIT, Stanford, and other universities boast about the millions of people from around the world taking their free classes.
The fever is contagious. Despite everything I learned in India, I wasn’t immune to it. I was once on a panel at MIT with Negroponte where I outlined my hard-won lessons about technology for education. He didn’t like what I said, and he went on the offensive. But he did it with such confidence and self-assurance that, as I listened, I felt myself wanting to be persuaded: Children are naturally curious, aren’t they? Why wouldn’t they teach themselves on a nice, friendly laptop?
As I heard more of the technology hype, however, I realized that it didn’t engage with rigorous evidence. It was empty sloganeering that collapsed under critical thinking.
Take Negroponte’s claim that children are natural learners who will teach themselves with well-designed gadgets. Its subversive edge is part of its charm. Pink Floyd lyrics echo in the mind: “We don’t need no education; we don’t need no thought control.” But even casual observation suggests that the truth is otherwise. When left alone with technology, few children open up educational apps. What they really want is to play Angry Birds. And teenagers are not that different. Los Angeles’s iPad initiative hit an early glitch when older students hacked the tablets’ security software and gained access to games and social media.22
Another highly touted project that doesn’t measure up is called the Hole-in-the-Wall. Its main proponent is Sugata Mitra, a professor of educational technology at Newcastle University. He regales audiences with what he says happened when he embedded a weatherproofed PC in the wall of a New Delhi slum. Without any supervision, local children started using the computer. They taught themselves to open applications, draw pictures, and use the Internet. In later studies, Mitra made the astonishing claim that his brand of “minimally invasive education” allowed children in poor villages to learn English and molecular biology entirely on their own.23 Mitra went on to become an internationally celebrated speaker and won the 2013 TED Prize.
But some who visit Mitra’s Hole-in-the-Wall sites find that they are unused, defunct, or occupied by older boys playing video games.24 Payal Arora, a professor of media and communication at Erasmus University in Rotterdam in the Netherlands, found one village where instead of the reputed computers, there was only a “cemented structure in which there are three gaping holes.” Several years after the computers were installed, “few of the people [in the village], including the students, had any recollection of the project.” One local teacher “recalled a few boys using these kiosks, but ‘usually for things like games, that’s all.’” Confronted with these points, Mitra softened his position, admitting, “It is certainly incorrect to suggest that free access to outdoor-located PCs is all that is involved” in real learning.25
A 2013 study by Robert Fairlie and Jonathan Robinson – economists with no stake in technology – slams a heavy lid on the sarcophagus for the quixotic idea that children will teach themselves digitally. In an experimental trial involving over 1,000 students in grades 6 through 10 in America, they found that students randomly selected to receive laptops for two years certainly spent time on them, but that the time was devoted to games, social networking, and other entertainment. And whatever merit these activities might have in theory, in practice those with laptops did no better “on a host of educational outcomes, including grades, standardized test scores, credits earned, attendance, and disciplinary actions,” than did a control group without computer access at home.26 In other words, unfettered access to technology doesn’t cause learning any more than does unfettered access to textbooks.
Technology advocates ignore studies like this. Instead they prey on parental fears. Secretary Duncan insists that “technological competency is a requirement for entry into the global economy,” hinting that our children will be at a disadvantage if they grow up without computers.27 But do students need to be steeped in technology to be competitive? Duncan himself is proof that they don’t. By his own admission, he grew up in a “technology-challenged household.” Apparently, when he was young, his family didn’t even own a television, to say nothing of a PC.28 Like most of today’s leaders above the age of forty-five, Duncan wasn’t exposed to digital technology when he was young, yet we can be sure that his mother is proud of his accomplishments in the twenty-first century.
Years of data from the world’s most credible educational yardstick show that technology in the classroom has little to do with good scores. The Program for International Student Assessment (PISA) is the Olympics of formal education. Participating countries administer standardized academic achievement tests to their fifteen-year-olds, allowing cross-country comparisons in several subjects. South Korea happens to be both high-tech and high-performing, but Finland and China consistently outperform other countries despite low-tech approaches.29 In a 2010 report, PISA analysts wrote that “the bottom line is that the quality of a school system cannot exceed the quality of its teachers,” regardless of available educational resources such as computers.30
Anyone can learn to tweet. But forming and articulating a cogent argument in any medium requires thinking, writing, and communication skills.31 While those skills are increasingly expressed through text messaging, PowerPoint, and email, they are not taught by them. Similarly, it’s easy to learn to “use” a computer, but the underlying math skills necessary for accounting or engineering require solid preparation that only comes by doing problem sets – readily accomplished with or without a computer. In other words, there’s a big difference between learning the digital tools of modern life (easy to pick up and getting easier by the day, thanks to improving technology) and learning the critical thinking skills necessary for an information age (hard to learn and therefore demanding good adult guidance). If anything, it’s less useful to master the tools of today, because we know there will be different tools tomorrow.
What Wise Parents Know
For about a month in the spring of 2013, I spent my mornings at Lakeside School, a private school in Seattle whose students are the scions of the Pacific Northwest elite. The beautiful red-brick campus looks like an Ivy League college and costs almost as much to attend. The school boasts Bill Gates among its alumni, and there is no dearth of technology. Teachers post assignments on the school’s intranet; classes communicate by email; and every student carries a laptop (required) and a smartphone (not).
In this context, what do parents do when they think their children need an extra boost? I was there as a substitute tutor for a friend whose students spanned the academic spectrum. A few of them were taking honors calculus. They were diligent but wanted a sounding board as they worked on tough problems. Others, sometimes weighed down by intensive extracurricular activities, struggled in geometry and algebra. I would review material with them and offer pointers as they did assignments. Yet another group required no substantive help at all. They just needed some prodding to finish their homework on time. Despite their differences, the students had one thing in common: What their parents were paying for was adult supervision.
All of the content I tutored is available on math websites and in free Khan Academy videos, and every student had round-the-clock Internet access. But even with all that technology, and even at a school with a luxurious 9:1 student-teacher ratio, what their parents wanted for their kids was extra adult guidance. If this is the case for Lakeside students with their many life advantages, imagine how much more it must be the case for the world’s less privileged children.
If the Labors of Hercules had an intellectual equivalent, it would be modern education. By the end of high school, we expect a student to know about 60,000 words; read To Kill a Mockingbird; learn the Pythagorean theorem; absorb a national history; and have peered through a microscope. Advanced students will put on Greek tragedies; rediscover the principles of calculus; memorize the Gettysburg Address; and measure the pull of gravity. In effect, students have twelve years to reconstruct the world’s profoundest thoughts – discoveries that history’s greatest thinkers took centuries to hit upon.
This is not casual play, and it requires directed motivation. It doesn’t matter what flashy interactive graphics exist to teach this material unless a child does the hard internal work to digest it. To persevere, children need guidance and encouragement for all the hours of a school day, at least nine months of the year, sustained over twelve years. Electronic technology is simply not up to the task. What’s worse, it distracts students from the necessary effort with blingy rewards and cognitive candy. The essence of quality children’s education continues to be caring, knowledgeable, adult attention.
Miracle or Mirage?
Yet, it can’t be true that technology never helps education. That doesn’t square with what my team found with the MultiPoint pilot, or, for that matter, with any project where formal trials prove a technology’s value. There is plenty of reliable research in which students with technology gain something over those without.
Indeed, Negroponte is persuasive because he speaks with deep conviction. He’s a true believer. Negroponte’s backstory involves a rural Cambodian village where out of a charitable impulse he handed out laptops to twenty children. When he found both the students and their families making innovative use of them, One Laptop Per Child was born.32
And Professor Warschauer at UC Irvine, no utopian when it comes to technology, found that in some American schools with one-to-one laptop programs, the students became better writers. They wrote more, revised more, and got more frequent feedback from teachers.33
And what about the evidence under my nose? The proof of technology’s value in my own learning as a research scientist? It is thanks to the Internet that I can look up papers without a trip to the library. It is thanks to email that I can stay in touch with colleagues on the other side of the world. And it is thanks to Wikipedia that I can brush up on knowledge long forgotten or never learned.
So in some cases, technology does help – but not with the consistency required to fix larger social problems. The MultiPoint experience was repeated in all of my team’s other projects – in agriculture, in health care, in governance, in entrepreneurship. On the one hand, it was easy to develop innovations that had some merit; on the other hand, those same innovations rarely led to large-scale benefits. How could this be? It was a paradox. I was missing an explanation of how and why machines contribute – or don’t – to social change.
Actually, modern society as a whole lacks a good framework for thinking about technology’s social impact. As children, we learn how our bodies work in biology classes and how our government works through civics lessons. Computer courses, though, only teach us how we can use the devices, not how the devices affect us. As adults, we’re inundated with news about Facebook revolutions in the Arab Spring, long queues for the latest iPhone, and email spying by the National Security Agency. Yet we have no consensus view of the technologies’ net effect.
Toward the end of my five years in India, I had a glimpse of a hypothesis. I knew there was a way to make sense of the apparent contradiction whereby isolated successes weren’t easy to replicate elsewhere. But since I worked at a company whose soul was software, I kept wanting to see the technology prevail. I felt disloyal doubting its value. As Upton Sinclair said, “it is difficult to get a man to understand something, when his salary depends upon his not understanding it.”34 I needed some distance, and I needed some time. So in early 2010 I left Microsoft to join the School of Information at Berkeley. The dean, AnnaLee Saxenian, had arranged a research fellowship for me. At her school, people not only built technologies but also studied how they affected society. Technology’s impact was complex, but I hoped to find a concise way to understand when it was good, when it was bad, and when we could know in advance.