Screens and the Educational Industrial Complex
There is a new Wild, Wild West in education: education technology, predicted to become a $60 billion industry by 2018.1 Yes, that includes things such as smart boards and data systems . . . but the real gold rush, which has attracted the deep pockets of entrepreneurs and tech companies alike, is the tablet—more specifically, the idea of a tablet for every student in America—and the expensive educational software and annual licensing fees that go with it.
To be sure, there is a place for technology in education—and for screens in the classroom. But most education experts agree that tech alone is not the cure to what ails education. And we must be very careful with how and, most importantly, at what age and grade level the screens are rolled out.
Unfortunately, for some just looking to cash out, that hasn’t mattered. As with any gold rush, some of the speculators are more unsavory than others.
* * *
The story of technology in the classroom is a fascinating one.
As a story, it has all of the elements of a real page-turner: greed, corruption, betrayal. However, it’s more than just a story—it’s the real-life betrayal of our children by a combination of greed, incompetence, hubris and ego. In that sense, the story of tech in the classroom reads more like a Greek tragedy.
Let me set the stage before we meet our dramatis personae.
There are educational reformers—also known as edupreneurs—who are selling the false narrative that the current educational system is so broken that only their technology snake oil can fix it. Some of these edupreneurs are driven strictly by a desire for profit; others are driven by ego—the misguided, messianic fervor that they can be “the ones” who can transform education, research and reality be damned.
This mix of ego and greed is driving the education technology juggernaut at the top of its food chain. In the midlevels are school principals and superintendents. Alas, this is an “emperor’s new clothes” phenomenon, in which many who actually know better—who realize that education can’t be fixed by gadgets—stay silent to save their careers. No one likes a dissenting voice.
What, the emperor has no clothes? You mean the millions that we’ve spent on worthless and ineffective devices that are being hacked by students or are sitting in storage rooms have been a total waste? Be silent or be reassigned!
Others are driven by an effort to keep up with neighboring districts in a misguided tech arms race. Westhampton has tablets K-12? Quickly—tablets for everyone in this district! Or, worse yet, some clueless administrators have bought the tech companies’ pitch hook, line and sinker and are blinded by the glow of the shiny new devices. Do they work? Do they help kids become better students? Who cares? Just look at how shiny they are!
In speaking to school administrators at tech-effects workshops, I’ve encountered various versions of all of the above. A select few seem to genuinely get it and are even prepared to halt—or at least slow down—the march of screens into the younger and younger grades. Others, not so much.
My sense is that parents must speak up in a unified voice and ask more questions like: does all of this tech in the classroom actually help my child learn? And, more importantly, can some of these tablets even be hurting my child developmentally and psychologically? Until parents begin to speak up to protect their children, school administrators will be led by tech company Pied Pipers.
And now, I present Act I of Greed in Education.
An Unholy Alliance: Rupert Murdoch and Joel Klein
Joel Klein, former New York City schools chancellor, has become the leading voice in “transforming” our broken educational system via technology. He sees the solution as being a tablet for every student in America, K-12—the digital equivalent of a chicken in every pot—a plan that, conveniently, his education tech company Amplify is ready, willing and able to carry out for every school district in the nation.
Yet for years Klein has been dogged by conflict-of-interest allegations and has been accused of using misleading and erroneous information to claim that the current educational system is more broken than it actually is.2
But just who is Joel Klein? We should know, because this man could very well shape the educational landscape for generations to come—as he becomes a very, very rich man.
Klein had never taught in a classroom or studied education. Before being appointed as chancellor by Mayor Michael Bloomberg in 2002, Klein had been a Harvard-educated lawyer.
He was in private practice before founding his own law firm; then, in the 1990s, he served in the White House Counsel’s office in the administration of Bill Clinton before being appointed as the U.S. assistant attorney general in charge of the Antitrust Division at the Department of Justice. After leaving the Justice Department, he became legal counsel to Bertelsmann, an international media group.
There was not a whiff of any educational bona fides in Klein’s career before he was handpicked to oversee the education of New York City’s 1.1 million students. During his tenure, he spearheaded a series of initiatives, including breaking up larger schools and working with the Gates Foundation to open 43 small high schools. After some initial accolades regarding improved graduation rates, he was accused by New York University professor and education policy analyst Diane Ravitch, among others, of cooking the books in order to obtain those positive outcomes.3 According to the journalist Bob Herbert, Bill Gates later admitted that breaking up those schools was a mistake: “Simply breaking up existing schools into smaller units often did not generate the gains we were hoping for.”4
Klein’s other major accomplishment as chancellor was spending $95 million of New York City taxpayers’ money on a tech albatross: in 2007 Klein oversaw the implementation of ARIS (Achievement Reporting and Innovation System), a data collection and student tracking computer system. ARIS immediately got blasted by critics, teachers and parents as being slow, clunky and largely unutilized. Klein then awarded Internet start-up Wireless Generation a $12 million annual contract to fix and maintain his broken and expensive clunker.
Now this is where it gets good, but you have to follow along closely, as the muddy ethics make the waters a little murky. In 2011 Klein stepped down from his $225,000-a-year position. And why not? He had a better offer; he took a $2 million-a-year job offer from Rupert Murdoch—complete with a $1 million signing bonus—in order to head up Amplify.5 And what is Amplify, you might ask? Amplify is the ed tech company that used to be Wireless Generation—yes, the same company that had gotten the $12 million contract from Klein while he was schools chancellor to fix his broken ARIS data albatross.
That’s right, Klein gave a private company a lucrative public contract to fix a disaster that he created, and then he went to work for—correction: he went to run—that private company, making almost ten times what he had made toiling for the board of ed. But Klein wasn’t paid all that money by Murdoch just to muck around with ARIS and data collection; Murdoch had invested almost a billion dollars in Amplify in pursuit of the educational holy grail—an Amplify tablet (for only $199!) in the hands of every student in America.
A public sector employee cashing out in the private sector? Happens all the time in politics: poor public-servant-as-congressman cashes out as a lobbyist; move along, nothing to see here. Some might even say, God bless him—this is America. Who are we to begrudge a man a chance to reach for the gold? But in education, selling out to the private sector can be problematic. The question that needs to be asked is: is he cashing out at the expense of our kids’ learning and, even more problematically, their well-being?
We know Rupert Murdoch’s motives. Never one to be confused with a saint or a person with unshakable ethics, Murdoch has been known to bend and even break the law in pursuit of profit. Executives at his now-defunct tabloid newspaper News of the World were accused of phone hacking and police bribery. In the ensuing criminal investigation, it was revealed that not only were the phones of celebrities, politicians and members of the British Royal Family hacked, but so were those of murdered schoolgirl Milly Dowler, relatives of deceased British soldiers, and victims of the July 7, 2005, London bombings—all to sell more newspapers.
This paragon of virtue and ethics was now Klein’s new boss in an effort to transform American education. Entrepreneurial rascal that he is, Murdoch had always been keen to exploit new media opportunities, and he had been attempting to cash in on ed tech for some time. Wireless Generation presented the perfect opportunity. Larry Berger had started Wireless Generation in 2000, and by the time Murdoch purchased it in 2010, for $360 million, it had turned into a thriving, 400-employee company that focused on analytics, data and assessment.
But Murdoch wasn’t interested in analytics and data assessment. He saw Amplify as a firm through which he could replace the lucrative textbook market with shiny new tablets—fully loaded with expensive educational software.
This was now possible because a couple of key changes in the education world had made the field very attractive to entrepreneurial gunslingers. In the old days, McGraw-Hill, Houghton Mifflin Harcourt, and Pearson ruled the $7.8 billion textbook and curriculum development market. But textbooks and curricula had to be customized to meet individual state standards—a very expensive and time-consuming endeavor.
Then, in 2010, came the development that would change everything—and make the whole education field finger-licking good for profit-motivated, ethically challenged entrepreneurs like Murdoch: the Common Core State Standards, aka the Common Core.
The Common Core created a set of curriculum and textbook standards that were adopted by 45 states. There was no need to muck around with different standards for smaller markets like, say, Alabama. Now, a company that could create K-12 curricula that adhered to the Common Core requirements could sell its materials across the country. Better yet, a company that could create a tablet that could be programmed with all of this new Common Core goodness would make textbooks obsolete—and all with annual licensing fees. Ka-ching!
But Murdoch needed a good front man; the King of Fleet Street couldn’t very well be seen as the man who could transform American education. Along came Klein—a bargain at only $2 million a year. By rebranding Wireless Generation as Amplify and hiring Klein, Murdoch had found the high-profile “education expert” he needed to shill his new tablet-based education company.
The company was divided into three divisions: Amplify Learning, which was to develop and provide Common Core–based curricula for K-12; Amplify Insight, which was to provide analytics and data assessment; and Amplify Access, which was to sell a customized Android tablet with a 10-inch Gorilla Glass screen.
But with Klein at the helm, things got off to a rough start.
The poor, taken-advantage-of New York City Department of Education (DOE) had finally had enough and decided to cut its losses and scrap the whole $95 million ARIS disaster. According to a DOE spokesperson: “The Education Department has decided to end our contract with Amplify as a result of the extremely high cost of the ARIS system, its limited functionality, and the lack of demand from parents and staff.”
A letter from the office of Thomas P. DiNapoli, the New York State comptroller, also pointed toward Murdoch and the phone-hacking scandal as part of the reason to shed Amplify: “In light of the significant ongoing investigations and continuing revelations with respect to News Corporation, we are returning the contract with Wireless Generation unapproved.”
“Good news they’re junking it,” said Arthur Goldstein, an English teacher at Francis Lewis High School in Queens, in an interview with the New York Daily News. “They spent $95 million on that thing and my kids are in trailers. What they did with that money is criminal.” Sure, the kids, parents, teachers and taxpayers had to take a hit, but Klein got to keep his $2 million salary and set his eyes on the bigger prize: tablets for all.
Amplify went to work hiring hundreds of the best twenty-somethings to develop their tablets and software. Let’s not forget that kids simply cannot pay attention to something unless it’s a video game, so hundreds of video game designers were hired to “game” the educational software—game points for everybody! Meanwhile, dozens of “product tester” kids were hired (and paid weekly in $100 Amazon.com Gift Cards) to test drive the new edu-games.
The Amplify mission statement was, “Amplify is reimagining the way teachers teach and students learn.” They sure were. But not everyone was crazy about a video game classroom.
Douglas Clark, an associate professor at Vanderbilt University’s Peabody College, one of the top schools of education in the country, was bothered by this gaming approach. As he told Travis Andrews for the Web site Mashable, points are extrinsic motivations, and “when [kids] get bored with extrinsic, they stop.”
Even more problematically, as we explored earlier in this book, video games can be dopaminergic and addicting, and as with educational tools like The Oregon Trail, the child tends to focus on the points-accumulation aspect rather than the educational content.
But beyond video games, the bigger question was: does any of this even work? Is there any research showing that these expensive new screen gadgets are educationally beneficial? Some supporters will point to studies that indicate increased pattern recognition and spatial awareness, as well as some increased word retention, with the use of iPads and tablets, but many other education researchers believe that those positive outcomes are greatly overstated. Yet even if we concede that there may be a beneficial pattern-recognition or word-retention effect, do those effects lead to better educational outcomes—do they lead to students’ becoming better learners?
The more comprehensive research doesn’t bear that out.
In fact, the research on technology is clear: an exhaustive 2012 meta-analysis, which systematically reviewed 48 studies that examined technology’s impact on learning, found that “technology-based interventions tend to produce just slightly lower levels of improvement when compared with other researched interventions and approaches [emphasis mine].”6
Whatever minimal gains were shown couldn’t be causally linked to tech. Instead, the study concluded that technology can be a useful tool in already effective schools with effective teachers—but, in and of itself, tech was not the educational panacea.
“[I]t is not whether technology is used (or not) which makes the difference, but how well the technology is used to support teaching and learning,” the researchers wrote, concluding: “Taken together, the correlational and experimental evidence does not offer a convincing case for the general impact of digital technology on learning outcomes.”
That idea is echoed by Greg Anrig, author of Beyond the Education Wars (2013): “None of these studies identify technology as decisive.” Anrig also points to the importance of good teachers collaborating with students and administrators as key to successful student outcomes.
Dr. Kentaro Toyama, an associate professor at the University of Michigan’s School of Information and a fellow of the Dalai Lama Center for Ethics and Transformative Values at MIT, came to a similar conclusion. No Luddite, he had received his Ph.D. in computer science from Yale and had moved to India in 2004 to help found a new research lab for Microsoft. While there, he became interested in how computers, mobile phones, and other tech could help educate India’s billion-plus population and aid in learning.
While he had hoped to find that tech could solve many of education’s problems, he came instead to understand what he began to think of as technology’s “Law of Amplification.” Unlike Klein’s Amplify, Dr. Toyama saw that technology “amplified,” all right—but not always in a good way. He found that technology can help education where it’s already doing well, but it does little for mediocre educational systems, and, worse, in dysfunctional schools it “can cause outright harm.”
The main problem, according to Dr. Toyama, is that technology does not address fundamental issues of student motivation. Without that key human ingredient, all the shiny tech is meaningless.
As Dr. Toyama writes in his commentary, “Why Technology Will Never Fix Education,” which appeared in the May 19, 2015, issue of the Chronicle of Higher Education: “One problem is a widespread impression that Silicon Valley innovations are good for society. We confuse business success with social value, though the two often differ.” He adds: “Any idea that more technology in and of itself cures social ills is obviously flawed. . . . Unfortunately, there is no technological fix, and that is perhaps the hardest lesson of amplification. More technology only magnifies socioeconomic disparities, and the only way to avoid that is non-technological.”
Even as far back as 1983, educators understood that teaching was more important than the medium. In a sort of inverse corollary to Marshall McLuhan’s “the medium is the message,” research by Richard Clark showed that pedagogy—and not the method of delivery—was all-important. He said that instructional media that delivered the educational content were “mere vehicles that deliver instruction but do not influence student achievement any more than the truck that delivers our groceries causes changes in our nutrition.”7
The well-respected Alliance for Childhood, a consortium of some of the nation’s top educators and professors, put out a report back in 2000, “Fool’s Gold: A Critical Look at Computers in Childhood,” that also shared a skeptical view of technology in the classroom. They concluded: “School reform is a social challenge, not a technological problem . . . a high-tech agenda for children seems likely to erode our most precious long-term intellectual reserves—our children’s minds.”
Dr. Patricia Greenfield, distinguished professor of psychology at UCLA, agrees. A January 2009 article for UCLA Newsroom, “Is Technology Producing a Decline in Critical Thinking and Analysis?,” states that Greenfield analyzed more than 50 studies on learning and concluded that “technology is not a panacea in education, because of the skills that are being lost.” She points out that reading for pleasure among young people has decreased in recent decades, which is problematic because “studies show that reading develops imagination, induction, reflection and critical thinking, as well as vocabulary . . . in a way that visual media such as video games and television do not.”
She is also opposed to Internet-wired classrooms, citing one study in which students who were given access to the Internet during class and were encouraged to use it during lectures subsequently did not process what the speaker said as well as those students who did not have Internet access. Indeed, the Internet-connected students did more poorly on tests after the class lecture. Dr. Greenfield concludes by unequivocally stating, “Wiring classrooms for Internet access does not enhance learning.”
There has also been some surprising research out of Canada countering the narrative that kids prefer e-learning over traditional education.8 A study conducted by the Canadian Higher Education Strategy Associates on 1,289 college undergraduates found that students actually had a preference for “ordinary, real-life lessons” rather than e-learning or the use of technology. Those results surprised the researchers: “It is not the portrait that we expected, whereby students would embrace anything that happens on a more highly technological level. On the contrary—they really seem to like access to human interaction, a smart person at the front of the classroom.”
Imagine that. Is it possible that we are actually projecting our own infatuation with shiny tech and gadgetry and just assuming that our little digital natives—little Johnny and Suzy—would prefer to learn that way when they actually crave human contact and teaching? The Canadian study would seem to bear that out.
Teaching preferences aside, we are also left with the educational consensus that the high-tech classroom simply isn’t producing better student outcomes. Leonie Haimson, executive director of Class Size Matters, a nonprofit advocating smaller class sizes, puts it more bluntly: “There’s absolutely no evidence showing online learning works, especially K through 12.”
In fact, she thinks it’s actually detrimental. “This trend is likely to undermine education,” she says. “Somehow, [people believe] the idea that putting kids on tablets or computers and giving them software programs to work on is personalizing learning rather than depersonalizing it.” She also points to the profit motive: “Murdoch wants to make money off of public education, so it’s no surprise that Amplify is pushing forward with no evidence that this works. My concern is this is taking money away from proven reforms.”
There was also one other very important concern raised about Rupert Murdoch’s ownership of a company that sought to create all of the educational content that an entire generation of students would use: would his political ideology shape or influence academic curricula?
As the owner of conservative media outlets such as Fox News, Murdoch is famous in part for his political leanings. In an article in Forbes magazine titled “Conflict of Interest Behind News Corp Tablet” (2013), technology commentator Roger Kay speculated that Murdoch could potentially use ed tech as another media market to spread his political gospel to kids:
“From my point of view, the problem with [Murdoch’s] News Corp. being in this business is that it creates a channel to our youngest, most vulnerable minds for a guy with extreme politics and highly questionable ethics.”
Yes, curricula would have to adhere to Common Core guidelines, but as most news readers know, “objective” news isn’t always “fair and balanced.” News—and academic content—can be shaped by editorial bias.
Kay concludes by saying: “I don’t know about you, but I don’t want those guys anywhere near the controls of a conduit that funnels ‘learning materials’ to my kids. . . . School systems should be very wary of buying anything from this source.”
All of the concerns that I’ve just cited are the views of education researchers, educators, education experts and even technology experts.
But none of that deterred Murdoch’s lawyer-for-hire, Joel Klein. Like a modern-day P. T. Barnum, he went on a media barnstorming campaign, carnival barking in his shrillest voice about how his magic tablets would “transform” the broken educational system.
In a 2013 New York Times interview (“No Child Left Untableted”), Klein repeats this mantra, glowingly talking about the marvels of Amplify’s tablets and saying that education is “ripe for disruption.” Meanwhile, the article’s author, Carlo Rotella, director of American Studies at Boston College, wryly notes: “Entrepreneurs sound boldly unconventional when they talk about disrupting an industry, but they also sound as if they’re willing to break something in order to fix it—or just profit from it.”
Klein then goes on to make several sweeping statements about the dire state of American education. He needs to make this point convincingly before he can persuade the public to buy his cure. As Richard Rothstein, former national education columnist for the New York Times, says in an article he wrote to rebut Klein: “The assertion by school reformers—that their treatments are necessary because the patient is dying” is central to “a belief that public education needs to be transformed by the technology he is selling.”
So the first part is to convince everyone that education is on life support, and the second part should be: my cure works. Klein misses on both counts; that’s not to say that public education can’t be improved, but it’s not as broken as Klein claims. And like many experimental “cures,” this one just might kill the patient. At the very least, it will only widen the achievement gap, because, as we have seen in the London School of Economics cell phone study and Dr. Toyama’s Law of Amplification, marginalized students and poor schools suffer the most when distracting technology is allowed in the classroom.
So is the patient dying?
It is, according to Klein. In a September 2013 New York Times Magazine interview, Klein says: “K-12 isn’t working . . . and we have to change the way we do it . . . Between 1970 and 2010 we doubled the amount of money we spent on education and the number of adults in the schools, but the results are just not there. Any system that poured in as much money as we did and made as little progress has a real problem. We keep trying to fix it by doing the same thing, only a little different and better. This [tablet-based instruction] is about a lot different and better. . . . We’ve spent so much on things that haven’t worked.” He then made a list of failed solutions, including underused computers, obsolete textbooks, useless layers of bureaucracy and smaller class sizes.
Richard Rothstein counters these exaggerated or misleading claims in the Washington Post. Yes, money spent on education has doubled since 1970, but half that amount has gone to providing education services to disabled and special needs children—kids who in 1970 were not acknowledged as being entitled to free public education. Rothstein claims: “It is foolish, as Mr. Klein in effect does, to claim that because we are now spending so much money on children with disabilities, schools must be failing because the spending has not caused the achievement of regular students to improve.”
Even more importantly, Rothstein claims that Klein is wrong that achievement hasn’t improved since 1970: “Our only sources of information about trends in academic achievement are two sampled tests sponsored by the federal government, the National Assessment of Educational Progress. One . . . shows that academic achievement for black children has improved so much that black fourth graders nationwide now have average basic skills proficiency in math that is greater than that of white fourth graders in 1970.
“The other, a test requiring original computations and written answers, shows the average academic achievement of black fourth graders to be greater than that of white fourth graders in 1990. Improvements have also been substantial in reading, and for eighth graders. White students have improved as well, so the black-white test score gap has not changed very much, narrowing only to the extent that black achievement has been rising faster than white achievement.”
Rothstein also points out that there has been improvement at the high school level as well, noting that in the last four decades, the share of young adults who graduate from both high school and college has doubled.
With regard to “failed” interventions such as smaller class sizes, Rothstein says: “This, too, is only the incantation of conventional wisdom, but is not what the research shows. The only scientifically credible study of class size reduction, an experiment conducted in Tennessee 20 years ago, found that smaller classes were of particular benefit to disadvantaged children in the early grades . . .”
He concludes by saying: “Of course, like any institution, public education should be improved. We should be able to do much better. But some, perhaps many of the things American schools have been doing have turned out to be quite successful. By making a blanket charge of failure and proposing to overturn the entire enterprise, whether in favor of tablet-based instruction, charter schools . . . or private school vouchers, the reformers may well be destroying much of what has worked in favor of untested fads.”
Interestingly, Klein often brings up his own overstated biography of a poor-kid-from-Queens-housing-projects-makes-good-thanks-to-great-teachers as further “evidence” that things are broken in New York City public schools. According to Klein, great public school teachers were responsible for his path to success. Yet he implies that today’s disadvantaged kids fail because those opportunities no longer exist, as that once-glorious public school system full of wonderful teachers has crumbled and now exists only in memory.
The solution? A tablet in every pot.
Several people have asserted that Klein exaggerated the conditions of an essentially middle-class upbringing (assertions I agree with, having grown up less than ten blocks from where he was raised). Regardless, the New York City public education system that was good enough to get him into Columbia and then Harvard has not changed that radically.
Eighteen years after he did, I also made it from New York City public school to Ivy League campus. Today, 30 years later, thousands of kids pull off that same trick every year. Yes, of course it’s an imperfect system that needs plenty of work. But Klein needs us to believe that the entire system is broken beyond repair—that it’s “ripe for disruption”—in order to sell us his digital cure.
But the Amplify story does not end there—there is an interesting epilogue.
Amplify failed. The company, never able to sell as many tablets as it envisioned, subsequently bled money. After losing over $371 million in 2015 alone—not to mention the $1 billion that he had invested since 2010—Murdoch decided to cut his losses and put the whole thing up for sale.
Finally, after laying off two-thirds of its staff—approximately 800 employees—the sputtering entity was sold in October 2015 to 11 Amplify executives, including Joel Klein. The terms of the sale were not disclosed.9
But in an interesting restructuring, original Wireless Generation founder Larry Berger took over as CEO, and Joel Klein was kicked upstairs to the board of directors. They’re getting back to basics: they are retaining the curriculum (Amplify Learning) and the analytics and data assessment (Amplify Insight) arms of the business. And the tablet? In the garbage. The failed Amplify Access has been discontinued.
As educational tech consultant Doug Levin said in Education Week, Murdoch and Klein’s foray into the K-12 marketplace “was another example in a long history of education entrepreneurs who have crashed on the rocks because the market was not what they thought it would be.”
And now, I present Act II of Greed in Education—the West Coast Version.
The Los Angeles School District and the $1.3 Billion iPad Fiasco
While the largest school district in the United States did its best to fend off Joel Klein, Rupert Murdoch and the invasion of the glowing screens, the second-largest district didn’t fare so well. It succumbed to the glow—to the tune of 1.3 billion wasted dollars.
This West Coast Version of Greed in Education has served as a cautionary tale throughout the land. It even led to this colorful headline on Mashable: “L.A.’s ‘iPad for Every Student Program’ is a Complete Sh*t Show” (April 17, 2015).
Where to begin?
Superintendent John Deasy thought that it would be a great idea to have every single student in Los Angeles Unified School District—all 650,000—get an iPad loaded with educational software goodies from Pearson, one of the country’s biggest educational publishers. All for the low, low cost of $1.3 billion.
School officials and tech advocates who pushed for this framed the debate, obscenely, as a civil rights issue—using those actual words: “This is a civil rights issue. My goal is to provide youth in poverty with tools that heretofore only rich kids have had. And I’d like to do that as quickly as possible,” Superintendent Deasy said in a promotional video that he made, interestingly, for Apple in 2011.
Move over, Rosa Parks—an iPad needs to sit next to you.
With messianic fervor, Deasy said that the tablets would lead to “huge leaps in what’s possible for students” and would “phenomenally . . . change the landscape of education.”
Deasy wasn’t alone in embracing this misguided idea that student access to the Internet had somehow become a right as inalienable as the right to life, liberty and the pursuit of happiness. In a June 2010 Boston Globe article, the writer Rebecca Tuhus-Dubrow not only discussed Internet access as a basic human right, but even suggested what the government’s role should be in securing that “right”:
“Increasingly, activists, analysts, and government officials are arguing that Internet access has become so essential to participation in society—to finding jobs and housing, to civic engagement, even to health—that it should be seen as a right, a basic prerogative of all citizens. And in cases where people don’t have access, whether because they can’t afford it or the infrastructure is not in place, the government should have the power—and perhaps the duty—to fix that.”
Predictably, media executives who may have had a financial interest in proclaiming Internet access a human right were quick to agree: “Access to the internet is akin to a civil rights issue for the twenty-first century. It’s that access that enables people in poorer areas to equalize access to a quality education, quality health care and vocational opportunities,” was the noble social-justice perspective of Comcast senior vice president David Cohen.
One man who does not think that access to the Internet is a civil right is the man who invented it. No, I’m not talking about Al Gore. I’m talking about Dr. Vinton G. Cerf, a legendary engineering pioneer widely known as one of the “fathers of the Internet.” Cerf was the co-designer of the TCP/IP protocols and architecture of the Internet; in December 1997 President Bill Clinton presented him with the U.S. National Medal of Technology, and in 2005 he was given the Presidential Medal of Freedom by President George W. Bush for his work in helping to create the Internet.
In a January 4, 2012, New York Times op-ed article titled “Internet Access Is Not a Human Right,” Cerf had this to say on whether or not access to his progeny—his invention—was indeed a right:
“That argument, however well meaning, misses a larger point: technology is an enabler of rights, not a right itself. There is a high bar for something to be considered a human right. Loosely put, it must be among the things we humans need in order to lead healthy, meaningful lives like freedom from torture or freedom of conscience. It is a mistake to place any particular technology in this exalted category, since over time we will wind up valuing the wrong things. For example, at one time if you didn’t have a horse, it was hard to make a living. But the important right in that case was the right to make a living, not the right to a horse.”
His point is well taken. “Things,” like tablets, cars or, as he wryly notes, horses, are not human or civil rights. No technology is.
But Superintendent Deasy was very passionate about the “right” to have an iPad and spread his zeal to an agreeable school board, which voted for the plan to give every student an iPad. The district had estimated that it would cost about $500 million to obtain more than 600,000 tablets and the accompanying software and an additional $800 million to install wireless Internet and other infrastructure at more than 1,000 schools and offices. Unfortunately, the cash-strapped district didn’t have that kind of money lying around—so it had to sell public bonds in order to raise the money.
In hindsight, the board members think that they may have voted too hastily. A September 4, 2014, Los Angeles Times article quoted several board members as saying that they should have asked tougher questions early on and had been too quick to defer to their “crusading superintendent” and to a mission they strongly believed in—closing the technological gap between Los Angeles’ poor students and their wealthier peers.
“The notion of the constantly ticking inequity clock” fueled the fervor of iPads for all, school board member Steve Zimmer said. “It’s my job to balance that urgency with scrutiny. And never have I failed more at that balance.”
So how does this story end?
With an FBI investigation and $1.3 billion spent on a dysfunctional disaster. The Pearson platform had an incomplete curriculum that was essentially worthless, and the tablets themselves were easily hacked within weeks by students who bypassed the feeble security restrictions and were able to freely surf the Internet—video games and porn for everyone!
The whole deal was killed in December 2014—the day after the FBI seized 20 boxes of documents from the district’s business office as part of its investigation into the contract with Apple.
Under scrutiny were the bidding process and the close relationship between Superintendent John Deasy—who resigned abruptly under pressure in October 2014—and Apple and Pearson executives, the beneficiaries of the mammoth contract. The deal is also currently being investigated by a federal grand jury.10
Where did this all go wrong?
To answer that, we need to go back to the beginning.
John Deasy was hired as superintendent in 2011 and was determined to make a difference. By most accounts, he was passionate and sincere about his desire to make the Los Angeles school district better and to help level the student playing field, given what he had perceived as an achievement gap.
To be sure, the district he inherited was in crisis: thousands of teachers, counselors and librarians had lost their jobs during the recession; fewer than half of the students were reading at grade level, and more than 10,000 students were dropping out of high school annually.
Deasy made no apologies—and ruffled some feathers—as a reformer who was going to fix a very complicated mess in order to make things better for students: “I’m not going to be interested in looking at third graders and saying, ‘Sorry, this is the year you don’t learn to read,’ or to juniors and saying, ‘You don’t get to graduate,’” he told Los Angeles Public Radio station KPCC in 2012. “So the pace needs to be quick, and we make no apologies for that.”11
He was clearly a man on a mission. Unfortunately, he picked the wrong mission.
I asked my friend Dr. Pedro Noguera, who knew Deasy professionally, what he thought of him. Pedro is one of the most respected voices in American education; he’s been tenured at Berkeley, Harvard and NYU and is currently Distinguished Professor of Education at UCLA. He is one of the most thoughtful, caring people and educators you will ever meet. Pedro told me: “John Deasy is a good man—he tried to make a positive difference; the teachers union wasn’t thrilled with him because he had no patience for the union. But he tried to do what he thought was best for the kids.”
Unlike Joel Klein and Amplify in New York, Deasy hadn’t sold out to corporate overlords. Nonetheless, as the emails exchanged among Deasy, Pearson and Apple make clear, Deasy was enthralled by the prospect of working with the tech giants; they, in turn, seemed only too eager to financially exploit his enthusiasm. But there has been no accusation or insinuation that Deasy personally profited from the deal. What is apparent is that Deasy was a zealot who believed in tech as the cure and in himself as “the one” who would transform the broken Los Angeles public school system—and that he would go to any lengths to realize his vision.
The essence of the FBI investigation is that dozens of meetings, conversations and email exchanges with Pearson and Apple had occurred beginning nearly a year before Los Angeles Unified officially put the project out to bid. Eventually, 19 other bids were also submitted. Apple and Pearson, although not initially the lowest bidders (as finalists, they were allowed to rejigger and lower their bid), won the lucrative contract on June 24, 2013.
And the epilogue?
Nearly two years later, the program is dead. Deasy has resigned amid scandal. The FBI investigation is ongoing. And the Securities and Exchange Commission (SEC) has gotten involved, having recently questioned school district officials as part of an informal inquiry into whether they properly used bond funds for the disastrous $1.3 billion project.
The sad reality is that companies like Apple and Pearson are profit-driven entities whose mission statement is to increase the bottom line. I think we all understand that this is America, and that companies should be allowed to make profits, but they shouldn’t do so at the expense of children’s well-being. There should be extra scrutiny and vetting before schools get in bed with for-profit companies, because, unfortunately, those companies don’t always have the best interests of the kids in mind.
An example: two executives of Houghton Mifflin Harcourt—one of the Big Three in educational publishing—were recently recorded on hidden camera by conservative activist James O’Keefe of Project Veritas, a nonprofit that investigates public- and private-sector misconduct and fraud. In the hidden-cam videos, the cynical executives are caught discussing the Common Core and their concern—or lack thereof—for what’s best for kids.12
“You don’t think that the educational publishing companies are in it for education, do you? No. They’re in it for the money,” Dianne Barrow, the West Coast accounts manager for Houghton Mifflin Harcourt, said on camera. After explaining that the Common Core is overwhelmingly profit-driven, Barrow went on to say, “I hate kids. I’m in it to sell books. Don’t even kid yourself for a heartbeat,” before breaking into hysterical laughter.
Another cynical Houghton Mifflin executive, Strategic Account Manager Amelia Petties, had this to say to the hidden camera about the Common Core: “Common Core is not new. We’re calling it Common Core, woo hoo! Call it Common Core . . . there’s always money in it because kids are great but it’s not always about the kids.” She pauses, then says, “It’s never about the kids,” as she, too, breaks out into loud, cackling laughter.
Petties even suggests that the name Common Core should be changed, because that could increase new sales and marketing opportunities: “Slapping a new name on it, which in my case I hope they do . . . then I could sell a shit ton of training around whatever you’re calling it.”
Regardless of whether you find these comments shocking or business-as-usual, would you want your children’s educational experience manipulated by private, profit-driven companies that demonstrate such contempt for kids and their education?
Meanwhile, back in Los Angeles, they want their money back. Los Angeles Unified general counsel David Holmquist sent a letter to Apple demanding that it stop any delivery of Pearson software and vowed to seek reimbursement for math and reading materials students have been unable to use. The vast majority of students still can’t access Pearson material on their iPads, Holmquist said.
Ah, but the little rascals can play Call of Duty and Grand Theft Auto on their security-bypassed iPads until the cows come home—the tablet-as-civil-rights movement in action.
Critics claim that the move toward tablets and tech should have been implemented more slowly, with a smaller rollout. Perhaps. Meanwhile, back in Silicon Valley, Google and Apple engineers continue to send their little ones to local no-tech, no-tablet Waldorf schools.
Go figure.
Educational Lessons from Down Under
Sydney Grammar is one of Australia’s top-performing schools. Founded in 1854, it enrolls over 1,100 male students from pre-K through 12, the sons of Sydney’s business and political elite, who routinely place in the top 1 percent of Australian students in university entry scores each year. Counting three former prime ministers among its alumni, the historic school is well funded, with annual tuition of over $34,000, and employs some of the finest educators and administrators in all of Australia.
And, shockingly to some, this standard-bearer of elite education has decided to scrap technology and has done away with laptops in the classroom. According to its headmaster, Dr. John Vallance, the devices “distract” from teaching; he has described the billions of dollars spent on computers in Australian schools over the past seven years as a “scandalous waste of money.”13
Dr. Vallance is no education slouch; he’s a Cambridge scholar, a trustee of the State Library of NSW Foundation, a director of the National Art School and has been headmaster of Sydney Grammar for 18 years. Indeed, in 2014 the Coalition government appointed him as a special reviewer of the national arts curriculum.
This seasoned educator has taken a decidedly sour outlook on the role of technology in the classroom: “I’ve seen so many schools with limited budgets spending a disproportionate amount of their money on technology that doesn’t really bring any measurable, or non-measurable, benefits,” he said. “Schools have spent hundreds and hundreds of millions of dollars on interactive whiteboards, digital projectors, and now they’re all being jettisoned.”
Further, Dr. Vallance said in a March 26, 2016, interview in The Australian, the $2.4 billion spent by the Australian government on the “Digital Education Revolution,” which used taxpayer monies to buy laptops for high school students, “didn’t really do anything except enrich Microsoft and Hewlett Packard and Apple,” adding, “they’ve got very powerful lobby influence in the educational community.”
Thus Sydney Grammar has banned students from bringing laptops to school and requires them to handwrite assignments and essays until Year 10. The students have access to computers in the school computer lab, but Dr. Vallance regards laptops in the classroom as a distraction: “We find that having laptops or iPads in the classroom inhibits conversation—it’s distracting.”
Dr. Vallance believes that “if you’re lucky enough to have a good teacher and a motivating group of classmates, it would seem a waste to introduce anything that’s going to be a distraction from the benefits that kind of social context will give you.” He added, “We see teaching as fundamentally a social activity. It’s about interaction between people, about discussion, about conversation,” and he thinks that computers in the classroom have robbed children of the chance to debate and discuss ideas with their teacher.
He also feels that laptops have led to less rigor in the classroom and have taken away from teacher preparation, saying that laptops “introduced a great deal of slackness” in teaching and made it much easier to give the illusion of having prepared a lesson.
He also inherently believes in the educational benefit of learning to write by hand: “Allowing children to lose that capacity to express themselves by writing is a very dangerous thing.” He said that Sydney Grammar had been studying the difference between handwritten and computer-typed tasks among boys in Year 3 and Year 5. “In creative writing tasks, they find it much easier to write by hand, to put their ideas down on a piece of paper, than they do with a keyboard.”
Aware that he’d be criticized as out of step and anti-technology, Dr. Vallance said he was sure people would call him a “dinosaur,” but responded by saying, “I’m in no way anti-technology. I love gadgets. It’s partly because we all love gadgets so much that we have these rules, otherwise we’d all just muck about. Technology is a servant, not a master. You can’t end up allowing the tail to wag the dog, which I think it is at the moment.”
Dr. Vallance called it a “really scandalous situation” that Australia was “spending more on education than ever before and the results are gradually getting worse and worse,” adding that he preferred to spend on teaching staff rather than on technology. “They end up being massive lines in the budgets of schools which at the same time have leaky toilets and roofs and ramshackle buildings. If I had a choice between filling a classroom with laptops or hiring another teacher, I’d take the other teacher every day of the week.”
The internationally respected Organization for Economic Co-operation and Development (OECD) has also chimed in and questioned the growing reliance on technology in schools. In a 2015 report, it said schools must give students a solid foundation in reading, writing and math before introducing computers. Indeed, it found that heavy users of computers in the classroom “do a lot worse in most learning outcomes” and concluded: “In the end, technology can amplify great teaching, but great technology cannot replace poor teaching.”
Dr. Vallance takes an even more cynical view: “I think when people come to write the history of this period in education . . . this investment in classroom technology is going to be seen as a huge fraud.”
Reading Effects: Screens vs. Paper
As far as education and screens in the classroom go, there is also the issue of the comprehension differences between reading something on a radiant screen rather than on paper.
In a study called “Reading Linear Texts on Paper Versus Computer Screen: Effects on Reading Comprehension,” published in January 2013 in the International Journal of Educational Research, Professor Anne Mangen of the University of Stavanger in Norway found that students who read text on computers performed worse on comprehension tests than students who read the same text on paper.14
Mangen and her colleagues had asked 72 tenth-grade students of similar reading ability to study one narrative and one expository text, each about 1,500 words in length. Half the students read the texts on paper and the other half read them as PDF files on computer screens. Afterward, students completed reading-comprehension tests consisting of multiple-choice and short-answer questions, during which they had access to the texts.
While Joseph Chilton Pearce has attributed the decreased-comprehension effects of screens to the problematic way that the brain processes radiant light, Mangen thinks that students reading on computer screens had a more difficult time finding particular information when referencing the texts because they could only scroll or click through the PDFs one section at a time. The students reading on paper, by contrast, could hold the whole text in their hands and switch between different pages.
Mangen surmised: “The ease with which you can find out the beginning, end and everything in between and the constant connection to your path, your progress in the text, might be some way of making it less taxing cognitively, so you have more free capacity for comprehension.”
This notion that reading, far from being a static affair, is instead a journey over a word landscape is echoed in a 2013 Scientific American article called “The Reading Brain in the Digital Age: The Science of Paper Versus Screens.”15
“There is physicality in reading,” developmental psychologist and cognitive scientist Maryanne Wolf of Tufts University is quoted as saying in the article, “maybe even more than we want to think about as we lurch into digital reading—as we move forward perhaps with too little reflection.”
From an evolutionary perspective, writing is a relatively new phenomenon. Thus, as far as our brains are concerned, text is a tangible part of our physical world. Indeed, early writing, such as Sumerian cuneiform or Egyptian hieroglyphics, began as pictorial representations shaped like the objects they represented. Even in our modern alphabet, we see traces of these pictorial roots: C as crescent moon, S as snake.
The article points out that beyond just treating individual letters as physical objects, the human brain may also perceive a text, as Mangen suggests, in its entirety as a kind of physical landscape. In that sense, paper books present a more obvious landscape, with a much more pronounced topography, than onscreen texts.
An open book presents two clearly defined domains—the left and right pages—and a total of eight corners with which readers can orient themselves. In addition, the book traveler can physically see where the book begins and ends and where a particular page is in relation to those points. Finally, the reader can gauge by the thickness of the pages how much has been read/traveled and how much of the journey remains. These are reassuring physical markers that can help the reader form a coherent mental map.
In contrast, most screens lack all of that and thus inhibit people from mapping the journey in their minds. A reader of screen text might scroll through a stream of words, but it is extremely difficult to see any one passage in the context of the entire text. Even though e-readers like the Kindle and tablets like the iPad can re-create pagination, the screen only displays a single page at a time—the rest of the word landscape remains out of view and out of physical touch.
And that matters.
“The implicit feel of where you are in a physical book turns out to be more important than we realized,” says Abigail Sellen of Microsoft Research Cambridge, in England, and co-author of The Myth of the Paperless Office. “Only when you get an e-book do you start to miss it. I don’t think e-book manufacturers have thought enough about how you might visualize where you are in a book.”
More Screens, Less Eye Contact
Apart from screens and reading deficits, other education experts also point out the potential adverse social effects: “Major concerns are focused on the impact of computers on the children’s social and emotional development.”
According to a report by Colleen Cordes and Edward Miller (2000): “Children between the ages of 10 and 17 today will experience nearly one-third fewer face-to-face interactions with other people throughout their lifetimes as a result of their increasingly electronic culture, at home and in school.”16 Keep in mind that the one-third estimate is from 16 years ago; what might it be today?
Whatever happened to eye contact? It’s gone way down, thanks to our screen culture. A Wall Street Journal article published in May 2013, “Just Look Me in the Eye Already,” examined the way that technology use has affected our eye contact—and the negative effect that is having on our relationships.
According to Quantified Impressions, a Texas-based communications analytics company, adults now make eye contact only 30 to 60 percent of the time in a typical conversation, yet emotional connection is built when eye contact is made during 60 to 70 percent of the conversation. In other words, the less eye contact, the less connection is made.
Our screens and screen culture have normalized the experience of having conversations with little or no eye contact. We’ve seen it in adults, and we all have certainly seen it in kids. Unfortunately, we are losing something vital and inherently human.
“Eye contact, although it occurs over a gap of yards, is not a metaphor,” psychiatrists Thomas Lewis, Fari Amini and Richard Lannon write in the book A General Theory of Love. “When we meet the gaze of another, two nervous systems achieve a palpable and intimate apposition.”
Adults lament that kids no longer make eye contact, but parents are often guilty of providing the model for that behavior, in what has been called “Distracted Parent Syndrome.”
As Carolyn Gregoire writes in “How Technology is Killing Eye Contact,” the September 28, 2013, entry of her column in The Huffington Post: “Many parents are concerned about what their own digital multitasking and lack of eye contact might be communicating to their children.”
Blogger Rachel Marie Martin wrote in a recent post, “20 Things I Will Regret Not Doing with My Kids”: “I want my kids to remember that there were times when their mother looked them in the eye and smiled. And for me this often means shutting my laptop, putting down my phone, stopping my list, and just giving them time.”
Screens in the Classroom? Think First—Screens Later
As Dr. Toyama discovered with his theory of technology’s Law of Amplification, technology can help education where it’s already doing well, but it does little for mediocre educational systems and, worse, in dysfunctional schools it “can cause outright harm.” In my research for this book and in conversations with various education experts, that seemed to be the consensus—but with an additional proviso: that technology can only help when a child or student is developmentally ready to handle powerful and hypnotic screens.
Technology can certainly be helpful in a well-supported and thoughtful high school curriculum. Perhaps even in middle school, some limited exposure to computer learning can be helpful. But the notion of sticking a radiant screen in the hands of a kindergartener or a child in elementary school is not only not helpful educationally, but, as we have read, could be neurologically and clinically harmful—especially for already vulnerable children.
Even Joel Klein, Rupert Murdoch’s dark prince, of all people, supports that view. Speaking to Carlo Rotella for the September 12, 2013, issue of the New York Times Magazine, he responded to tech-in-the-classroom criticisms from Sherry Turkle, MIT professor and author of Alone Together, by saying that he wouldn’t put fourth-graders in a MOOC—massive open online course—and that he would exercise “great restraint” in introducing technology into a kindergarten classroom.17 Thank goodness for small favors.
Maybe we should follow the lead of the tech engineers and the Waldorf schools and wait until our kids are beyond third or fourth grade—some suggest they should be at least ten years old—before interactive tablets are introduced.
In a 1999 interview, Joseph Chilton Pearce discussed a four-day symposium in Berkeley that he attended, where 21 education experts from around the world discussed computers in education: “At that . . . symposium at Berkeley we concluded that everything hinges on age appropriateness. One professor from MIT made the passionate plea that we must encourage children to develop the ability to think first, and then give them the computer. After that, the sky’s the limit. But if you introduce the computer before the child’s thought processes are worked out, then you have a disaster in the making. This is because, as Piaget pointed out, the first twelve years of life are spent putting into place the structures of knowledge that enable young people to grasp abstract, metaphoric, symbolic types of information. . . . The danger here is that the computer . . . will interrupt that development.”18
In “Computer Integration into the Early Childhood Curriculum,” published in Education (Fall 2012), authors Mona and Heyam Mohammad also frame the issue in developmental terms: “Piaget’s theory, known also as the constructivist perspective, says that learners benefit most from ‘concrete’ experiences or hands-on activities that allow the learner to manipulate his/her environment in order to construct knowledge based on interaction with the world.”19
Translation: Lego, not Minecraft.
As previously mentioned, Pearce attributes a large part of the adverse effect of screens on young children to the radiant light and the fact that children can go “catatonic” in front of a screen: “This has to do with the way that the brain reacts to radiant light, which is the light source of TV and computer monitors, and reflected light, which is what brings us the rest of our visual experience . . . the brain tends to close down in response to radiant light sources. We’ve all seen how children get when they watch television for any length of time.”
In that interview, Pearce goes on to describe how the television industry started introducing “startle effects” into kids’ programming in order to snap kids out of their trance so that they could pay attention again. But over time, as with any hypnotic drug, desensitization occurs, and the attention-grabbing startle effects have to become bigger and bigger.
Still, as Pearce explains, while the child’s neocortex may realize that the increasingly shocking images aren’t real, the “reptilian” brain does not, and the child goes into perpetual cortisol-releasing fight-or-flight response. This massive overstimulation is “causing the brain to maladapt in ways previously thought impossible. It is literally breaking down on all levels of neural development.” Pearce was presciently forecasting some of the neurological and clinical issues that we discussed regarding Dr. Dunckley’s work with electronic screen syndrome.
Wouldn’t it just be a better idea to leave the screens out of the classroom—at least through elementary school? As Pearce says: “We must encourage children to develop the ability to think first, and then give them the computer. After that, the sky’s the limit.”
Unfortunately, that might be easier said than done today. In our new digital landscape, it will become increasingly difficult to insulate a child during his key developmental years from the advances in our Brave New e-World.
These are indeed strange times we live in. The divide between the real and the digital is increasingly blurring as our society is becoming more and more virtual. The thought-provoking movie The Matrix hints at the mind-bending that may await us just over the horizon.
In the meantime, let’s take a look at the farther reaches of the digital landscape today.