NOTES

PROLOGUE

  1. 1. Dahlia Bazzaz, “Dispatches from Parents: Northshore School District’s First Online-Only Day to Prevent Coronavirus Spread,” Seattle Times, March 9, 2020, https://www.seattletimes.com/seattle-news/education/how-northshore-parents-handled-the-first-day-of-online-learning/.

INTRODUCTION

  1. 1. Vanessa Lu and Kristin Rushowy, “Rainbow Loom Bracelet Maker Hot Toy Trend,” Toronto Star, October 4, 2013.

  2. 2. Ashley Rivera, “How to Make a Rainbow Loom Starburst Bracelet,” YouTube video, August 1, 2013, https://www.youtube.com/watch?v=RI7AkI5dJzo.

  3. 3. Cheong Choon Ng, “Experience: I Invented the Loom Band,” Guardian, September 26, 2014, https://www.theguardian.com/lifeandstyle/2014/sep/26/i-invented-the-loom-band-experience.

  4. 4. Henry Jenkins, Confronting the Challenges of Participatory Culture: Media Education for the Twenty-First Century (Cambridge, MA: MIT Press, 2009).

  5. 5. Mizuko Ito, Crystle Martin, Rachel Cody Pfister, Matthew H. Rafalow, Katie Salen, and Amanda Wortman, Affinity Online: How Connection and Shared Interest Fuel Learning (New York: New York University Press, 2018).

  6. 6. Clayton M. Christensen, Michael B. Horn, and Curtis W. Johnson, Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns (New York: McGraw-Hill, 2008), 101.

  7. 7. Salman Khan, “Let’s Use Video to Reinvent Education,” TED talk, March 1, 2011, https://www.youtube.com/watch?v=nTFEUsudhfs.

  8. 8. Michael Noer, “One Man, One Computer, 10 Million Students: How Khan Academy Is Reinventing Education,” Forbes, November 19, 2012, https://www.forbes.com/sites/michaelnoer/2012/11/02/one-man-one-computer-10-million-students-how-khan-academy-is-reinventing-education/#7c96110644e0; Clive Thompson, “How Khan Academy Is Changing the Rules of Education,” Wired, July 15, 2011, https://www.wired.com/2011/07/ff_khan/; Kayla Webley, “Reboot the School,” Time, July 2012.

  9. 9. Salman Khan, The One World Schoolhouse: Education Reimagined (New York: Grand Central, 2012).

  10. 10. Laura Pappano, “The Year of the MOOC,” New York Times, November 2, 2012.

  11. 11. David Carr, “Udacity CEO Says MOOC ‘Magic Formula’ Emerging,” InformationWeek, August 19, 2013, https://www.informationweek.com/software/udacity-ceo-says-mooc-magic-formula-emerging/d/d-id/1111221. Three good brief histories of MOOCs are Audrey Watters, “MOOCs: An Introduction,” Hack Education (blog), August 26, 2014, http://hackeducation.com/2014/08/26/introduction-to-moocs; Fiona Hollands and Devayani Tirthali, MOOCs: Expectations and Reality, Full Report (New York: Center for Benefit-Cost Studies of Education, 2014), https://files.eric.ed.gov/fulltext/ED547237.pdf; and Barbara Means, Marianne Bakia, and Robert Murphy, Learning Online: What Research Tells Us about Whether, When, and How (New York: Routledge, 2014). On Sebastian Thrun’s prediction, see Steven Leckart, “The Stanford Education Experiment Could Change Higher Learning Forever,” Wired, March 30, 2012, https://www.wired.com/2012/03/ff_aiclass/. For a critical response, see Audrey Watters, “A Future with Only Ten Universities,” Hack Education (blog), October 15, 2013, http://hackeducation.com/2013/10/15/minding-the-future-openva.

  12. 12. On Thrun’s reversal, see Max Chafkin, “Udacity’s Sebastian Thrun, Godfather of Free Online Education, Changes Course,” Fast Company, November 14, 2013, https://www.fastcompany.com/3021473/udacity-sebastian-thrun-uphill-climb. For contemporaneous criticism of the comments, see Rebecca Schuman, “The King of MOOCS Abdicates the Throne,” Slate, November 19, 2013, https://slate.com/human-interest/2013/11/sebastian-thrun-and-udacity-distance-learning-is-unsuccessful-for-most-students.html.

  13. 13. Emily Ann Brown, “Sal Khan Envisions a Future of Active, Mastery-Based Learning,” District Administration, January 31, 2019, https://districtadministration.com/sal-khan-envisions-a-future-of-active-mastery-based-learning/. There is no robust published evidence to validate Khan’s claims about the magnitude of learning gains.

  14. 14. Kenneth R. Koedinger, John R. Anderson, William H. Hadley, and Mary A. Mark, “Intelligent Tutoring Goes to School in the Big City,” International Journal of Artificial Intelligence in Education 8 (1997): 30–43.

  15. 15. Philip Wesley Jackson, Life in Classrooms (New York: Teachers College Press, 1990), 166–167; Larry Cuban, The Flight of a Butterfly or the Path of a Bullet? Using Technology to Transform Teaching and Learning (Cambridge, MA: Harvard Education Press, 2018).

  16. 16. Justin Reich and Mizuko Ito, “Three Myths about Education Technology and the Points of Light Beyond,” Connected Learning Alliance Blog, October 30, 2017, https://clalliance.org/blog/three-myths-education-technology-points-light-beyond/.

  17. 17. Morgan G. Ames, The Charisma Machine: The Life, Death, and Legacy of One Laptop per Child (Cambridge, MA: MIT Press, 2019).

  18. 18. David Tyack and Larry Cuban, Tinkering toward Utopia: A Century of Public School Reform (Cambridge, MA: Harvard University Press, 1995).

  19. 19. For an example of my teaching, see Justin Reich, “Conflict and Identity: Using Contemporary Questions to Inspire the Study of the Past,” World History Connected, last modified February 2007, https://worldhistoryconnected.press.uillinois.edu/4.2/reich.html.

  20. 20. UNESCO Global Education Monitoring Report, “Six Ways to Ensure Higher Education Leaves No One Behind,” Policy Paper 30, April 2017, https://unesdoc.unesco.org/ark:/48223/pf0000247862.

  21. 21. Victoria Lee and Constance Lindsey, “Unequal Access to Calculus Could Hinder Low-Income and Black Students,” Urban Wire (blog), March 6, 2018, https://www.urban.org/urban-wire/unequal-access-calculus-could-hinder-low-income-and-black-students.

1. INSTRUCTOR-GUIDED LEARNING AT SCALE

  1. 1. Audrey Watters, “MOOCs: An Introduction,” Modernlearners.com, n.d., https://modernlearners.com/moocs-an-introduction; Fiona M. Hollands and Devayani Tirthali, MOOCs: Expectations and Reality (New York: Center for Benefit-Cost Studies, Teachers College, Columbia University, 2014); Barbara Means, Marianne Bakia, and Robert Murphy, Learning Online: What Research Tells Us about Whether, When, and How (New York: Routledge, 2014); John Naughton, “Welcome to the Desktop Degree,” Guardian, February 4, 2012, https://www.theguardian.com/technology/2012/feb/05/desktop-degree-stanford-university-naughton.

  2. 2. “Press Conference: MIT, Harvard Announce edX,” YouTube video, May 3, 2012, https://www.youtube.com/watch?v=7pYwGpKMXuA; Laura Pappano, “The Year of the MOOC,” New York Times, November 2, 2012.

  3. 3. 1999–2000 online learning numbers are from Anna Sikora and C. Dennis Carroll, A Profile of Participation in Distance Education: 1999–2000 (Washington, DC: National Center for Education Statistics, 2002), https://nces.ed.gov/pubs2003/2003154.pdf. For background on early HarvardX and MITx MOOCs, see Andrew Ho, Justin Reich, Sergiy Nesterko, Daniel Seaton, Tommy Mullaney, James Waldo, and Isaac Chuang, “HarvardX and MITx: The First Year of Open Online Courses, Fall 2012–Summer 2013 (HarvardX and MITx Working Paper No. 1),” SSRN (2014), https://ssrn.com/abstract=2381263.

  4. 4. “Clayton Christensen Interview with Mark Suster at Startup Grind 2013,” YouTube video, 06:40, “Startup Grind,” February 20, 2013, https://www.youtube.com/watch?v=KYVdf5xyD8I. For more on the idea of “super-professors,” see Justin Reich, “Personalized Learning, Backpacks Full of Cash, Rockstar Teachers, and MOOC Madness: The Intersection of Technology, Free-Market Ideology, and Media Hype in U.S. Education Reform,” lecture, Berkman Klein Center for Internet and Society at Harvard University, May 7, 2013, https://cyber.harvard.edu/events/luncheon/2013/05/reich. For a related perspective on how online learning would remake higher education, see Kevin Carey, The End of College: Creating the Future of Learning and the University of Everywhere (New York: Riverhead Books, 2016); and Steven Leckart, “The Stanford Education Experiment Could Change Higher Learning Forever,” Wired, March 28, 2012, https://www.wired.com/2012/03/ff_aiclass/.

  5. 5. Daphne Koller, “What We’re Learning from Online Education,” filmed June 2012 at TEDGlobal 2012, Edinburgh, Scotland, video, https://www.ted.com/talks/daphne_koller_what_we_re_learning_from_online_education?language=en.

  6. 6. Koller, “What We’re Learning.”

  7. 7. The Quote Investigator website has an excellent investigation into this framing, often misattributed to Yeats; see “The Mind Is Not a Vessel That Needs Filling, but Wood That Needs Igniting,” Quote Investigator, last modified March 28, 2013, https://quoteinvestigator.com/2013/03/28/mind-fire/. The Loeb Classical Library translation has the full quote as, “For the mind does not require filling like a bottle, but rather, like wood, it only requires kindling to create in it an impulse to think independently and an ardent desire for the truth”; see Bill Thayer, “Plutarch, Moralia: On Listening to Lectures,” last modified April 1, 2018, http://penelope.uchicago.edu/Thayer/E/Roman/Texts/Plutarch/Moralia/De_auditu*.html. The first known usage of the translation that I use comes from James Johnson Sweeney, Vision and Image: A Way of Seeing (New York: Simon and Schuster, 1968), 119. The Lagemann quotation, definitively, is from Ellen Lagemann, “The Plural Worlds of Educational Research,” History of Education Quarterly 29, no. 2 (1989): 185–214, https://doi.org/10.2307/368309. Lagemann’s goal in the essay is to highlight how these different pedagogical traditions led to different traditions among different scholarly communities in the late twentieth century; part of the task of this book is to show that these fractures continue in the digital age, and I share Lagemann’s enthusiasm for bridging these divides.

  8. 8. John Dewey, “My Pedagogic Creed,” School Journal 54, no. 3 (1897): 77–80, http://dewey.pragmatism.org/creed.htm.

  9. 9. One early entry point into Thorndike’s work is Edward Thorndike, The Psychology of Learning (New York: Teachers College, 1913).

  10. 10. For research on MOOC instructional quality, see Anoush Margaryan, Manuela Bianco, and Allison Littlejohn, “Instructional Quality of Massive Open Online Courses (MOOCs),” Computers & Education 80 (2015): 77–83. The literature of MOOC criticism is extensive and thoughtful. An early response was from the faculty of the Philosophy Department of San Jose State University, when the university proposed requiring students to use Harvard’s JusticeX for an introductory course; “An Open Letter to Professor Michael Sandel from the Philosophy Department at San Jose State U,” Chronicle of Higher Education, May 2, 2013, https://www.chronicle.com/article/The-Document-an-Open-Letter/138937. Elizabeth Losh has a monograph and an edited volume that are useful starting points for critique; Elizabeth Losh, The War on Learning: Gaining Ground in the Digital University (Cambridge, MA: MIT Press, 2014); and Elizabeth Losh, MOOCs and Their Afterlives: Experiments in Scale and Access in Higher Education (Chicago: University of Chicago Press, 2017). Jonathan Rees is another thoughtful critic through his More or Less Bunk blog; a starting point is a piece in Slate: Jonathan Rees, “The MOOC Racket,” Slate, July 25, 2013, https://slate.com/technology/2013/07/moocs-could-be-disastrous-for-students-and-professors.html.

  11. 11. James Becker, Toward Automated Learning. A Professional Paper (Pittsburgh: Research for Better Schools, 1968); William Cooley and Robert Glaser, “The Computer and Individualized Instruction,” Science 166, no. 3905 (1969): 574–582; James Becker and Robert Scanlon, Applying Computers and Educational Technology to Individually Prescribed Instruction (Pittsburgh: Research for Better Schools, 1970).

  12. 12. On feature convergence, see Carl Straumsheim, “Where Does the LMS Go from Here?” Inside Higher Ed, September 23, 2014, https://www.insidehighered.com/news/2014/09/23/educause-gates-foundation-examine-history-and-future-lms.

  13. 13. For the relationship between MOOC platform and pedagogy, see Shreeharsh Kelkar, “Engineering a Platform: The Construction of Interfaces, Users, Organizational Roles, and the Division of Labor,” New Media & Society 20, no. 7 (2018): 2629–2646.

  14. 14. For an early description of the storefront metaphor, see Michael Feldstein, “Is Coursera Facebook, Amazon, or Pets.com?,” e-Literate, November 14, 2012, https://mfeldstein.com/is-coursera-facebook-amazon-or-petscom-2/. Coursera and edX weren’t the first to directly market online courses—Udemy had been marketing direct-to-consumer online courses since 2010, as one example—but they were the first to my knowledge to do so with elite university partners.

  15. 15. For a recent review, see Draylson M. Souza, Katia R. Felizardo, and Ellen F. Barbosa, “A Systematic Literature Review of Assessment Tools for Programming Assignments,” presentation, International Conference on Software Engineering Education and Training, Dallas, TX, April 2016, IEEE, https://ieeexplore.ieee.org/document/7474479. For an example, see the check50 program developed by Harvard’s CS50 team: https://cs50.readthedocs.io/check50/.

  16. 16. An early iteration of websim, the circuits simulator used in the first MITx course, is online at http://euryale.csail.mit.edu/.

  17. 17. Test developers sometimes argue that multiple-choice questions can test critical thinking or reasoning. Stanford researcher Mark Smith contests this claim, demonstrating that high-performing students on history assessments do not use complex reasoning to take multiple-choice tests, but rather use “three construct-irrelevant processes: factual recall / recognition, reading comprehension, and test-taking strategies.” Mark D. Smith, “Cognitive Validity: Can Multiple-Choice Items Tap Historical Thinking Processes?” American Educational Research Journal 54, no. 6 (2017): 1256–1287.

  18. 18. On evaluating writing with peer grading and automated essay scoring, see Stephen P. Balfour, “Assessing Writing in MOOCs: Automated Essay Scoring and Calibrated Peer Review™,” Research & Practice in Assessment 8 (2013): 40–48. For peer and self-assessment in MOOCs, see Chinmay Kulkarni, Koh Pang Wei, Huy Le, Daniel Chia, Kathryn Papadopoulos, Justin Cheng, Daphne Koller, and Scott R. Klemmer, “Peer and Self Assessment in Massive Online Classes,” ACM Transactions on Computer-Human Interaction (TOCHI) 20, no. 6 (2013): 1–31.

  19. 19. The literature and commentary on MOOCs’ low completion rates are extensive. One interesting voice making this point was Larry Bacow (along with Michael McPherson), who went on to become Harvard University president; Michael S. McPherson and Lawrence S. Bacow, “Online Higher Education: Beyond the Hype Cycle,” Journal of Economic Perspectives 29, no. 4 (2015): 135–154. The most comprehensive early study of MOOCs and completion rates is Katy Jordan, “Massive Open Online Course Completion Rates Revisited: Assessment, Length and Attrition,” International Review of Research in Open and Distributed Learning 16, no. 3 (2015): 341–358. Early reports on completion rates include Gayle Christensen, Andrew Steinmetz, Brandon Alcorn, Amy Bennett, Deirdre Woods, and Ezekiel Emanuel, “The MOOC Phenomenon: Who Takes Massive Open Online Courses and Why?” SSRN (2013), https://ssrn.com/abstract=2350964; René F. Kizilcec, Chris Piech, and Emily Schneider, “Deconstructing Disengagement: Analyzing Learner Subpopulations in Massive Open Online Courses,” presentation, International Conference on Learning Analytics and Knowledge, Leuven, Belgium, April 2013; Ho et al., “The First Year of Open Online Courses.” On completion rates of students who intend to complete, see Justin Reich, “MOOC Completion and Retention in the Context of Student Intent,” EDUCAUSE Review Online, December 8, 2014, https://er.educause.edu/articles/2014/12/mooc-completion-and-retention-in-the-context-of-student-intent. On completion rates of learners pursuing verified certificates, see Justin Reich and José A. Ruipérez-Valiente, “The MOOC Pivot,” Science 363, no. 6423 (2019): 130–131.

  20. 20. Zach Lam, Kathy Mirzae, Andreas Paepcke, Krystal Smith, and Mitchell Stevens, “Doing Things with MOOCs: Utilization Strategies of Stanford’s California MOOC Learners,” MIT Office of Digital Learning x-Talks, October 16, 2018, https://openlearning.mit.edu/events/doing-things-moocs-utilization-strategies-learners-massively-open-online-courses.

  21. 21. “Open edX Conference 2018 with Zvi Galil Keynote: Georgia Tech’s Online MOOC-based Master Program,” YouTube video, “Open edX,” June 13, 2018, https://www.youtube.com/watch?v=-ETTblOvH6w; Joshua Goodman, Julia Melkers, and Amanda Pallais, “Can Online Delivery Increase Access to Education?” Journal of Labor Economics 37, no. 1 (2019): 1–34.

  22. 22. Joshua Littenberg-Tobias and Justin Reich, “Evaluating Access, Quality, and Equity in Online Learning: A Case Study of a MOOC-Based Blended Professional Degree Program,” preprint, retrieved from doi:10.31235/osf.io/8nbsz.

  23. 23. Littenberg-Tobias and Reich, “Evaluating Access.”

  24. 24. NanoDegrees have much in common with earlier forms of non-degree technical certificates, such as the Microsoft Certified Technician programs. On the 1990s history of Microsoft and related information technology non-degree certifications, see Clifford Adelman, “A Parallel Universe: Certification in the Information Technology Guild,” Change: The Magazine of Higher Learning 32, no. 3 (2000): 20–29.

  25. 25. For more on for-profit higher education and the role of credentials in higher education, see Tressie McMillan Cottom, Lower Ed: The Troubling Rise of For-Profit Colleges in the New Economy (New York: New Press, 2017).

  26. 26. Phil Hill, “Coursera CEO Interview: Betting on OPM Market and Shift to Low-Cost Masters Degrees,” e-Literate, December 6, 2018, https://mfeldstein.com/coursera-ceo-interview-betting-on-opm-market-and-shift-to-low-cost-masters-degrees/. For a critique of OPMs, see Kevin Carey, “The Creeping Capitalist Takeover of Higher Education,” Highline: Huffington Post, April 1, 2019, https://www.huffpost.com/highline/article/capitalist-takeover-college/.

  27. 27. Laura Pappano, “The Boy Genius of Ulan Bator,” New York Times, September 13, 2013, https://www.nytimes.com/2013/09/15/magazine/the-boy-genius-of-ulan-bator.html.

  28. 28. Reich and Ruipérez-Valiente, “MOOC Pivot”; John D. Hansen and Justin Reich, “Democratizing Education? Examining Access and Usage Patterns in Massive Open Online Courses,” Science 350, no. 6265 (2015): 1245–1248; René F. Kizilcec, Andrew J. Saltarelli, Justin Reich, and Geoffrey L. Cohen, “Closing Global Achievement Gaps in MOOCs,” Science 355, no. 6322 (2017): 251–252; Ezekiel J. Emanuel, “Online Education: MOOCs Taken by Educated Few,” Nature 503, no. 7476 (2013): 342. For more recent data, see Isaac Chuang and Andrew Ho, “HarvardX and MITx: Four Years of Open Online Courses—Fall 2012–Summer 2016,” December 23, 2016, https://ssrn.com/abstract=2889436.

  29. 29. The initial report on the SJSU Plus experiment was Elaine D. Collins, “SJSU Plus Augmented Online Learning Environment Pilot Project Report,” Research and Planning Group for California Community Colleges 38 (2013): 45. A subsequent analysis put results in a somewhat more favorable light, especially for courses run twice with refinements between iterations. Erin L. Woodhead, Preston Tim Brown, Susan Snycerski, Sean Laraway, Nicholas G. Bathurst, Greg Feist, and Ronald F. Rogers, “An Examination of the Outcomes of a Brief and Innovative Partnership: SJSU and Udacity,” Innovative Higher Education 42, no. 5–6 (2017): 463–476, DOI: 10.1007/s10755-017-9400-4; Lindsay McKenzie, “Arizona State Moves on from Global Freshman Academy,” Inside Higher Ed, September 17, 2019, https://www.insidehighered.com/digital-learning/article/2019/09/17/arizona-state-changes-course-global-freshman-academy.

  30. 30. For self-regulated learning and MOOCs, see Allison Littlejohn, Nina Hood, Colin Milligan, and Paige Mustain, “Learning in MOOCs: Motivations and Self-Regulated Learning in MOOCs,” Internet and Higher Education 29 (2016): 40–48; René F. Kizilcec, Mar Pérez-Sanagustín, and Jorge J. Maldonado, “Self-Regulated Learning Strategies Predict Learner Behavior and Goal Attainment in Massive Open Online Courses,” Computers and Education 104 (2017): 18–33; and M. Elena Alonso-Mencía, Carlos Alario-Hoyos, Jorge Maldonado-Mahauad, Iria Estévez-Ayres, Mar Pérez-Sanagustín, and Carlos Delgado Kloos, “Self-Regulated Learning in MOOCs: Lessons Learned from a Literature Review,” Educational Review (2019): 1–27. For how people develop self-regulated learning skills, see Scott G. Paris and Alison H. Paris, “Classroom Applications of Research on Self-Regulated Learning,” Educational Psychologist 36, no. 2 (2001): 89–101; and Barry J. Zimmerman, “Becoming a Self-Regulated Learner: An Overview,” Theory into Practice 41, no. 2 (2002): 64–70.

  31. 31. Reich and Ruipérez-Valiente, “MOOC Pivot.”

  32. 32. Justin Reich, “Rebooting MOOC Research,” Science 347, no. 6217 (2015): 34–35.

  33. 33. Justin Reich, “Big Data MOOC Research Breakthrough: Learning Activities Lead to Achievement,” EdTech Researcher (blog), March 30, 2014, http://www.edtechresearcher.com/2014/03/big_data_mooc_research_breakthrough_learning_activities_lead_to_achievement/. For a variant, see Kenneth R. Koedinger, Jihee Kim, Julianna Zhuxin Jia, Elizabeth A. McLaughlin, and Norman L. Bier, “Learning Is Not a Spectator Sport: Doing Is Better Than Watching for Learning from a MOOC,” presentation, ACM Conference on Learning at Scale, Vancouver, BC, Canada, March 14–15, 2015; Jennifer DeBoer, Andrew D. Ho, Glenda S. Stump, and Lori Breslow, “Changing ‘Course’: Reconceptualizing Educational Variables for Massive Open Online Courses,” Educational Researcher 43, no. 2 (2014): 74–84.

  34. 34. For a case study of OLI development, see Candace Thille, Emily Schneider, René F. Kizilcec, Christopher Piech, Sherif A. Halawa, and Daniel K. Greene, “The Future of Data-Enriched Assessment,” Research & Practice in Assessment 9 (2014): 5–16.

  35. 35. For the OLI statistics study, see William G. Bowen, Matthew M. Chingos, Kelly A. Lack, and Thomas I. Nygren, “Interactive Learning Online at Public Universities: Evidence from a Six-Campus Randomized Trial,” Journal of Policy Analysis and Management 33, no. 1 (2014): 94–111. David Pritchard’s introductory physics MOOC at MIT is probably the best studied xMOOC; Kimberly F. Colvin, John Champaign, Alwina R. Liu, Qian Zhou, Colin Fredericks, and David E. Pritchard, “Learning in an Introductory Physics MOOC: All Cohorts Learn Equally, Including an On-Campus Class,” International Review of Research in Open and Distributed Learning 15, no. 4 (2014). On the high costs of effective online learning, also see McPherson and Bacow, “Beyond the Hype Cycle.”

  36. 36. Justin Reich and Elizabeth Huttner-Loan, Teaching Systems Lab MOOCs in Review: 2017–2019 (Cambridge, MA: Teaching Systems Lab, 2019), doi:10.35542/osf.io/c3bhw.

  37. 37. The total number of MOOC learners has increased at a lower rate than the total number of MOOC courses, so that each course now has many fewer people in it. See Chuang and Ho, “HarvardX and MITx Year 4”; and Reich and Ruipérez-Valiente, “MOOC Pivot.”

  38. 38. On credit recovery, see Carolyn J. Heinrich, Jennifer Darling-Aduana, Annalee Good, and Huiping Cheng, “A Look Inside Online Educational Settings in High School: Promise and Pitfalls for Improving Educational Opportunities and Outcomes,” American Educational Research Journal 56, no. 6 (2019): 2147–2188. For state-level studies of virtual schools, see June Ahn and Andrew McEachin, “Student Enrollment Patterns and Achievement in Ohio’s Online Charter Schools,” Educational Researcher 46, no. 1 (2017): 44–57; Brian R. Fitzpatrick, Mark Berends, Joseph J. Ferrare, and R. Joseph Waddington, “Virtual Illusion: Comparing Student Achievement and Teacher and Classroom Characteristics in Online and Brick-and-Mortar Charter Schools,” Educational Researcher 49, no. 3 (2020): 161–175, https://doi.org/10.3102/0013189X20909814. One exception to the bleak research on K–12 virtual schools comes from the Florida Virtual Schools: Guido Schwerdt and Matthew M. Chingos, “Virtual Schooling and Student Learning: Evidence from the Florida Virtual School,” Beiträge zur Jahrestagung des Vereins für Socialpolitik 2015, Ökonomische Entwicklung—Theorie und Politik—Session: ICT in Educational Production, No. B24-V2, ZBW—Deutsche Zentralbibliothek für Wirtschaftswissenschaften, Leibniz Informationszentrum Wirtschaft, https://www.econstor.eu/bitstream/10419/113202/1/VfS_2015_pid_39.pdf. Notably, Florida Virtual Schools is one of the only K–12 providers to be based within the state educational system rather than provisioned by a for-profit provider.

  39. 39. Chris Dede and John Richards proposed a name for LMSs that are prepopulated with content but intended for use by teachers in small classes: digital teaching platforms; Christopher Dede and John Richards, eds., Digital Teaching Platforms: Customizing Classroom Learning for Each Student (New York: Teachers College Press, 2012). On Summit Learning, see Joanne Jacobs, “Pacesetter in Personalized Learning: Summit Charter Network Shares Its Model Nationwide,” Education Next 17, no. 4 (2017): 16–25; and Matt Barnum, “Summit Learning, the Zuckerberg-Backed Platform, Says 10% of Schools Quit Using It Each Year. The Real Figure is Higher,” Chalkbeat, May 23, 2019, https://www.chalkbeat.org/posts/us/2019/05/23/summit-learning-the-zuckerberg-backed-platform-says-10-of-schools-quit-using-it-each-year-the-real-figure-is-higher/.

  40. 40. John Daniel, Asha Kanwar, and Stamenka Uvalić-Trumbić, “Breaking Higher Education’s Iron Triangle: Access, Cost, and Quality,” Change: The Magazine of Higher Learning 41, no. 2 (2009): 30–35. The metaphor comes from the project management field; Dennis Lock, Project Management Handbook (Aldershot, Hants, England: Gower Technical Press, 1987).

  41. 41. Patrick McAndrew and Eileen Scanlon, “Open Learning at a Distance: Lessons for Struggling MOOCs,” Science 342, no. 6165 (2013): 1450–1451.

2. ALGORITHM-GUIDED LEARNING AT SCALE

  1. 1. In the late 2000s and early 2010s, some other common reform efforts in schools involved alignment with the new Common Core curriculum, using data from standardized tests to improve instruction, adopting differentiated instruction approaches like “response to intervention,” and addressing issues of social and emotional learning.

  2. 2. Benjamin S. Bloom, “The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring,” Educational Researcher 13, no. 6 (1984): 4–16. Bloom’s thought-piece was inspired by two doctoral dissertations. Subsequent investigations of human tutoring have demonstrated effects smaller than two standard deviations; see Kurt VanLehn, “The Relative Effectiveness of Human Tutoring, Intelligent Tutoring Systems, and Other Tutoring Systems,” Educational Psychologist 46, no. 4 (2011): 197–221.

  3. 3. R. C. Atkinson and H. A. Wilson, “Computer-Assisted Instruction,” Science 162, no. 3849 (1968): 73–77.

  4. 4. Brian Dear, The Friendly Orange Glow: The Untold Story of the PLATO System and the Dawn of Cyberculture (New York: Pantheon, 2017). R. A. Avner and Paul Tenczar, “The TUTOR Manual” (Washington, DC: Education Resources Information Center, 1970), https://eric.ed.gov/?id=ED050583.

  5. 5. Michael Horn and Heather Staker, Blended: Using Disruptive Innovation to Improve Schools (New York: Wiley, 2014).

  6. 6. Eric Westervelt, “Meet the Mind Reading Robo Tutor in the Sky,” NPRed, October 13, 2015, https://www.npr.org/sections/ed/2015/10/13/437265231/meet-the-mind-reading-robo-tutor-in-the-sky.

  7. 7. Anya Kamenetz, “5 Doubts about Data Driven Schools,” NPRed, June 3, 2016, https://www.npr.org/sections/ed/2016/06/03/480029234/5-doubts-about-data-driven-schools. The largest dataset in DataShop, a public repository of data from computer-assisted learning systems, has 2.4 million transactions, which were collected from 628 students over a period of six months, several orders of magnitude less than millions of data points per student per day. https://pslcdatashop.web.cmu.edu/DatasetInfo?datasetId=428.

  8. 8. “The Mismeasure of Students: Using Item Response Theory instead of Traditional Grading to Assess Student Proficiency,” Knewton Blog, June 7, 2012, https://medium.com/knerd/the-mismeasure-of-students-using-item-response-theory-instead-of-traditional-grading-to-assess-b55188707ee5.

  9. 9. For a history and introduction to item response theory, see W. J. van der Linden, “Item Response Theory,” in International Encyclopedia of Education, 3rd ed., eds. Penelope Peterson, Eva Baker, and Barry McGaw (Oxford, UK: Elsevier, 2010), 81–89.

  10. 10. Among the most important variants to IRT are a set of algorithms developed at Carnegie Mellon University called “knowledge tracing.” For an introduction, see John R. Anderson, Albert T. Corbett, Kenneth R. Koedinger, and Ray Pelletier, “Cognitive Tutors: Lessons Learned,” The Journal of the Learning Sciences 4, no. 2 (1995): 167–207; Kenneth R. Koedinger and Albert T. Corbett, “Cognitive Tutors: Technology Bringing Learning Sciences to the Classroom,” in The Cambridge Handbook of the Learning Sciences, ed. R. K. Sawyer (New York: Cambridge University Press, 2006), 61–77.

  11. 11. Tony Wan, “Jose Ferreira Steps Down as Knewton CEO, Eyes Next Education Startup,” EdSurge, December 21, 2016, https://www.edsurge.com/news/2016-12-21-jose-ferreira-steps-down-as-knewton-ceo-eyes-next-education-startup; Jeffrey Young, “Hitting Reset, Knewton Tries New Strategy: Competing with Textbook Publishers,” EdSurge, November 30, 2017, https://www.edsurge.com/news/2017-11-30-hitting-reset-knewton-tries-new-strategy-competing-with-textbook-publishers; Lindsay McKenzie, “End of the Line for Much-Hyped Tech Company,” Inside Higher Ed, May 7, 2019, https://www.insidehighered.com/digital-learning/article/2019/05/07/wiley-buys-knewton-adaptive-learning-technology-company.

  12. 12. Maciej Cegłowski, “The Internet with a Human Face,” presentation at Beyond Tellerrand, Düsseldorf, Germany, May 20, 2014, https://idlewords.com/talks/internet_with_a_human_face.htm.

  13. 13. For further discussion, see Justin Reich, “Personalized Learning, Backpacks Full of Cash, Rockstar Teachers, and MOOC Madness: The Intersection of Technology, Free-Market Ideology, and Media Hype in U.S. Education Reform,” May 7, 2013, presentation at Berkman Klein Center, Harvard University, https://cyber.harvard.edu/events/luncheon/2013/05/reich. For an example of advocating for market-based reforms through technological changes, see Chester E. Finn, Jr. and Daniela R. Fairchild, eds., Education Reform for the Digital Era, https://files.eric.ed.gov/fulltext/ED532508.pdf.

  14. 14. Emily Ann Brown, “Sal Khan Envisions a Future of Active, Mastery-Based Learning,” District Administration, January 31, 2019, https://districtadministration.com/sal-khan-envisions-a-future-of-active-mastery-based-learning/; Salman Khan, The One World Schoolhouse: Education Reimagined (New York: Grand Central, 2012), 12.

  15. 15. Audrey Watters, Teaching Machines (Cambridge, MA: MIT Press, forthcoming), citing Simon Ramo, “A New Technique of Education,” Engineering and Science 21 (October 1957): 372.

  16. 16. Clayton M. Christensen, Michael B. Horn, and Curtis W. Johnson, Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns (New York: McGraw-Hill, 2008).

  17. 17. Jill Lepore, “The Disruption Machine,” New Yorker, June 23, 2014, 30–36. Audrey Watters, “The Myth and Millennialism of ‘Disruptive Innovation,’ ” Hack Education, May 24, 2013, http://hackeducation.com/2013/05/24/disruptive-innovation.

  18. 18. Alex Molnar, Gary Miron, Najat Elgeberi, Michael K. Barbour, Luis Huerta, Sheryl Rankin Shafer, and Jennifer King Rice, “Virtual Schools in the U.S., 2019,” National Education Policy Center, May 28, 2019, https://nepc.colorado.edu/publication/virtual-schools-annual-2019; Jeff Wulfson, “Commonwealth of Massachusetts Virtual Schools—Funding and Amendment of Certificates for Greenfield Commonwealth Virtual School and for TEC Connections Academy Commonwealth Virtual School,” December 8, 2017, http://www.doe.mass.edu/bese/docs/fy2018/2017-12/item5.html; Christian Wade, “Virtual Schools Grow, along with Costs to Districts,” The Daily News of Newburyport, March 25, 2019, https://www.newburyportnews.com/news/regional_news/virtual-schools-grow-along-with-costs-to-districts/article_be168543-ae01-5bfa-8ac3-d3c6db09eb49.html.

  19. 19. Mark Dynarski, Roberto Agodini, Sheila Heaviside, Timothy Novak, Nancy Carey, Larissa Campuzano, Barbara Means, et al., “Effectiveness of Reading and Mathematics Software Products: Findings from the First Student Cohort,” National Center for Education Evaluation and Regional Assistance (2007); Saiying Steenbergen-Hu and Harris Cooper, “A Meta-analysis of the Effectiveness of Intelligent Tutoring Systems on K–12 Students’ Mathematical Learning,” Journal of Educational Psychology 105, no. 4 (2013): 970.

  20. 20. Eric Taylor, “New Technology and Teacher Productivity” (unpublished manuscript, January 2018), Cambridge, MA, Harvard Graduate School of Education, available at https://scholar.harvard.edu/files/erictaylor/files/technology-teachers-jan-18.pdf. Taylor’s literature review has an excellent overview and set of references for the history of evaluation of computer-assisted instruction.

  21. 21. John F. Pane, Beth Ann Griffin, Daniel F. McCaffrey, and Rita Karam, “Effectiveness of Cognitive Tutor Algebra I at Scale,” Educational Evaluation and Policy Analysis 36, no. 2 (2014): 127–144.

  22. 22. Interpreting effect sizes is the topic of vigorous discussion in education research. Cohen proposed a set of guidelines whereby a 0.2 standard deviation improvement would be considered small, 0.5 medium, and 0.8 large; in Jacob Cohen, Statistical Power Analysis for the Behavioral Sciences (New York: Academic Press, 1977). Since then, it has been more widely agreed in education research that these large effect sizes are almost never found in well-conducted randomized control trials. Lipsey argues for evaluating the magnitude of effect size based on prior research about gains in typical conditions; Carolyn J. Hill, Howard S. Bloom, Alison Rebeck Black, and Mark W. Lipsey, “Empirical Benchmarks for Interpreting Effect Sizes in Research,” Child Development Perspectives 2, no. 3 (2008): 172–177. Kraft suggests that based on more recent evidence, appropriate guidelines might be 0.05 standard deviation as small, 0.1 as medium, and 0.2 as large; Matthew A. Kraft, “Interpreting Effect Sizes of Education Interventions,” Educational Researcher (forthcoming), available at https://scholar.harvard.edu/files/mkraft/files/kraft_2018_interpreting_effect_sizes.pdf.

  23. 23. Steve Ritter, Michael Yudelson, Stephen E. Fancsali, and Susan R. Berman, “How Mastery Learning Works at Scale,” in Proceedings of the Third (2016) ACM Conference on Learning at Scale (Association of Computing Machinery Digital Library, 2016), 71–79.

  24. 24. Neil T. Heffernan and Cristina Lindquist Heffernan, “The ASSISTments Ecosystem: Building a Platform that Brings Scientists and Teachers Together for Minimally Invasive Research on Human Learning and Teaching,” International Journal of Artificial Intelligence in Education 24, no. 4 (2014): 470–497; Jeremy Roschelle, Mingyu Feng, Robert F. Murphy, and Craig A. Mason, “Online Mathematics Homework Increases Student Achievement,” AERA Open 2, no. 4 (2016), https://doi.org/10.1177/2332858416673968.

  25. 25. Early evidence of Teach to One was more positive; Douglas D. Ready, Katherine Conn, Elizabeth Park, and David Nitkin, “Year-One Impact and Process Results from the I3 Implementation of Teach to One: Math” (New York: Consortium for Policy Research Education, Teachers College, Columbia University, 2016), https://www.classsizematters.org/wp-content/uploads/2018/11/Ready-1st-year-Teach-to-One-Elizabeth-evaluation-Nov.-2016.pdf. The last study was Douglas D. Ready, Katherine Conn, Shani Bretas, and Iris Daruwala, “Final Impact Results from the i3 Implementation of Teach to One: Math” (New York: Consortium for Policy Research Education, Teachers College, Columbia University, 2019), https://www.newclassrooms.org/wp-content/uploads/2019/02/Final-Impact-Results-i3-TtO.pdf.

3. PEER-GUIDED LEARNING AT SCALE

  1. 1. For an example of where lines blur between algorithmic personalization and “whole child” learning, see Tom Vander Ark, “Chan Zuckerberg Backs Personalized Learning R&D Agenda,” Getting Smart, November 17, 2017, https://www.gettingsmart.com/2017/11/chan-zuckerberg-backs-personalized-learning-rd-agenda.

  2. 2. Antonio Fini, “The Technological Dimension of a Massive Open Online Course: The Case of the CCK08 Course Tools,” International Review of Research in Open and Distributed Learning 10, no. 5 (2009); David Cormier, “What Is a MOOC?,” YouTube video, December 8, 2010, https://www.youtube.com/watch?v=eW3gMGqcZQc. The CCK08 course had several antecedents, including courses developed by Alec Couros, David Wiley, and Cathy Davidson. See Cathy Davidson, “What Was the First MOOC?,” HASTAC, September 27, 2013, https://www.hastac.org/blogs/cathy-davidson/2013/09/27/what-was-first-mooc.

  3. 3. George Siemens, “Connectivism: A Learning Theory for the Digital Age,” International Journal of Instructional Technology and Distance Learning 2, no. 1 (2005), http://www.itdl.org/Journal/Jan_05/article01.htm; Stephen Downes, “Places to Go: Connectivism and Connective Knowledge,” Innovate: Journal of Online Education 5, no. 1 (2008): 6; David Weinberger, Too Big to Know: Rethinking Knowledge Now That the Facts Aren’t the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room (New York: Basic Books, 2011).

  4. 4. Detailed technical plans for developing these kinds of learning environments can be found at Kim Jaxon and Alan Levine, “Staking Your Claim: How the Open Web Is Won for Teaching and Learning,” University of California Irvine, 2017, http://connectedcourses.stateu.org/.

  5. 5. Downes, “Places to Go”; Fini, “The Technological Dimension.”

  6. 6. Stephen Downes, “The Rise of MOOCs,” April 23, 2012, https://www.downes.ca/cgi-bin/page.cgi?post=57911.

  7. 7. Jean Lave and Etienne Wenger, Situated Learning: Legitimate Peripheral Participation (Cambridge, UK: Cambridge University Press, 1991). For other connections from connectivism to prior pedagogical theory, see Rita Kop and Adrian Hill, “Connectivism: Learning Theory of the Future or Vestige of the Past?,” The International Review of Research in Open and Distributed Learning 9, no. 3 (2008), http://www.irrodl.org/index.php/irrodl/article/view/523/1103. For Downes on physicists, see Stephen Downes, “‘Connectivism’ and Connective Knowledge,” Huffington Post, January 5, 2011, https://www.huffpost.com/entry/connectivism-and-connecti_b_804653.

  8. 8. On DS106, see Howard Rheingold, “DS106: Enabling Open, Public, Participatory Learning,” Connected Learning Alliance, https://clalliance.org/resources/ds106-enabling-open-public-participatory-learning/. Alan Levine, “A MOOC or Not a MOOC: DS106 Questions the Form,” in Invasion of the MOOCs: The Promise and Perils of Massive Open Online Courses, eds. Steven D. Krause and Charles Lowe (Anderson, SC: Parlor Press, 2014), 29–38, available online at https://parlorpress.com/products/invasion-of-the-moocs. For the origins of edupunk, see Jim Groom, “The Glass Bees,” bavatuesdays, May 25, 2008, https://bavatuesdays.com/the-glass-bees/. Some history is also discussed in Anya Kamenetz, DIY U: Edupunks, Edupreneurs, and the Coming Transformation of Higher Education (New York: Chelsea Green, 2010).

  9. 9. A discussion of the Daily Create is in Abram Anders, “Theories and Applications of Massive Online Open Courses (MOOCs): The Case for Hybrid Design,” International Review of Research in Open and Distributed Learning 16, no. 6 (2015): 39–61.

  10. 10. W. Ian O’Byrne and Kristine E. Pytash, “Becoming Literate Digitally in a Digitally Literate Environment of Their Own,” Journal of Adolescent & Adult Literacy 60, no. 5 (2017): 499–504, http://thinq.studio/wp-content/uploads/2018/02/JAAL-article-Becoming-Literate-Digitally-Domain-of-OnesOwn.pdf.

  11. 11. I discuss some of my own efforts along these lines at Justin Reich, “Techniques for Unleashing Student Work from Learning Management Systems,” KQED Mindshift, February 13, 2015, https://www.kqed.org/mindshift/39332/techniques-for-unleashing-student-work-from-learning-management-systems.

  12. 12. Alexandra Juhasz and Anne Balsamo, “An Idea Whose Time Is Here: FemTechNet–A Distributed Online Collaborative Course (DOCC),” Ada: A Journal of Gender, New Media, and Technology 1, no. 1 (2012); Robinson Meyer, “5 Ways of Understanding the New, Feminist MOOC That’s Not a MOOC,” Atlantic, August 20, 2013, https://www.theatlantic.com/technology/archive/2013/08/5-ways-of-understanding-the-new-feminist-mooc-thats-not-a-mooc/278835/.

  13. 13. For the shift from the open web to walled gardens maintained by oligopolist technology companies, see David Weinberger, “The Internet That Was (and Still Could Be),” Atlantic, June 22, 2015, https://www.theatlantic.com/technology/archive/2015/06/medium-is-the-message-paradise-paved-internet-architecture/396227/. For similar patterns in higher education, see Jim Groom and Brian Lamb, “Reclaiming Innovation,” EDUCAUSE Review Online, May 13, 2014, https://www.educause.edu/visuals/shared/er/extras/2014/ReclaimingInnovation/default.html. As of 2019, the Connected Learning MOOC, CLMOOC, stands out, along with DS106, as one of the few remaining ongoing facilitated cMOOC learning experiences; see Chad Sansing, “Your Summer of Making and Connecting,” English Journal 103, no. 5 (2014): 81.

  14. 14. Mitchel Resnick, John Maloney, Andrés Monroy-Hernández, Natalie Rusk, Evelyn Eastmond, Karen Brennan, Amon Millner, Eric Rosenbaum, Jay Silver, Brian Silverman, and Yasmin Kafai, “Scratch: Programming for All,” Communications of the ACM 52, no. 11 (2009): 60–67.

  15. 15. Mitchel Resnick, Lifelong Kindergarten: Cultivating Creativity through Projects, Passion, Peers, and Play (Cambridge, MA: MIT Press, 2017).

  16. 16. Seymour Papert, Mindstorms: Children, Computers, and Powerful Ideas (New York: Basic Books, 1980).

  17. 17. For Scratch usage statistics, see https://scratch.mit.edu/statistics/.

  18. 18. Mizuko Ito, Kris Gutiérrez, Sonia Livingstone, Bill Penuel, Jean Rhodes, Katie Salen, Juliet Schor, Julian Sefton-Green, and S. Craig Watkins, Connected Learning: An Agenda for Research and Design (Irvine, CA: Digital Media and Learning Research Hub, 2013), https://dmlhub.net/wp-content/uploads/files/Connected_Learning_report.pdf, 7.

  19. 19. Mitchel Resnick, “Let’s Teach Kids to Code,” TEDx Beacon Street, November 2012, https://www.ted.com/talks/mitch_resnick_let_s_teach_kids_to_code/transcript?language=en.

  20. 20. Mitchel Resnick, “The Next Generation of Scratch Teaches Much More Than Coding,” EdSurge, January 3, 2019, https://www.edsurge.com/news/2019-01-03-mitch-resnick-the-next-generation-of-scratch-teaches-more-than-coding.

  21. 21. Jal Mehta and Sarah Fine, In Search of Deeper Learning (Cambridge, MA: Harvard University Press, 2019).

  22. 22. Ito et al., Connected Learning.

  23. 23. Resnick et al., “Scratch”; Papert, Mindstorms.

  24. 24. Paul A. Kirschner, John Sweller, and Richard E. Clark, “Why Minimal Guidance during Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching,” Educational Psychologist 41, no. 2 (2006): 75–86.

  25. 25. Sigal Samuel, “Canada’s ‘Incel Attack’ and Its Gender-Based Violence Problem,” Atlantic, April 28, 2018, https://www.theatlantic.com/international/archive/2018/04/toronto-incel-van-attack/558977/.

  26. 26. danah boyd, “Media Manipulation, Strategic Amplification, and Responsible Journalism,” Points, September 14, 2018, https://points.datasociety.net/media-manipulation-strategic-amplification-and-responsible-journalism-95f4d611f462.

  27. 27. Alice Marwick and Rebecca Lewis, “Media Manipulation and Disinformation Online,” Data and Society, May 5, 2017, https://datasociety.net/library/media-manipulation-and-disinfo-online/.

  28. 28. Zeynep Tufekci, “YouTube: The Great Radicalizer,” New York Times, March 10, 2018.

  29. 29. Richard Hofstadter, The Paranoid Style in American Politics (New York: Vintage, 2012). On Lewin, see MIT News Office, “MIT Indefinitely Removes Online Physics Lectures and Courses by Walter Lewin,” December 8, 2014, http://news.mit.edu/2014/lewin-courses-removed-1208.

  30. 30. On confusion in cMOOCs, see Rita Kop, “The Challenges to Connectivist Learning on Open Online Networks: Learning Experiences during a Massive Open Online Course,” The International Review of Research in Open and Distributed Learning 12, no. 3 (2011): 19–38; Colin Milligan, Allison Littlejohn, and Anoush Margaryan, “Patterns of Engagement in Connectivist MOOCs,” Journal of Online Learning and Teaching 9, no. 2 (2013): 149–159. For “getting stuck” in classrooms in Scratch and solutions for educators, see Paulina Haduong and Karen Brennan, “Helping K–12 Teachers Get Unstuck with Scratch: The Design of an Online Professional Learning Experience,” in Proceedings of the 50th ACM Technical Symposium on Computer Science Education (Association for Computing Machinery Digital Library, 2019), 1095–1101; Karen Brennan, “Beyond Right or Wrong: Challenges of Including Creative Design Activities in the Classroom,” Journal of Technology and Teacher Education 23, no. 3 (2015): 279–299.

  31. 31. Anton Barua, Stephen W. Thomas, and Ahmed E. Hassan, “What Are Developers Talking About? An Analysis of Topics and Trends in Stack Overflow,” Empirical Software Engineering 19, no. 3 (2014): 619–654.

4. TESTING THE GENRES OF LEARNING AT SCALE

  1. 1. Games researcher Jane McGonigal estimated in 2010 that people across the world played 3 billion hours of video games per week. Jane McGonigal, “Gaming Can Make a Better World,” TED talk, February 2010, https://www.ted.com/talks/jane_mcgonigal_gaming_can_make_a_better_world.

  2. 2. See, for instance, Larry Johnson, Samantha Adams Becker, Victoria Estrada, and Alex Freeman, NMC Horizon Report: 2014 K–12 Edition (Austin, TX: New Media Consortium, 2014), https://files.eric.ed.gov/fulltext/ED559369.pdf.

  3. 3. For games as powerful sites of learning, see James Gee, What Video Games Have to Teach Us about Learning and Literacy (New York: St. Martin’s Press, 2007).

  4. 4. Federal Trade Commission, Lumosity to Pay $2 Million to Settle FTC Deceptive Advertising Charges for Its “Brain Training” Program, January 5, 2016, https://www.ftc.gov/news-events/press-releases/2016/01/lumosity-pay-2-million-settle-ftc-deceptive-advertising-charges.

  5. 5. Daniel J. Simons, Walter R. Boot, Neil Charness, Susan E. Gathercole, Christopher F. Chabris, David Z. Hambrick, and Elizabeth A. L. Stine-Morrow, “Do ‘Brain-Training’ Programs Work?,” Psychological Science in the Public Interest 17, no. 3 (2016): 103–186. Thomas S. Redick, Zach Shipstead, Elizabeth A. Wiemers, Monica Melby-Lervåg, and Charles Hulme, “What’s Working in Working Memory Training? An Educational Perspective,” Educational Psychology Review 27, no. 4 (2015): 617–633.

  6. 6. Robert S. Woodworth and E. L. Thorndike, “The Influence of Improvement in One Mental Function upon the Efficiency of Other Functions (I),” Psychological Review 8, no. 3 (1901): 247. For a recent piece on the origins of transfer research, see Daniel Willingham, “Critical Thinking: Why Is It So Hard to Teach?,” American Educator, Summer 2007, https://www.aft.org/sites/default/files/periodicals/Crit_Thinking.pdf.

  7. 7. Giovanni Sala and Fernand Gobet, “Does Far Transfer Exist? Negative Evidence from Chess, Music, and Working Memory Training,” Current Directions in Psychological Science 26, no. 6 (2017): 515–520. The research on memory and chess is quite extensive. An important early contribution is William G. Chase and Herbert A. Simon, “Perception in Chess,” Cognitive Psychology 4, no. 1 (1973): 55–81. A more recent study is Yanfei Gong, K. Anders Ericsson, and Jerad H. Moxley, “Recall of Briefly Presented Chess Positions and Its Relation to Chess Skill,” PLoS ONE 10, no. 3 (2015), https://doi.org/10.1371/journal.pone.0118756. Giovanni Sala and Fernand Gobet, “Experts’ Memory Superiority for Domain-Specific Random Material Generalizes across Fields of Expertise: A Meta-analysis,” Memory & Cognition 45, no. 2 (2017): 183–193.

  8. 8. Douglas B. Clark, Emily E. Tanner-Smith, and Stephen S. Killingsworth, “Digital Games, Design, and Learning: A Systematic Review and Meta-analysis,” Review of Educational Research 86, no. 1 (2016): 79–122. Pieter Wouters, Christof van Nimwegen, Herre van Oostendorp, and Erik D. van der Spek, “A Meta-analysis of the Cognitive and Motivational Effects of Serious Games,” Journal of Educational Psychology 105, no. 2 (2013): 249–265, https://doi.org/10.1037/a0031311.

  9. 9. Brenda Laurel, Utopian Entrepreneur (Cambridge, MA: MIT Press, 2001).

  10. 10. David B. Tyack and Larry Cuban, Tinkering toward Utopia (Cambridge, MA: Harvard University Press, 1995).

  11. 11. Frederic Lardinois, “Duolingo Hires Its First Chief Marketing Officer as Active User Numbers Stagnate but Revenue Grows,” Techcrunch, August 1, 2018, https://techcrunch.com/2018/08/01/duolingo-hires-its-first-chief-marketing-officer-as-active-user-numbers-stagnate/.

  12. 12. Hermann Ebbinghaus, Memory, trans. H. A. Ruger and C. E. Bussenius (New York: Teachers College, 1913); Nicholas Cepeda, Harold Pashler, Edward Vul, John T. Wixted, and Doug Rohrer, “Distributed Practice in Verbal Recall Tasks: A Review and Quantitative Synthesis,” Psychological Bulletin 132, no. 3 (2006): 354.

  13. 13. Burr Settles and Brendan Meeder, “A Trainable Spaced Repetition Model for Language Learning,” in Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, vol. 1, Long Papers (Stroudsburg, PA: Association for Computational Linguistics, 2016), 1848–1858; Roumen Vesselinov and John Grego, “Duolingo Effectiveness Study” (City University of New York, 2012), https://s3.amazonaws.com/duolingo-papers/other/vesselinov-grego.duolingo12.pdf.

  14. 14. Eric Klopfer, Jason Haas, Scot Osterweil, and Louisa Rosenheck, Resonant Games (Cambridge, MA: MIT Press, 2018).

  15. 15. Alex Calhoun, “Vanished Helps Kids Save the Future with Science,” Wired, November 28, 2011, https://www.wired.com/2011/11/vanished-helps-kids-save-the-future-with-science/; Eric Klopfer, Jason Haas, Scot Osterweil, and Louisa Rosenheck, “I Wish I Could Go On Here Forever,” in Resonant Games: Design Principles for Learning Games that Connect Hearts, Minds, and the Everyday (Cambridge, MA: MIT Press, 2018), https://www.resonant.games/pub/8w3uihyo.

  16. 16. For an introduction to the game and an optimistic take on educational possibilities, see Steve Nebel, Sascha Schneider, and Günter Daniel Rey, “Mining Learning and Crafting Scientific Experiments: A Literature Review on the Use of Minecraft in Education and Research,” Journal of Educational Technology & Society 19, no. 2 (2016): 355. Sales and user participation data at https://en.wikipedia.org/wiki/Minecraft.

  17. 17. Stampylonghead’s Minecraft introductions can be found at https://www.youtube.com/watch?v=cMsQlTkpQMM&list=UUj5i58mCkAREDqFWlhaQbOw.

  18. 18. Katie Salen, “10 Life Skills Parents Can Nurture through Minecraft,” Connected Camps Blog, September 16, 2017, https://blog.connectedcamps.com/10-life-skills-parents-nurture-in-minecraft/.

  19. 19. Samantha Jamison, “Computer Game a Building Block for Engineers,” Carnegie Mellon University News, July 26, 2017, https://www.cmu.edu/news/stories/archives/2017/july/minecraft-course.html; Sarah Guthals and Beth Simon, “Minecraft, Coding, and Teaching,” https://www.edx.org/course/minecraft-coding-and-teaching.

  20. 20. For some recent research on Zoombinis gameplay, see Elizabeth Rowe, Jodi Asbell-Clarke, Santiago Gasca, and Kathryn Cunningham, “Assessing Implicit Computational Thinking in Zoombinis Gameplay,” in Proceedings of the 12th International Conference on the Foundations of Digital Games (Association for Computing Machinery, 2017), https://par.nsf.gov/servlets/purl/10061931. The Wikipedia page for Zoombinis provides a helpful overview of the game: https://en.wikipedia.org/wiki/Logical_Journey_of_the_Zoombinis. See also Chris Hancock and Scot Osterweil, “Zoombinis and the Art of Mathematical Play,” Hands On! Spring 1996, https://external-wiki.terc.edu/download/attachments/41419448/zoombinisandmathplay.pdf?api=v2; and Eric Klopfer, Jason Haas, Scot Osterweil, and Louisa Rosenheck, “In a Game, You Can Be Whoever You Want,” in Resonant Games: Design Principles for Learning Games that Connect Hearts, Minds, and the Everyday (Cambridge, MA: MIT Press, 2018), https://www.resonant.games/pub/wkrjlp3o.

5. THE CURSE OF THE FAMILIAR

  1. 1. Judith Haymore Sandholtz, Cathy Ringstaff, and David C. Dwyer, Teaching with Technology: Creating Student-Centered Classrooms (New York: Teachers College Press, 1997).

  2. 2. A comment on Class Central, a review site for MOOCs, captures some of the critiques of the Neuroscience course: “Material & topic seemed interesting. Kind of silly way of presenting it, in my opinion, but quality of the videos was frankly good. Instructor tried to be funny too hard. I dropped; too much posh design & annoying music as background distraction for me in a neuroscience course,” https://www.classcentral.com/course/edx-fundamentals-of-neuroscience-part-1-the-electrical-properties-of-the-neuron-942. For a response to the critiques of DALMOOC, see George Siemens, “Students Need to Take Ownership of Their Own Learning,” Online Educa Berlin, November 19, 2014, https://oeb.global/oeb-insights/george-siemens-moocs-elearning-online-educa-berlin/. For more on the “two-track” DALMOOC, see Shane Dawson, Srecko Joksimović, Vitomir Kovanović, Dragan Gašević, and George Siemens, “Recognising Learner Autonomy: Lessons and Reflections from a Joint x / c MOOC,” Proceedings of Higher Education Research and Development Society of Australia 2015 (2015).

  3. 3. Tom Page, “Skeuomorphism or Flat Design: Future Directions in Mobile Device User Interface (UI) Design Education,” International Journal of Mobile Learning and Organisation 8, no. 2 (2014): 130–142; David Oswald and Steffen Kolb, “Flat Design vs. Skeuomorphism–Effects on Learnability and Image Attributions in Digital Product Interfaces,” in DS 78: Proceedings of the 16th International Conference on Engineering and Product Design Education (Design Education and Human Technology Relations, University of Twente, The Netherlands, May 4, 2014), 402–407.

  4. 4. Matthew Glotzbach, “A New Monthly Milestone for Quizlet: 50 Million Monthly Learners,” Quizlet Blog, October 29, 2018, https://quizlet.com/blog/a-new-milestone-for-quizlet-50-million-monthly-learners.

  5. 5. One useful source for understanding why structural elements of schools are so inimical to new innovations is David K. Cohen and Jal D. Mehta, “Why Reform Sometimes Succeeds: Understanding the Conditions That Produce Reforms That Last,” American Educational Research Journal 54, no. 4 (2017): 644–690.

  6. 6. From 2012 to 2018, Audrey Watters tracked and commented on venture investments in the edtech sector. For the 2018 example, see Audrey Watters, “The Business of ‘EdTech Trends,’” Hack Education, December 31, 2018, http://hackeducation.com/2018/12/31/top-ed-tech-trends-money. For another critical view of venture capitalism and philanthropy in education, see Ben Williamson, “Silicon Startup Schools: Technocracy, Algorithmic Imaginaries and Venture Philanthropy in Corporate Education Reform,” Critical Studies in Education 59, no. 2 (2018): 218–236.

  7. 7. John B. Diamond, “Where the Rubber Meets the Road: Rethinking the Connection between High-Stakes Testing Policy and Classroom Instruction,” Sociology of Education 80, no. 4 (2007): 285–313; Christopher Lynnly Hovey, Lecia Barker, and Vaughan Nagy, “Survey Results on Why CS Faculty Adopt New Teaching Practices,” in Proceedings of the 50th ACM Technical Symposium on Computer Science Education (Association for Computing Machinery Digital Library, 2019), 483–489.

  8. 8. Seymour Papert, Mindstorms: Children, Computers, and Powerful Ideas (New York: Basic Books, 1980); Mitchel Resnick, Brad Myers, Kumiyo Nakakoji, Ben Shneiderman, Randy Pausch, Ted Selker, and Mike Eisenberg, “Design Principles for Tools to Support Creative Thinking,” National Science Foundation workshop on Creativity Support Tools (Washington, DC, 2005), http://www.cs.umd.edu/hcil/CST/Papers/designprinciples.htm; Moran Tsur and Natalie Rusk, “Scratch Microworlds: Designing Project-Based Introductions to Coding,” in Proceedings of the 49th ACM Technical Symposium on Computer Science Education (Association for Computing Machinery Digital Library, 2018), 894–899.

  9. 9. Natalie Rusk and Massachusetts Institute of Technology Media Laboratory, Scratch Coding Cards: Creative Coding Activities for Kids (San Francisco: No Starch Press, 2017).

  10. 10. Tsur and Rusk, “Scratch Microworlds”; Philipp Schmidt, Mitchel Resnick, and Natalie Rusk, “Learning Creative Learning: How We Tinkered with MOOCs,” P2PU, http://reports.p2pu.org/learning-creative-learning/.

  11. 11. Karen Brennan, Sarah Blum-Smith, and Maxwell M. Yurkofsky, “From Checklists to Heuristics: Designing MOOCs to Support Teacher Learning,” Teachers College Record 120, no. 9 (2018): n9; Paulina Haduong and Karen Brennan, “Getting Unstuck: New Resources for Teaching Debugging Strategies in Scratch,” in Proceedings of the 49th ACM Technical Symposium on Computer Science Education, 1092 (Association for Computing Machinery Digital Library, 2018), https://doi.org/10.1145/3159450.3162248; see also https://gettingunstuck.gse.harvard.edu/about.html.

  12. 12. For an example of Desmos integration into College Board tests, see https://digitaltesting.collegeboard.org/pdf/about-desmos-calculator.pdf. For an example of integration into the Smarter Balanced consortium tests, see Tony Wan, “Desmos Passes the Smarter Balanced Tests (and Hopes to Save Students $100),” EdSurge, May 8, 2017, https://www.edsurge.com/news/2017-05-08-desmos-passes-the-smarter-balanced-test-and-hopes-to-save-math-students-100.

  13. 13. Dan Meyer, “Math Class Needs a Makeover,” TED talk (2010), https://www.ted.com/talks/dan_meyer_math_class_needs_a_makeover?language=en; “The Three Acts of a Mathematical Story,” Mr. Meyer (blog), May 11, 2011, http://blog.mrmeyer.com/2011/the-three-acts-of-a-mathematical-story/; “Missing the Promise of Mathematical Modeling,” Mathematics Teacher 108, no. 8 (2015): 578–583.

  14. 14. Dan Meyer, “Video and Multiple Choice: What Took Us So Long?” Desmos (blog), October 27, 2016, https://blog.desmos.com/articles/video-multiple-choice-what-took-us-so-long/.

  15. 15. Ibid.

  16. 16. Larry Cuban, Hugging the Middle: How Teachers Teach in an Era of Testing and Accountability (New York: Teachers College, Columbia University, 2009); Larry Cuban, Inside the Black Box of Classroom Practice: Change without Reform in American Education (Cambridge, MA: Harvard Education Press, 2013).

  17. 17. Cohen and Mehta, “Why Reform.”

  18. 18. For the connections between Dewey’s pragmatism and Web 2.0, see Michael Glassman and Min Ju Kang, “The Logic of Wikis: The Possibilities of the Web 2.0 Classroom,” International Journal of Computer-Supported Collaborative Learning 6, no. 1 (2011): 93–112, https://doi.org/10.1007/s11412-011-9107-y. Another example of the idea of technology as the tool that allows Dewey’s pedagogy to come to life is Chris Lehmann, “The True Promise of Technology,” Whole Child Blog, February 25, 2011, http://www.wholechildeducation.org/blog/the-true-promise-of-technology.

6. THE EDTECH MATTHEW EFFECT

  1. 1. This chapter is adapted from Justin Reich and Mizuko Ito, From Good Intentions to Real Outcomes: Equity by Design in Learning Technologies (Irvine, CA: Digital Media and Learning Research Hub, 2017), https://clalliance.org/wp-content/uploads/2017/11/GIROreport_1031.pdf, which was released under a Creative Commons 3.0 license. I am particularly indebted to Mimi for the “three myths” discussed later in the chapter. When tablet computers were first released, I wrote a book with Tom Daccord highlighting how iPads might easily be used to extend existing practices and exploring what some of these pockets of excellence looked like then and could look like in the future; see Tom Daccord and Justin Reich, iPads in the Classroom: From Consumption and Curation to Creation (Dorchester, MA: EdTech Teacher, 2014); and Tom Daccord and Justin Reich, “How to Transform Teaching with Tablets,” Educational Leadership 72, no. 8 (2015): 18–23.

  2. 2. Matthew 25:29 (New International Version).

  3. 3. Larry Cuban, Teachers and Machines: The Classroom Use of Technology since 1920 (New York: Teachers College Press, 1986), 23.

  4. 4. Jeannie Oakes, Keeping Track: How Schools Structure Inequality (New Haven: Yale University Press, 2005).

  5. 5. Paul Attewell, “Comment: The First and Second Digital Divides,” Sociology of Education 74, no. 3 (2001): 252–259.

  6. 6. Harold Wenglinsky, “Does It Compute? The Relationship between Educational Technology and Student Achievement in Mathematics” (Princeton, NJ: Educational Testing Service, 1998), https://www.ets.org/research/policy_research_reports/publications/report/1998/cneu, 3.

  7. 7. Ulrich Boser, “Are Schools Getting a Big Enough Bang for Their Education Buck?,” Center for American Progress Blog, June 14, 2013, https://www.americanprogress.org/issues/education-k-12/reports/2013/06/14/66485/are-schools-getting-a-big-enough-bang-for-their-education-technology-buck; Matthew H. Rafalow, “Disciplining Play: Digital Youth Culture as Capital at School,” American Journal of Sociology 123, no. 5 (2018): 1416–1452.

  8. 8. I describe the hopes for education in the Web 2.0 era in “Reworking the Web, Reworking the World: How Web 2.0 Is Changing Our Society,” Beyond Current Horizons (2008), https://edarxiv.org/hqme5/, https://doi.org/10.35542/osf.io/hqme5. Another perspective is Christine Greenhow, Beth Robelia, and Joan E. Hughes, “Learning, Teaching, and Scholarship in a Digital Age: Web 2.0 and Classroom Research: What Path Should We Take Now?,” Educational Researcher 38, no. 4 (2009): 246–259.

  9. 9. Justin Reich, Richard Murnane, and John Willett, “The State of Wiki Usage in US K–12 Schools: Leveraging Web 2.0 Data Warehouses to Assess Quality and Equity in Online Learning Environments,” Educational Researcher 41, no. 1 (2012): 7–15.

  10. 10. John D. Hansen and Justin Reich, “Democratizing Education? Examining Access and Usage Patterns in Massive Open Online Courses,” Science 350, no. 6265 (2015): 1245–1248.

  11. 11. Justin Reich, “The Digital Fault Line: Background,” EdTech Researcher, May 3, 2013, https://blogs.edweek.org/edweek/edtechresearcher/2013/05/the_digital_fault_line_background.html; S. Craig Watkins and Alexander Cho, The Digital Edge: How Black and Latino Youth Navigate Digital Inequality (New York: NYU Press, 2018); Vikki S. Katz and Michael H. Levine, “Connecting to Learn: Promoting Digital Equity among America’s Hispanic Families,” Joan Ganz Cooney Center at Sesame Workshop, 2015, https://eric.ed.gov/?id=ED555584; Vikki S. Katz, Meghan B. Moran, and Carmen Gonzalez, “Connecting with Technology in Lower-Income US Families,” New Media & Society 20, no. 7 (2018): 2509–2533.

  12. 12. René F. Kizilcec, Andrew J. Saltarelli, Justin Reich, and Geoffrey L. Cohen, “Closing Global Achievement Gaps in MOOCs,” Science 355, no. 6322 (2017): 251–252.

  13. 13. Kizilcec et al., “Closing.”

  14. 14. René F. Kizilcec, Justin Reich, Michael Yeomans, Christoph Dann, Emma Brunskill, Glenn Lopez, Selen Turkay, Joseph Williams, and Dustin Tingley, “Scaling Up Behavioral Science Interventions in Online Education,” in Proceedings of the National Academy of Sciences (forthcoming).

  15. 15. Kizilcec et al., “Closing.”

  16. 16. Tressie McMillan Cottom, “Intersectionality and Critical Engagement with the Internet” (February 10, 2015). Available at SSRN, https://ssrn.com/abstract=2568956, 9.

  17. 17. Reich and Ito, “From Good Intentions.”

  18. 18. Betsy James DiSalvo, “Glitch Game Testers: The Design and Study of a Learning Environment for Computational Production with Young African American Males,” PhD diss., Georgia Institute of Technology, 2012; Betsy James DiSalvo, Mark Guzdial, Tom McKlin, Charles Meadows, Kenneth Perry, Corey Steward, and Amy Bruckman, “Glitch Game Testers: African American Men Breaking Open the Console,” in Proceedings of the 2009 DiGRA International Conference: Breaking New Ground: Innovation in Games, Play, Practice and Theory, http://www.digra.org/digital-library/publications/glitch-game-testers-african-american-men-breaking-open-the-console/.

  19. 19. Digital Promise, IT Best Practices Toolkits: Student Tech Teams, 2018, https://verizon.digitalpromise.org/toolkit/student-tech-teams/.

  20. 20. Ricarose Roque, “Family Creative Learning,” Makeology: Makerspaces as Learning Environments 1 (2016): 47. For research on Scratchers and their parents, see Karen Brennan and Mitchel Resnick, “Imagining, Creating, Playing, Sharing, Reflecting: How Online Community Supports Young People as Designers of Interactive Media,” in Emerging Technologies for the Classroom, eds. Chrystalla Mouza and Nancy Lavigne (New York: Springer, 2013), 253–268. In the One Laptop Per Child deployment in Paraguay, similar evidence emerged that constructionist tools like Scratch were primarily taken up in homes where parents could support learning about computing; see Morgan Ames, The Charisma Machine: The Life, Death, and Legacy of One Laptop per Child (Cambridge, MA: MIT Press, 2019).

  21. 21. See Tech Goes Home, 2018 Annual Report, https://static.wixstatic.com/ugd/6377ee_1a8d7ab992c94c3da08f0dc4a5d56e49.pdf.

  22. 22. Sarah Kessler, “How Jim McKelvey’s Launchcode Is Helping Unconventional Tech Talent,” Fast Company, April 18, 2016, https://www.fastcompany.com/3058467/how-jim-mckelveys-launchcode-is-helping-unconventional-tech-talent; Carl Straumsheim, “One Course, Three Flavors,” Inside Higher Ed, January 21, 2014, https://www.insidehighered.com/news/2014/01/21/harvard-u-experiments-three-versions-same-course.

  23. 23. Mizuko Ito, Kris Gutiérrez, Sonia Livingstone, Bill Penuel, Jean Rhodes, Katie Salen, Juliet Schor, Julian Sefton-Green, and S. Craig Watkins, “Connected Learning: An Agenda for Research and Design” (Irvine, CA: Digital Media and Learning Research Hub, 2013), https://dmlhub.net/wp-content/uploads/files/Connected_Learning_report.pdf.

  24. 24. Moran Tsur and Natalie Rusk, “Scratch Microworlds: Designing Project-Based Introductions to Coding,” in Proceedings of the 49th ACM Technical Symposium on Computer Science Education (Association for Computing Machinery Digital Library, 2018), 894–899.

  25. 25. Nichole Pinkard, Sheena Erete, Caitlin K. Martin, and Maxine McKinney de Royston, “Digital Youth Divas: Exploring Narrative-Driven Curriculum to Spark Middle School Girls’ Interest in Computational Activities,” Journal of the Learning Sciences 26, no. 3 (2017): 477–516.

  26. 26. Rebecca Pitt, “Mainstreaming Open Textbooks: Educator Perspectives on the Impact of OpenStax College Open Textbooks,” International Review of Research in Open and Distributed Learning 16, no. 4 (2015); David Ruth, “OpenStax Announces Top 10 Schools That Have Adopted Free College Textbooks,” OpenStax, February 21, 2019, https://openstax.org/press/openstax-announces-top-10-schools-have-adopted-free-college-textbooks.

  27. 27. Benjamin L. Castleman and Lindsay C. Page, Summer Melt: Supporting Low-Income Students through the Transition to College (Cambridge, MA: Harvard Education Press, 2014).

  28. 28. Benjamin L. Castleman and Lindsay C. Page, “Summer Nudging: Can Personalized Text Messages and Peer Mentor Outreach Increase College Going among Low-Income High School Graduates?,” Journal of Economic Behavior & Organization 115 (2015): 144–160; Benjamin L. Castleman and Lindsay C. Page, “A Response to ‘Texting Nudges Harm Degree Completion,’ ” EducationNext, January 28, 2019, https://www.educationnext.org/response-texting-nudges-harm-degree-completion/.

7. THE TRAP OF ROUTINE ASSESSMENT

  1. 1. An early version of the argument in this chapter is in Justin Reich, “Will Computers Ever Replace Teachers?,” New Yorker: Elements, July 8, 2014, https://www.newyorker.com/tech/annals-of-technology/will-computers-ever-replace-teachers.

  2. 2. One of the most readable summaries of how computers are changing labor market demands is Frank Levy and Richard J. Murnane, Dancing with Robots: Human Skills for Computerized Work (Washington, DC: Third Way NEXT, 2013). Two of the researchers who have continued research along similar lines are David Autor and David Deming; see Autor’s Work of the Past, Work of the Future (Cambridge, MA: National Bureau of Economic Research, 2019); and Deming’s “The Growing Importance of Social Skills in the Labor Market,” Quarterly Journal of Economics 132, no. 4 (2017): 1593–1640. See also Morgan R. Frank, David Autor, James E. Bessen, Erik Brynjolfsson, Manuel Cebrian, David J. Deming, Maryann Feldman, et al., “Toward Understanding the Impact of Artificial Intelligence on Labor,” Proceedings of the National Academy of Sciences 116, no. 4 (2019): 6531–6539.

  3. 3. Levy and Murnane, “Dancing with Robots.”

  4. 4. Frank Levy and Richard J. Murnane, The New Division of Labor: How Computers Are Creating the Next Job Market (Princeton, NJ: Princeton University Press, 2005).

  5. 5. David H. Autor, Frank Levy, and Richard J. Murnane, “The Skill Content of Recent Technological Change: An Empirical Exploration,” The Quarterly Journal of Economics 118, no. 4 (2003): 1279–1333; Deming, “Social Skills.”

  6. 6. On various skill frameworks, see Chris Dede, “Comparing Frameworks for 21st Century Skills,” in 21st Century Skills: Rethinking How Students Learn, eds. James Bellanca and Ron Brandt (Bloomington, IN: Solution Tree Press, 2010), 51–76.

  7. 7. Dana Remus and Frank S. Levy, “Can Robots Be Lawyers?,” Computers, Lawyers, and the Practice of Law (November 27, 2016), available at SSRN: https://ssrn.com/abstract=2701092 or http://dx.doi.org/10.2139/ssrn.2701092.

  8. 8. Levy and Murnane, “New Division.”

  9. 9. Henry I. Braun and Robert Mislevy, “Intuitive Test Theory,” Phi Delta Kappan 86, no. 7 (2005): 488–497.

  10. 10. Daniel M. Koretz, Measuring Up (Cambridge, MA: Harvard University Press, 2008).

  11. 11. Brian Dear, The Friendly Orange Glow: The Untold Story of the PLATO System and the Dawn of Cyberculture (New York: Pantheon, 2017).

  12. 12. R. A. Avner and Paul Tenczar, “The TUTOR Manual” (US Department of Education, Education Resources Information Center [ERIC], 1970), https://eric.ed.gov/?id=ED050583.

  13. 13. See Common Core State Standards Initiative, http://www.corestandards.org/.

  14. 14. For another argument in favor of focusing education on what computers can’t do, see Conrad Wolfram, “Teaching Kids Real Math with Computers,” TED talk (2010), https://www.ted.com/talks/conrad_wolfram_teaching_kids_real_math_with_computers/transcript?language=en.

  15. 15. Unsupervised machine learning, in which algorithms classify documents into clusters based on feature similarity, is another branch of machine learning that thus far has had limited application to assessment. With colleagues, I’ve proposed ways of using natural language processing and unsupervised machine learning to aid human assessment; see Justin Reich, Dustin Tingley, Jetson Leder-Luis, Margaret E. Roberts, and Brandon Stewart, “Computer-Assisted Reading and Discovery for Student Generated Text in Massive Open Online Courses,” Journal of Learning Analytics 2, no. 1 (2015): 156–184. Researchers at the University of Illinois at Urbana-Champaign have proposed using unsupervised learning models as inputs to supervised algorithms for assessment; see Saar Kuzi, William Cope, Duncan Ferguson, Chase Geigle, and ChengXiang Zhai, “Automatic Assessment of Complex Assignments Using Topic Models,” Proceedings of the 2019 ACM Learning at Scale Conference (Association for Computing Machinery Digital Library, 2019), 1–10.
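  The clustering idea behind these approaches can be sketched briefly. The following is a minimal, hypothetical illustration, not code from any of the studies cited above: it assumes scikit-learn and invented forum posts, and simply groups learner-written text by TF-IDF feature similarity so that a human reviewer, rather than an autograder, examines each cluster.

```python
# Minimal illustration (not from the cited studies): cluster learner-written
# posts by feature similarity so a human can review groups instead of items.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical forum posts from an online course.
posts = [
    "I keep getting a syntax error in my loop on the third problem",
    "My for loop never stops running, any debugging advice?",
    "The reading on neurons and membranes was fascinating this week",
    "Loved the lecture on action potentials",
]

# Represent each post as a TF-IDF feature vector.
vectors = TfidfVectorizer(stop_words="english").fit_transform(posts)

# Group the posts into two clusters based on vector similarity.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
print(labels)  # e.g., the two debugging posts land in one cluster, the two content posts in the other
```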

  16. 16. For an overview of machine-evaluated pronunciation, see Silke M. Witt, “Automatic Error Detection in Pronunciation Training: Where We Are and Where We Need to Go,” Proceedings of the International Symposium on Automatic Detection on Errors in Pronunciation Training (2012): 1–8.

  17. 17. For a paper that offers some sense of the amount of data required for various pronunciation training tasks, see Wenping Hu, Yao Qian, and Frank K. Soong, “A New DNN-based High Quality Pronunciation Evaluation for Computer-Aided Language Learning (CALL),” Interspeech (2013): 1886–1890, https://pdfs.semanticscholar.org/ef29/bfcf0fcf71496b2c6a09ae415010c5d7a2dc.pdf.

  18. 18. For an optimistic argument for the state of automated essay grading, see Mark D. Shermis, “State-of-the-Art Automated Essay Scoring: Competition, Results, and Future Directions from a United States Demonstration,” Assessing Writing 20 (2014): 53–76. For a more pessimistic view, see Les Perelman, “When ‘the State of the Art’ Is Counting Words,” Assessing Writing 21 (2014): 104–111.

  19. 19. Shermis, “State-of-the-Art.”

  20. 20. Perelman, “State”; Randy Elliot Bennett, “The Changing Nature of Educational Assessment,” Review of Research in Education 39, no. 1 (2015): 370–407, https://doi.org/10.3102/0091732X14554179.

  21. 21. Quotations from https://secureservercdn.net/45.40.149.159/b56.e17.myftpupload.com/wp-content/uploads/2019/12/R.pdf. As cited in Les Perelman, “Babel Generator” (n.d.), http://lesperelman.com/writing-assessment-robo-grading/babel-generator/.

  22. 22. Steven Kolowich, “Writing Instructor, Skeptical of Automated Grading, Pits Machine vs. Machine,” Chronicle of Higher Education 28 (2014), https://www.chronicle.com/article/Writing-Instructor-Skeptical/146211.

  23. 23. The excerpt is found in Audrey Watters, Teaching Machines (Cambridge, MA: MIT Press, forthcoming), and the quotation is from “Exams by Machinery,” Ohio State University Monthly (May 1931): 339. The quotation is also cited in Stephen Petrina, “Sidney Pressey and the Automation of Education, 1924–1934,” Technology and Culture 45, no. 2 (2004): 305–330.

  24. 24. Harold Abelson, Gerald Jay Sussman, and Julie Sussman, Structure and Interpretation of Computer Programs (Cambridge, MA: MIT Press, 1996), xxii.

  25. 25. Jennifer French, Martin A. Segado, and Phillip Z. Ai, “Sketching Graphs in a Calculus MOOC: Preliminary Results,” in Frontiers in Pen and Touch: Impact of Pen and Touch Technology on Education (Cham, Switzerland: Springer, 2017), 93–102.

  26. 26. Valerie Jean Shute and Matthew Ventura, Stealth Assessment: Measuring and Supporting Learning in Video Games (Cambridge, MA: MIT Press, 2013); Jennifer S. Groff, “The Potentials of Game-Based Environments for Integrated, Immersive Learning Data,” European Journal of Education 53, no. 2 (2018): 188–201.

  27. 27. Groff, “The Potentials of Game-Based Environments for Integrated, Immersive Learning Data.”

  28. 28. For one example of a rapidly developing AI / machine-learning-based technological system in the game of Go, see David Silver, Aja Huang, Chris J. Maddison, Arthur Guez, Laurent Sifre, George Van Den Driessche, Julian Schrittwieser, et al., “Mastering the Game of Go with Deep Neural Networks and Tree Search,” Nature 529, no. 7587 (2016): 484.

  29. 29. David Pogue, “I’ll Have My AI Call Your AI,” Scientific American 319, no. 2 (2018): 26, https://www.scientificamerican.com/article/googles-duplex-ai-scares-some-people-but-i-cant-wait-for-it-to-become-a-thing/.

  30. 30. An early description of edX efforts at autograding is Piotr Mitros, Vikas Paruchuri, John Rogosic, and Diana Huang, “An Integrated Framework for the Grading of Freeform Responses,” in The Sixth Conference of MIT’s Learning International Networks Consortium, 2013, https://linc.mit.edu/linc2013/proceedings/Session3/Session3Mit-Par.pdf. Another early evaluation is Erin Dawna Reilly, Rose Eleanore Stafford, Kyle Marie Williams, and Stephanie Brooks Corliss, “Evaluating the Validity and Applicability of Automated Essay Scoring in Two Massive Open Online Courses,” International Review of Research in Open and Distributed Learning 15, no. 5 (2014), http://www.irrodl.org/index.php/irrodl/article/view/1857.

8. THE TOXIC POWER OF DATA AND EXPERIMENTS

  1. 1. On the analogies between online learning platforms and experimentation in consumer technology, see Shreeharsh Kelkar, “Engineering a Platform: The Construction of Interfaces, Users, Organizational Roles, and the Division of Labor,” New Media & Society 20, no. 7 (2018): 2629–2646.

  2. 2. Bruce Schneier, “Data Is a Toxic Asset,” Schneier on Security Blog, March 4, 2016, https://www.schneier.com/blog/archives/2016/03/data_is_a_toxic.html.

  3. 3. A detailed description of a publicly available dataset with five years of Scratch data is Benjamin Mako Hill and Andrés Monroy-Hernández, “A Longitudinal Dataset of Five Years of Public Activity in the Scratch Online Community,” Scientific Data 4 (2017): 170002, https://doi.org/10.1038/sdata.2017.2. Research involving extensive uses of Scratch data includes Sayamindu Dasgupta, William Hale, Andrés Monroy-Hernández, and Benjamin Mako Hill, “Remixing as a Pathway to Computational Thinking,” in Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work and Social Computing (Association for Computing Machinery Digital Library, 2016), 1438–1449; and Seungwon Yang, Carlotta Domeniconi, Matt Revelle, Mack Sweeney, Ben U. Gelman, Chris Beckley, and Aditya Johri, “Uncovering Trajectories of Informal Learning in Large Online Communities of Creators,” in Proceedings of the Second (2015) ACM Conference on Learning at Scale (Association for Computing Machinery Digital Library, 2015), 131–140. A detailed description of a publicly available HarvardX and MITx dataset is Jon P. Daries, Justin Reich, Jim Waldo, Elise M. Young, Jonathan Whittinghill, Andrew Dean Ho, Daniel Thomas Seaton, and Isaac Chuang, “Privacy, Anonymity, and Big Data in the Social Sciences,” Communications of the ACM 57, no. 9 (2014): 56–63.

  4. 4. For an example of treating learning records like text documents, see Cody A. Coleman, Daniel T. Seaton, and Isaac Chuang, “Probabilistic Use Cases: Discovering Behavioral Patterns for Predicting Certification,” in Proceedings of the Second (2015) ACM Conference on Learning at Scale (Association for Computing Machinery Digital Library, 2015), 141–148.

  5. 5. Guanliang Chen, Dan Davis, Claudia Hauff, and Geert-Jan Houben, “Learning Transfer: Does It Take Place in MOOCs? An Investigation into the Uptake of Functional Programming in Practice,” in Proceedings of the Third (2016) ACM Conference on Learning at Scale (Association for Computing Machinery Digital Library, 2016), 409–418. In a related study that connected MOOC learning to new behaviors, researchers investigated whether participation in a MOOC about learning analytics led to greater involvement in the scholarly society for learning analytics or submissions to learning analytics conferences; see Yuan Wang, Luc Paquette, and Ryan Baker, “A Longitudinal Study on Learner Career Advancement in MOOCs,” Journal of Learning Analytics 1, no. 3 (2014): 203–206.

  6. 6. Tommy Mullaney, “Making Sense of MOOCs: A Reconceptualization of HarvardX Courses and Their Students,” SSRN (2014), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2463736.

  7. 7. For another take on using large-scale data to zoom in and zoom out on learner behavior, see Jennifer DeBoer, Andrew D. Ho, Glenda S. Stump, and Lori Breslow, “Changing ‘Course’: Reconceptualizing Educational Variables for Massive Open Online Courses,” Educational Researcher 43, no. 2 (2014): 74–84. For research on how changes in the Scratch environment lead to changes in behavior, see Sayamindu Dasgupta and Benjamin Mako Hill, “How ‘Wide Walls’ Can Increase Engagement: Evidence from a Natural Experiment in Scratch,” in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Association for Computing Machinery Digital Library, 2018), 1–11.

  8. 8. Ryan Baker, Sidney D’Mello, Mercedes Rodrigo, and Arthur Graesser, “Better to Be Frustrated Than Bored: The Incidence, Persistence, and Impact of Learners’ Cognitive-Affective States during Interactions with Three Different Computer-Based Learning Environments,” International Journal of Human-Computer Studies 68, no. 4 (2010): 223–241, https://www.sciencedirect.com/science/article/pii/S1071581909001797.

  9. 9. Boston School Committee, Reports of the Annual Visiting Committees of the Public Schools of the City of Boston (1845), City Document no. 26 (Boston: J. H. Eastburn), 12.

  10. 10. Justin Reich, “ ‘Compass and Chart’: Millenarian Textbooks and World History Instruction in Boston, 1821–1923,” SSRN (2009), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2193129.

  11. 11. Lorrie Shepard, “A Brief History of Accountability Testing, 1965–2007,” in The Future of Test-Based Educational Accountability, eds. Katherine Ryan and Lorrie Shepard (New York: Routledge, 2008), 25–46.

  12. 12. For two studies with promising results for education technology at scale, see John F. Pane, Beth Ann Griffin, Daniel F. McCaffrey, and Rita Karam, “Effectiveness of Cognitive Tutor Algebra I at Scale,” Educational Evaluation and Policy Analysis 36, no. 2 (2014): 127–144; and William G. Bowen, Matthew M. Chingos, Kelly A. Lack, and Thomas I. Nygren, “Interactive Learning Online at Public Universities: Evidence from a Six-Campus Randomized Trial,” Journal of Policy Analysis and Management 33, no. 1 (2014): 94–111. On the risk / benefit calculation from data-intensive educational research, see Rebecca Ferguson and Doug Clow, “Where Is the Evidence? A Call to Action for Learning Analytics,” Proceedings of the Seventh International Learning Analytics and Knowledge Conference (2017), 56–65.

  13. 13. For the official statement from the meeting, see Asilomar Convention for Learning Research in Higher Education, 2014, http://asilomar-highered.info/.

  14. 14. For an introduction to some of these concerns, see Elana Zeide, “Education Technology and Student Privacy,” in The Cambridge Handbook of Consumer Privacy, eds. Evan Selinger, Jules Polonetsky, and Omer Tene, 70–84, SSRN (2018), https://ssrn.com/abstract=3145634.

  15. 15. A landmark text of how surveillance and data capture is reshaping the world is Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019).

  16. 16. For an overview of K–12 legal issues related to student data and privacy, see Leah Plunkett, Alicia Solow-Niederman, and Urs Gasser, “Framing the Law and Policy Picture: A Snapshot of K–12 Cloud-Based Ed Tech and Student Privacy in Early 2014,” Berkman Center Research Publication 2014-10 (2014). For the yawning gaps between current law and much needed ethical guidelines, see Elana Zeide, “Unpacking Student Privacy,” in Handbook of Learning Analytics, eds. Charles Lang, George Siemens, Alyssa Wise, and Dragan Gasevic (Society for Learning Analytics Research, 2017), 327–335.

  17. 17. Natasha Singer, “Online Test-Takers Feel Anti-cheating Software’s Uneasy Glare,” New York Times, April 5, 2015, https://www.nytimes.com/2015/04/06/technology/online-test-takers-feel-anti-cheating-softwares-uneasy-glare.html.

  18. 18. Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford, CA: Stanford University Press, 2009). For the theory of contextual integrity applied to MOOCs and virtual learning environments, see Elana Zeide and Helen F. Nissenbaum, “Learner Privacy in MOOCs and Virtual Education,” Theory and Research in Education 16, no. 3 (2018): 280–307, available at https://ssrn.com/abstract=3303551 or http://dx.doi.org/10.2139/ssrn.3303551; Charlie Moore, Vivian Zhong, and Anshula Gandhi, “Healthy Minds Study Survey Data Informed 2016 Senior House Decisions,” Tech, July 26, 2017, https://thetech.com/2017/07/26/healthy-minds-data-used-in-2016-senior-house-decisions. See also Cynthia Barnhart, Senior House Decision Process FAQ, http://chancellor.mit.edu/sites/default/files/sh-decisionprocess-faq.pdf; and Elizabeth Glaser, “MIT Misused Survey Data to Take Action against Senior House,” Tech, July 26, 2017, https://thetech.com/2017/07/26/healthy-minds-survey-misuse.

  19. 19. John D. Hansen and Justin Reich, “Democratizing Education? Examining Access and Usage Patterns in Massive Open Online Courses,” Science 350, no. 6265 (2015): 1245–1248.

  20. 20. Two efforts to share the role of research in the edX consortium are the edX page on research, https://www.edx.org/about/research-pedagogy, and the HarvardX research statement, https://harvardx.harvard.edu/research-statement, which is attached to many course registration pages.

  21. 21. Audrey Watters, “The Weaponization of Education Data,” Hack Education, December 11, 2017, http://hackeducation.com/2017/12/11/top-ed-tech-trends-weaponized-data.

  22. 22. On school cooperation with Immigration and Customs Enforcement, see Erica Green, “For Immigrant Students, a New Worry: Call to ICE,” New York Times, May 30, 2018, https://www.nytimes.com/2018/05/30/us/politics/immigrant-students-deportation.html.

  23. 23. James Murphy, “The Undervaluing of School Counselors,” Atlantic, September 16, 2016, https://www.theatlantic.com/education/archive/2016/09/the-neglected-link-in-the-high-school-to-college-pipeline/500213/; Douglas J. Gagnon and Marybeth J. Mattingly, “Most U.S. School Districts Have Low Access to School Counselors: Poor, Diverse, and City School Districts Exhibit Particularly High Student-to-Counselor Ratios,” Carsey Research (2016), https://scholars.unh.edu/cgi/viewcontent.cgi?article=1285&context=carsey. The claim that Naviance is used in 40 percent of high schools is from the company’s website; see https://www.naviance.com/solutions/states.

  24. 24. David Christian, Amy Lawrence, and Nicole Dampman, “Increasing College Access through the Implementation of Naviance: An Exploratory Study,” Journal of College Access 3, no. 2 (2017): 28–44; Christine Mulhern, “Changing College Choices with Personalized Admissions Information at Scale: Evidence on Naviance,” April 2019, https://scholar.harvard.edu/files/mulhern/files/naviance_mulhern_april2019.pdf.

  25. 25. A Learning at Scale keynote, unrecorded but with a short published abstract, provides one example of this position: Peter Norvig, “Machine Learning for Learning at Scale,” in Proceedings of the Second (2015) ACM Conference on Learning at Scale (Association for Computing Machinery Digital Library, 2015), 215. See also Daphne Koller, “What We’re Learning from Online Education,” filmed June 2012 at TEDGlobal 2012, Edinburgh, Scotland, video, https://www.ted.com/talks/daphne_koller_what_we_re_learning_from_online_education?language=en.

  26. 26. Justin Reich, “Engineering the Science of Learning,” Bridge 46, no. 3 (2016), https://www.nae.edu/162627/Engineering-the-Science-of-Learning.

  27. 27. René F. Kizilcec, Justin Reich, Michael Yeomans, Christoph Dann, Emma Brunskill, Glenn Lopez, Selen Turkay, Joseph Williams, and Dustin Tingley, “Scaling Up Behavioral Science Interventions in Online Education,” in Proceedings of the National Academy of Sciences (forthcoming).

  28. 28. Benjamin Herold, “Pearson Tested ‘Social-Psychological’ Messages in Learning Software, with Mixed Results,” Education Week: Digital Education, April 17, 2018, https://blogs.edweek.org/edweek/DigitalEducation/2018/04/pearson_growth_mindset_software.html. Kate Crawford (@katecrawford), “Ed tech company experiments on 9000 kids without anyone’s consent or knowledge to see if they test differently when ‘social-psychological’ messaging is secretly inserted? HARD NO,” https://twitter.com/katecrawford/status/986584699647791104. On mindset theory, see David Scott Yeager and Carol S. Dweck, “Mindsets That Promote Resilience: When Students Believe That Personal Characteristics Can Be Developed,” Educational Psychologist 47, no. 4 (2012): 302–314.

  29. 29. Michelle N. Meyer, Patrick R. Heck, Geoffrey S. Holtzman, Stephen M. Anderson, William Cai, Duncan J. Watts, and Christopher F. Chabris, “Objecting to Experiments That Compare Two Unobjectionable Policies or Treatments,” Proceedings of the National Academy of Sciences 116, no. 22 (2019): 10723–10728.

  30. 30. For Pearson’s response, see Valerie Strauss, “Pearson Conducts Experiment on Thousands of College Students without Their Knowledge,” Washington Post: Answer Sheet Blog, April 23, 2018, https://www.washingtonpost.com/news/answer-sheet/wp/2018/04/23/pearson-conducts-experiment-on-thousands-of-college-students-without-their-knowledge/?utm_term=.9efe30965b57.

  31. 31. For more thoughts on my cautious optimism on these experiments, see Justin Reich, “Can Text Messages and Interventions Nudge Students through School?” KQED Mindshift, June 3, 2015, https://www.kqed.org/mindshift/40719/can-text-messages-and-interventions-nudge-students-through-school.

  32. 32. Zeide, “Unpacking.”

  33. 33. Monica Bulger, Patrick McCormick, and Mikaela Pitcan, “The Legacy of inBloom,” Data & Society Research Institute, 2017, https://datasociety.net/library/the-legacy-of-inbloom/.

  34. 34. Bulger, McCormick, and Pitcan, “The Legacy of inBloom.”

  35. 35. Leah Plunkett, Alicia Solow-Niederman, and Urs Gasser, “Framing the Law and Policy Picture: A Snapshot of K–12 Cloud-Based Ed Tech and Student Privacy in Early 2014,” presentation at Harvard Law School, June 3, 2014, available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2442432.

  36. 36. Kenneth R. Koedinger, Ryan S.J.d. Baker, Kyle Cunningham, Alida Skogsholm, Brett Leber, and John Stamper, “A Data Repository for the EDM Community: The PSLC DataShop,” Handbook of Educational Data Mining 43 (2010): 43–56.

CONCLUSION

  1. 1. Frederick James Smith, “The Evolution of the Motion Picture: VI—Looking into the Future with Thomas A. Edison,” New York Dramatic Mirror, July 9, 1913, 24, column 3, as analyzed in https://quoteinvestigator.com/2012/02/15/books-obsolete/; “Edison Predicts Film Will Replace Teacher, Books,” Associated Press, May 15, 1923, available at https://virginiachronicle.com/cgi-bin/virginia?a=d&d=HR19230518.2.11.

  2. 2. Phil Hill, “Instructure: Plans to Expand Beyond Canvas LMS into Machine Learning and AI,” e-Literate, March 2019, https://eliterate.us/instructure-plans-to-expand-beyond-canvas-lms-into-machine-learning-and-ai/.

  3. 3. “Roy Amara: 1925–2007, American Futurologist,” Oxford Essential Quotations, 4th ed., ed. Susan Ratcliffe, 2016, https://www.oxfordreference.com/view/10.1093/acref/9780191826719.001.0001/q-oro-ed4-00018679.

  4. 4. John F. Pane, Beth Ann Griffin, Daniel F. McCaffrey, and Rita Karam, “Effectiveness of Cognitive Tutor Algebra I at Scale,” Educational Evaluation and Policy Analysis 36, no. 2 (2014): 127–144; Jeremy Roschelle, Mingyu Feng, Robert F. Murphy, and Craig A. Mason, “Online Mathematics Homework Increases Student Achievement,” AERA Open 2, no. 4 (2016), https://doi.org/10.1177/2332858416673968.

  5. 5. For statistics on Wikipedia size, see https://en.wikipedia.org/wiki/Wikipedia:Size_comparisons#cite_note-wikistatsall-2. For the Newspapers on Wikipedia project, see Mike Caulfield, “Announcing the Newspapers on Wikipedia Project (#NOW),” Hapgood.us, May 29, 2018, https://hapgood.us/2018/05/29/announcing-the-local-historical-newspapers-project-lhnp/. See also Emma Lurie and Eni Mustafaraj, “Investigating the Effects of Google’s Search Engine Result Page in Evaluating the Credibility of Online News Sources,” in Proceedings of the 10th ACM Conference on Web Science (Association for Computing Machinery Digital Library, 2018), 107–116, https://dl.acm.org/doi/10.1145/3201064.3201095. On early distaste for Wikipedia and arguments for how it might be useful to educators, see Justin Reich and Tom Daccord, Best Ideas for Teaching with Technology: A Practical Guide for Teachers, by Teachers (New York: Routledge, 2015).