13

Algorithms and Analytics Recode Education: Pay Attention!

The MOOCs we have discussed have been linked in our research with notions of “data colonialism” (Knox 2016a), a critical perspective that highlights the bias and discrimination apparent in the technologies at work beneath the slick facades of course software. It is to these considerations we now turn.

Perhaps one of the most surprising aspects of projects to develop learning at scale is the lack of attention given to the technology that makes such global education possible. Platform promotion has tended to describe the MOOC, for example, simply as a means of access, or a “transparent window” to universities (Knox 2016b, 311), and research has been largely focused on student retention (Jordan 2014; Henderikx, Kreijns, and Kalz 2017) or broad demographics (Breslow et al. 2013; Perna et al. 2013). The focus on removing barriers to participation seems to overlook the significant transformation of education brought about by the vastly complex technological infrastructure and software involved in offering courses online. With this point in the manifesto, we argue that it is important to understand the move online as a recoding, a transformation, rather than an augmentation or, indeed, an enhancement of education. New technologies change and reshape what we do as teachers—sometimes for better, sometimes for worse—regardless of how often we prefer to speak of them as enhancements or tools.

In line with the sociomaterial perspective we have applied throughout this book, educational technology should be understood not simply as a set of tools applied to an existing and authentic education, but as formative of, and inextricable from, the educational contexts in which it is involved. However, casting the MOOC, for example, as the ultimate freedom from obstacles to access is directly premised on the assumption of technology as neutral, straightforwardly providing a way into a supposedly unchanged educational experience. It is important, then, to recognize that much of the discourse around open and online education lacks theoretical engagement with the active role of technology: it either ignores that role or describes it in terms that are “rooted in a kind of mad instrumentalized culture of positivism and technological rationality” (Giroux 2017, 141).

However, recent developments in learning analytics (see Lang et al. 2017) are surfacing tangible examples of technologies—involved in identifying at-risk students, or automating teacher feedback—that challenge this apparent neutrality, and the MOOC has been a significant player in the emergence of such approaches in education. This is largely because MOOCs have attracted vast numbers of new kinds of participants, generating the sizable data sets needed to drive analytic modeling. Learning analytics is typically defined as the “measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Long and Siemens 2011, 34).
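To make the kind of processing this definition points to more concrete, the following sketch shows, in deliberately schematic form, how engagement data about learners might be measured, collected, analyzed, and reported as the sort of “at risk” flag surfaced on a dashboard. Everything here is hypothetical: the fields, thresholds, and function names are invented for illustration, and real analytics pipelines are vastly more elaborate.

```python
# A minimal, hypothetical sketch of a learning analytics routine in the
# sense of the Long and Siemens definition: measure and collect simple
# engagement counts, analyse them against arbitrary thresholds, and
# report an "at risk" label. All names, fields, and thresholds are
# invented for illustration only.
from dataclasses import dataclass


@dataclass
class LearnerRecord:
    learner_id: str
    logins_last_week: int   # "measurement" reduced to simple counts
    videos_watched: int
    forum_posts: int


def flag_at_risk(record: LearnerRecord,
                 min_logins: int = 2,
                 min_videos: int = 1) -> dict:
    """Analyse one learner's counts and report a label with a 'reason'."""
    disengaged = (record.logins_last_week < min_logins
                  or (record.videos_watched < min_videos
                      and record.forum_posts == 0))
    return {
        "learner_id": record.learner_id,
        "at_risk": disengaged,
        # The kind of "reason" a dashboard might surface to a teacher:
        "reason": "low engagement" if disengaged else "on track",
    }


# Reporting: the sort of output a dashboard might render for a teacher.
cohort = [
    LearnerRecord("s001", logins_last_week=5, videos_watched=3, forum_posts=2),
    LearnerRecord("s002", logins_last_week=1, videos_watched=0, forum_posts=0),
]
for report in (flag_at_risk(r) for r in cohort):
    print(report)
```

What the sketch makes visible is that the educationally consequential decisions sit in the choice of which counts stand in for “learning” and where the thresholds are drawn; these are precisely the judgments that tend to disappear into the dashboard.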

Beginning with this definition, two important and interrelated issues become apparent with respect to the idea that this technology recodes education: first, in the sense of the discursive shifts that have come to prominence as learning analytics gains traction in the sector and, second, in relation to the concrete ways learning analytics software is able to produce environments that directly influence educational activities and shape learners through automation. This has implications for the understanding of education in its capacity to liberate, as well as for assumptions about the kinds of subjectivity attributed to students, as has been discussed previously in relation to the work of Isaiah Berlin.

A flourishing area of research has developed around the Society for Learning Analytics Research (SoLAR, n.d.), including a program of summer institutes and international conferences and the establishment of a dedicated academic journal (Journal of Learning Analytics). As an emergent field, learning analytics has garnered significant attention, including influential reports from Universities UK (2016) and the development of institutional policy (see Tsai and Gasevic 2017). Analytics are becoming common within the day-to-day experience of teachers and students, increasingly appearing in the form of dashboards as educational software companies and platforms develop their products (Clow 2013). Horizon scanning in higher education has frequently forecast the imminent disruption of teaching and learning practices through such additions (Johnson et al. 2014, 2015, 2016).

Paying attention to this body of work reveals important ways in which education is being reframed and recoded, in particular in relation to the role of the teacher. Advocates of learning analytics promote notions of “actionable intelligence” (Campbell, De Blois, and Oblinger 2007) and “data driven decision making” (Barneveld, Arnold, and Campbell 2012), which appear to clearly position analytics as providing an objective basis for subsequent educational intervention. This holds the potential to give precedence and authority to the insights generated through the data dashboard over those of the human teacher, who is often positioned merely to respond to analytic outputs (Knox 2017b).

This can be understood as an extension of the teacherless model of student self-direction discussed earlier in relation to open education. It is a kind of mythologizing of the objectivity of data (boyd and Crawford 2012), introduced to compensate for the supposed deficiencies and inefficiencies of human teachers. This assumed need to supplement the capacities of the teacher derives partly from the convergence, through learning analytics, of particular kinds of disciplinary expertise in the educational environment. While learning analytics software is designed for end use by teachers, students, and educational administrators, it draws on highly specific techniques developed in data mining and business intelligence (Ferguson 2012). In this sense, it can be understood as black-boxed technology (Latour 1999), encouraging engagement with its outputs—in the form of dashboards and visualizations—while concealing the complexity of its functioning, which remains incomprehensible to those without proficiency in data science.

Through learning analytics, therefore, pedagogic authority and the capacity to make decisions about the activity and progress of learners are being spread across humans and machines, and strongly inflected by data science practices and expertise. This is not necessarily problematic—in fact, interdisciplinary partnerships are essential to the future of educational research and development—but it becomes so where teachers and the act of teaching are not placed at the center of the design and development of new educational technologies, and where the supposed objectivity of data is not subjected to critical interrogation by educators themselves.

This relates to a second, and equally important, aspect of learning analytics, which involves a more literal recoding of the educational environment through the deployment of data and algorithms. Sociological studies are beginning to pay attention to the “algorithmic culture” (Striphas 2015) emerging from the increasing prevalence of complex computational routines in “social ordering, governance and control” (Williamson 2014). Algorithms are becoming “powerful and consequential actors in a wide variety of domains” (Ziewitz 2015, 3). While a technical description of an algorithm might suggest simply the “encoded procedures for transforming input data into a desired output, based on specified calculations” (Gillespie 2014, 167), critical work in this area is developing a much broader and more complex picture of the social power of the algorithm (Beer 2017) and its involvement in the “proceduralisation of knowledge, and as a result, the formalizing and delineating of social life” (Knox 2018, 165). It is important, then, to understand algorithms in relation to the broader social systems in which they operate, rather than simply as isolated technical operations or encoded instructions. The technical definition of the algorithm overlooks the powerful ways in which these highly complex routines are often deeply enmeshed in human activities: they are shaped and defined through user interactions, but also work to shape and define users, through ordering, recommending, disclosing, or concealing information.
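Gillespie’s technical definition can be illustrated in miniature. The hypothetical routine below is exactly an “encoded procedure for transforming input data into a desired output”: it ranks course resources by an engagement score and surfaces only the top of the list, thereby ordering, recommending, and concealing in the ways just described. The resources, counts, weights, and cutoff are all invented for illustration.

```python
# A deliberately simple "encoded procedure": transform interaction counts
# (input data) into a ranked list (desired output), surfacing some course
# resources and, in effect, concealing others. All values are hypothetical.
resources = [
    {"title": "Week 1 lecture",    "views": 420, "completions": 300},
    {"title": "Optional reading",  "views": 35,  "completions": 20},
    {"title": "Discussion prompt", "views": 180, "completions": 90},
]


def engagement_score(item: dict) -> float:
    # The weights are a value judgement encoded as arithmetic:
    # privileging completions over views is an editorial choice.
    return 0.3 * item["views"] + 0.7 * item["completions"]


ranked = sorted(resources, key=engagement_score, reverse=True)

# Only the top items are "disclosed"; everything below the cutoff is
# effectively hidden from the hypothetical dashboard or landing page.
TOP_N = 2
for item in ranked[:TOP_N]:
    print(item["title"], round(engagement_score(item), 1))
```

The procedure itself is trivial; what matters sociologically is that the weights and the cutoff, which determine what users see, are design decisions that rarely appear on the interface.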

Important work is examining the biases generated through algorithmic systems, such as problematic profiling in the prediction of recidivism and, notably for this discussion, the gaming of university rankings (O’Neil 2016). As Knox (2018) demonstrates, algorithms are increasingly being used in MOOCs to, for example, intervene in learning activities or categorize learners into particular groups. Algorithms can therefore be understood to actively construct social relations, as much as objectively measure them through technical procedures. It is algorithms that often operate beneath the surface of learning analytics systems, as performative devices that produce rather than discover educational realities through techniques such as cluster analysis (see Perrotta and Williamson 2018). Understood as recodings, these techniques produce powerful and convincing outcomes that are beginning to shape educational policy (Williamson 2017). In this sense, it is important to understand algorithms in material terms, as concrete actors interwoven in the fabric of educational activity, influencing and shaping both governance and practice. The capacity for analytics and their underlying algorithms to efficiently process large volumes of data and provide empirical and tangible outcomes is key to understanding their appeal in an education system eager not only for economical solutions but also for decisive measures. In this way, analytics and algorithms might be seen as not only aligned with but also as amplifying “the logic of economic rationality and ‘accountability’ that pervades governance cultures in education” (Perrotta and Williamson 2018, 4).
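The performative character of a technique like cluster analysis can be seen in a minimal sketch, here using k-means via scikit-learn on invented engagement data (the library and the numbers are incidental assumptions, not a description of any particular analytics product). The number of “kinds of learner” is not discovered in the data at all: it is supplied in advance as the parameter k, and the algorithm will dutifully return exactly that many groups, which then stand ready to be acted upon as though they had been found.

```python
# Cluster analysis as a performative device: k-means will always return
# exactly k groups of "learner types", whether or not such types exist.
# The engagement data below is synthetic, generated for illustration.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=0)
# Columns: weekly logins, minutes of video watched (synthetic values).
engagement = rng.normal(loc=[4, 120], scale=[2, 60], size=(40, 2))

k = 3  # the number of "kinds of learner" is chosen here, not discovered
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(engagement)

for cluster_id in range(k):
    members = engagement[labels == cluster_id]
    print(f"cluster {cluster_id}: {len(members)} learners, "
          f"mean logins {members[:, 0].mean():.1f}, "
          f"mean minutes {members[:, 1].mean():.0f}")
```

Changing k from three to five produces a different set of “learner types” from the same data, which is the sense in which such categories are produced rather than discovered.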

While it might be tempting to view analytics and algorithms as straightforwardly liberating—unburdening teachers and administrators of the difficult task of identifying which students are failing or delivering students from states of incomprehension about their performance—a more critical perspective is needed, one that recognizes an intensification of technocratic control in education. On the one hand, the dashboards and visualizations generated by learning analytics offer an unprecedented, and alluring, appearance of transparency in the often complicated, obscure, and intangible domain of learning. On the other hand, the extent to which decision-making power is shifted away from teachers and students, and toward new kinds of data science expertise, needs further examination. Moreover, as emerging work outside education is highlighting, data-driven technologies are increasingly being employed in ways that amplify existing social inequalities (Eubanks 2018; Noble 2018), and more research is needed to investigate the ways in which educational analytics might replicate and intensify deeply embedded discrimination within our societies and our institutions. Prinsloo’s (2016) suggestions for decolonizing the use of educational data offer one productive avenue of exploration here. The drive to collect, process, and analyze student data in education has the potential to embed an opaque inner world of algorithmic operations that centralizes educational power and agency through a kind of technoscientific governance that is directly at odds with teacher professionalism. As teachers and researchers, we need to build research programs and teaching methods that directly address this risk while remaining open to the many creative and potentially critical uses of computational data that are available to us. Algorithms and analytics recode education: pay attention!