1 Introduction
Learning analytics in the classroom
Jason M. Lodge, Jared Cooney Horvath and Linda Corrin
Learning analytics burst into prominence as a field in the early 2010s. It has since grown rapidly, with researchers around the world from a variety of disciplines engaging in the learning analytics community through meetings, conferences and publications. Learning analytics was initially defined as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (Long, Siemens, Conole & Gašević, 2011). The field provides the basis for bringing the power of data science methodologies to bear on education. In this volume, we have brought together some of the leading scholars in learning analytics to take a further step towards that goal. We collectively seek to understand how learning analytics will have an impact in the classroom.
The field of learning analytics can no longer be thought of as new or emerging. With the Learning Analytics and Knowledge Conference now well established and journals such as the Journal of Learning Analytics now several volumes into publication, the field is maturing. Despite the rapid growth of the interdisciplinary endeavour that is learning analytics, there remains much work to be done. In this edited volume, we seek to highlight the work of some of the key researchers in the field with a particular emphasis on what the work means in practice. As with the previous edited collection, From the laboratory to the classroom: Translating science of learning for teachers (Horvath, Lodge & Hattie, 2017), this volume is an attempt to make sense of a complex field of research for teachers and educational researchers.
As the field of learning analytics has grown, several distinct branches, or sub-fields, appear to have emerged. The first, understandably, focuses predominantly on the data and analytics side. This includes the aspects of the field traditionally associated with core aspects of data science. The kinds of activities occurring in this branch include the development of means of collecting and synthesising data, the construction of predictive models and the creation and testing of algorithms. We recognise that this is a high-level generalisation of the kind of data science work occurring in the community; our intention here is to provide an overall categorisation of the strands of activity rather than a comprehensive map.
The second broad area of activity involves the use of methodologies developed through the first to better understand student learning (Lodge & Corrin, 2017). This sub-field is engaged in using new forms of data developed by and through the learning analytics community to uncover aspects of learning that were previously difficult to capture, particularly in real time. There is a strong overlap between this branch and research occurring in cognate disciplines such as the learning sciences and educational psychology (Gašević, Dawson & Siemens, 2015). The increased use of data such as log files, combined with the power of techniques for analysing and modelling data across various instruments, is providing insight into student learning that in the past was largely limited to crude behavioural data, self-report instruments and complex psychophysiological measures. The examples presented here suggest that using learning analytics to enrich the learning sciences and educational psychology holds great potential and is a worthy endeavour.
While the focus of much of the work presented in this volume is on classroom application, the contribution learning analytics is making to foundational research in multiple domains, particularly computer science, human-computer interaction and the learning sciences, should not be overlooked. This foundational research is critical for moving the field forward but is often forgotten in discussions about the utility of learning analytics in practice.
The third major sub-branch is the use of learning analytics in practice, the focus of this volume. This sub-branch dominated the field's early focus. Significant effort in early learning analytics work concentrated on innovations that would support student retention, both in face-to-face learning environments (e.g. Arnold & Pistilli, 2012; Aguilar, Lonn & Teasley, 2014) and online (Kizilcec, Piech & Schneider, 2013). More recently, both the use of learning analytics to make predictions about student progress in the wild and the development of analytics-based interventions have been key areas of progress. As opposed to the high-level modelling used in earlier attempts at predicting when students might need assistance, current efforts are focussed on more sophisticated approaches. These efforts increasingly incorporate what is being learned from foundational research, as described in the first two sub-branches, and an acknowledgement of the key role that learning design plays in helping teachers to interpret the analytics in ways that impact practice (Lockyer, Heathcote & Dawson, 2013; Bakharia et al., 2016).
Finally, there are obvious policy and ethical implications that are raised by the increased collection and use of data generated about and by students. While we did not seek contributions directly assessing and/or discussing the ethical implications of learning analytics in the classroom, these issues undoubtedly permeate all the ideas and practices outlined in this volume. We consider it critical that this is recognised here as a core consideration for all work occurring in the field of learning analytics, despite it not being a core focus of the volume.
There is undoubtedly emerging potential for learning analytics to fundamentally alter practice and policy in education in the near future, particularly as machine learning and artificial intelligence increase in power and availability. As the theory and methods in learning analytics research become increasingly sophisticated, it is not hard to see educational researchers and teachers being left behind as the technical aspects of the field advance. Our aim in putting this volume together was to try to bridge that gap. Learning analytics has been, and must remain, a transdisciplinary endeavour if the data generated are to continue to be used to meaningfully enhance student learning (Lodge, Alhadad, Lewis & Gašević, 2017). We have brought together a collection of chapters from some of the leading researchers in learning analytics to help close this gap.
Aim of this book
Enhancing student learning is a "wicked" problem. There are many factors that contribute to the efficacy of learning environments and activities. Knowing which combination of these factors will most effectively improve the learning of students will continue to be a pressing challenge as we progress further into the 21st century.
The opportunities for leveraging the growing body of data available in practice are becoming increasingly apparent. The collection of chapters in this book highlights the real and concrete potential of learning analytics to help both better understand and enhance student learning. What will be critical in this evolution of the field is that educational researchers and teachers are included in the journey. It is patently clear that articulating the pedagogical purpose of activities and curriculum is critical for understanding how data might be used to enhance learning in practice. Without the meaning given to the data by those on the ground where they are collected, it is very difficult to determine what these data mean.
This collection of chapters is therefore an attempt to help bridge the gap between research, theory and practice. As such, each chapter includes specific discussion about what the research means in the classroom. In these sections, the authors of each chapter provide guidance about how the research applies in the wild. The book will therefore be of benefit both to the learning analytics community and to those new to thinking about what data and analytics mean in a classroom context. Our hope is that the chapters included in this volume from key researchers will help to foster a deeper and sustainable conversation between learning analytics researchers and teaching practitioners. This conversation will be critical for the evolution of the field into the future.
Outline of the book
This volume is split into parts related to key themes in understanding learning analytics through the lens of the classroom. In the first part, the broad theoretical perspectives are considered. In Chapter 2, Donoghue, Horvath and Lodge consider the applicability of translation frameworks used in the science of learning for the field of learning analytics. One of the key issues in the use of data of any kind to infer student learning is that there is always an inferential gap (Lodge et al., 2017). There is no way to directly measure the processing that occurs in the mind. The kinds of metaphors and methodologies used throughout the diverse range of disciplines interested in learning all rely on some form of inference, which in turn is based on a set of underlying philosophical assumptions. Donoghue et al. unpack what the different levels of analysis mean for interpreting data about learning collected across the full spectrum of methodologies from neurons to neighbourhoods. The discussion in this chapter provides a foundation for more holistic thinking about meaning making and data, which will become increasingly critical as more data from more diverse sources, such as psychophysiological data, are added into the mix. In Chapter 3, Bartimote, Pardo and Reimann delve further into the theoretical aspects of learning analytics. In their chapter, they discuss learning analytics from a realist perspective. Building on a case study in the context of a statistics course, they draw a distinction between learning analytics and learner analytics. The chapter makes an important contribution to understanding how it is possible to link theory and practice in learning analytics.
The second major part of the book focuses on understanding learning through analytics. Much of the emphasis in learning analytics has been on understanding student progression and determining possible avenues for intervention. As is evident in Chapter 4 by Lodge, Panadero, Broadbent and de Barba, there is great value in drawing on learning analytics data, such as behavioural trace data, to better understand learning processes such as those involved in self-regulated learning. Higher-level cognitive and metacognitive processes of this kind need to be better understood in order to best use data to help facilitate and enhance learning. Enhancing student self-regulated learning is a difficult undertaking without the nuance that can be provided by a teacher, but some possibilities for using data to do so are discussed in this chapter. Along similar lines, in Chapter 5, Arguel, Pachman and Lockyer discuss the possibilities for understanding and responding to student emotion using data. Progress in the area of affective computing and an increased emphasis on emotions in education mean that there is great potential for detecting and intervening when students exhibit various emotions while they learn. Perhaps the most obvious example is when behavioural traces suggest a student has reached an impasse in the learning process and is confused. Arguel and colleagues cover how learning analytics can help better understand how student emotions impact on learning in digital environments and how these environments can be designed to respond to these emotions.
We then turn to the relationship between learning design and learning analytics. The relationship between these areas has been a strong focus in recent years (e.g. Mor, Ferguson & Wasson, 2015). In Chapter 6, Olney, Rienties and Toetenel describe how learning analytics and learning design have been used in combination in a large-scale implementation at the Open University in the UK. A key feature of the approach used in the examples outlined in this chapter is that visualisations are provided to teachers, who are then able to interpret the data being generated by students through the lens of the learning design. Olney and colleagues therefore provide a cutting-edge example of a large-scale implementation of learning analytics with learning design as the centrepiece. Prieto-Alvarez, Martinez-Maldonado and Anderson extend the conversation about the use of design to inform learning analytics interventions by discussing possibilities for co-design in Chapter 7. In particular, they point out the power of co-designing learning analytics implementations with students. This chapter therefore bridges the learning analytics community with other trends such as the "students as partners" initiative (e.g. Mercer-Mapstone et al., 2017). In Chapter 8, Thompson and colleagues focus on the design of assessment in a graduate course that strategically incorporates data. In particular, they emphasise the importance of an interdisciplinary approach, through which a data-rich implementation achieved greater impact for students and generated new insights into effective learning analytics implementation.
The largest part of the book is devoted to chapters discussing specific examples of the possible impact that learning analytics can and will have on education. The collection of chapters in this part covers a wide range of contexts and implementation strategies. In the first of these practice-focused chapters, Chapter 9, Jovanović and colleagues discuss the role learning analytics can play in understanding and enhancing student learning strategies in flipped classes. Using a learning analytics framework and a design research methodology, these researchers have examined ways of enhancing self-regulated learning in flipped classes. This chapter not only adds to the limited research to date on ways in which learning analytics might be used to enhance learning in flipped classes but also provides approaches that will be useful beyond that setting. Along similar lines, in Chapter 10, Howard, Thompson and Pardo present research on the use of learning analytics to support learning in a blended context. These researchers have used a minimally intrusive approach, drawing on a diverse range of data from physical and virtual environments to facilitate interactions in both settings. The consideration of learning that occurs across digital and real-life classroom settings is vital, particularly given that most educational activities and tasks can now be described as blended due to the ubiquitous nature of digital technologies in many aspects of learning.
Text and writing analytics have become an area of particular emphasis in the field of learning analytics in the last few years. In Chapter 11, McDonald, Moskal, Gunn and Donald examine the role language plays in learning. In particular, they outline how language can illuminate the efficacy of student-teacher interactions. Their work demonstrates the power of text-based analytics for helping to better understand these interactions and use the insights derived from text-based data to enhance student learning.
Student engagement is often seen as a central concern in ensuring students successfully transition into, through and out of university. In Chapter 12, Blumenstein and colleagues present a set of case studies where analytics are used to inform interventions to "nudge" students towards academic success. Directly instructing students to engage in their studies can often backfire, with students interpreting the intervention as patronising or punitive. The framework provided by these researchers will prove useful in informing similar approaches that draw on data to encourage, rather than compel, students to engage more meaningfully with their studies.
In Chapter 13, Corrin explores the opportunities and challenges that arise from the development of student-facing analytics tools. An increasing number of systems are emerging with the potential to place the outcomes of learning analytics directly in the hands of students. While this can offer many benefits in supporting students' ability to monitor and adapt their study strategies, there are also concerns about the impact on students who are less capable of interpreting these data and determining appropriate actions. This chapter profiles differ...