Online Learning Analytics
eBook - ePub

  1. 232 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android

About this book

"In our increasingly digitally enabled education world, analytics used ethically, strategically, and with care holds the potential to help more and more diverse students be more successful on higher education journeys than ever before. Jay Liebowitz and a cadre of the field's best 'good trouble' makers in this space help shine a light on the possibilities, potential challenges, and the power of learning together in this work."

—Mark David Milliron, Ph.D., Senior Vice President and Executive Dean of the Teachers College, Western Governors University

Due to the COVID-19 pandemic and its aftereffects, we have begun to enter the "new normal" of education. Instead of online learning being an "added feature" of K–12 schools and universities worldwide, it will be incorporated as an essential feature in education. There are many questions and concerns from parents, students, teachers, professors, administrators, staff, accrediting bodies, and others regarding the quality of virtual learning and its impact on student learning outcomes.

Online Learning Analytics is conceived to answer the questions of those who may be skeptical about online learning. Through better understanding and applying learning analytics, we can assess how factors such as successful learning and student/faculty engagement contribute towards producing the educational outcomes needed to advance student learning for future generations. Learning analytics has proven successful in many areas, such as the impact of using learning analytics in asynchronous online discussions in higher education. To prepare for a future where online learning plays a major role, this book examines:

  • Data insights for improving curriculum design, teaching practice, and learning
  • Scaling up learning analytics in an evidence-informed way
  • The role of trust in online learning

Online learning faces very real philosophical and operational challenges. This book addresses areas of concern about the future of education and learning. It also energizes the field of learning analytics by presenting research on a range of topics that is broad and recognizes the humanness and depth of educating and learning.


Information

Edition: 1
Subtopic: Data Mining

Chapter 1 Leveraging Learning Analytics for Assessment and Feedback

Dirk Ifenthaler1 and Samuel Greiff2
1 University of Mannheim and Curtin University
2 University of Luxembourg

Abstract

This chapter critically reflects the current state of research in learning analytics and educational assessment. Given the omnipresence of technology-enhanced assessment approaches, vast amounts of data are produced in such systems, which open further opportunities for advancing assessment and feedback systems as well as pedagogical assessment practice. A yet-to-be-solved limitation of learning analytics frameworks is the lack of a stronger focus on dynamic or real-time assessment and feedback, as well as the improvement of learning environments. Therefore, a benefits matrix for analytics-enhanced assessment is suggested, which provides examples of how to harness data and analytics for educational assessment. Further, a framework for implementing analytics-enhanced assessment is suggested. The chapter concludes with a critical reflection on current challenges for making use of analytics data for educational assessments. Clearly, stakeholders in the educational arena need to address ethics and privacy issues linked to analytics-enhanced assessments.
Keywords: Assessment, feedback, learning analytics, analytics-enhanced assessment

Introduction

A recent search in scientific databases identified an increase of over 380% in research publications focusing on assessment from the 1950s to the 2020s. Despite an intense debate over the past seven decades, the distinction between formative and summative assessment has not resulted in a precise definition, and the boundary between the two remains blurry (Newton, 2007). Instead, other terms have been introduced, such as learning-oriented assessment (Carless, 2007), emphasizing the development of learning elements of assessment; sustainable assessment (Boud, 2000), proposing the support of student learning beyond the formal learning setting; or stealth assessment (Shute et al., 2016), denoting assessments that take place in the background without the user noticing.
More recently, technology-enhanced assessments have enriched standard or paper-based assessment approaches, some of which hold much promise for supporting learning (Webb et al., 2013; Webb & Ifenthaler, 2018b). While much effort in institutional and national systems is focused on harnessing the power of technology-enhanced approaches in order to reduce costs and increase efficiency (Bennett, 2015), a range of different technology-enhanced assessment scenarios has been the focus of educational research and development, albeit often at small scale (Stödberg, 2012).
For example, technology-enhanced assessments may involve a pedagogical agent for providing feedback during a learning process (Johnson & Lester, 2016). Other scenarios of technology-enhanced assessments include analyses of a learner's decisions and interactions during game-based learning (Bellotti et al., 2013; Ifenthaler et al., 2012; Kim & Ifenthaler, 2019), scaffolding for dynamic task selection including related feedback (Corbalan et al., 2009), remote asynchronous expert feedback on collaborative problem-solving tasks (Rissanen et al., 2008), or semantically rich and personalized feedback as well as adaptive prompts for reflection through data-driven assessments (Ifenthaler, 2012).
Accordingly, it is expected that technology-enhanced assessment systems meet a number of specific requirements, such as (a) adaptability to different subject domains, (b) flexibility for experimental as well as learning and teaching settings, (c) management of huge amounts of data, (d) rapid analysis of complex and unstructured data, (e) immediate feedback for learners and educators, as well as (f) generation of automated reports of results for educational decision making (Ifenthaler et al., 2010).
With the increased availability of vast and highly varied amounts of data from learners, teachers, learning environments, and administrative systems within educational settings, further opportunities arise for advancing pedagogical assessment practice (Ifenthaler et al., 2018). Analytics-enhanced assessment harnesses formative as well as summative data from learners and their contexts (e.g., learning environments) in order to facilitate learning processes in near real time and help decision makers to improve learning environments. Hence, analytics-enhanced assessment may provide multiple benefits for students, schools, and involved stakeholders. However, as noted by Ellis (2013), analytics currently fail to make full use of educational data for assessment.
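The idea of harnessing formative data in near real time can be illustrated with a minimal sketch. All names, thresholds, and the feedback wording below are hypothetical illustrations, not from the chapter: a rolling window over a learner's recent quiz attempts yields a simple mastery estimate, which triggers a remediation prompt when it falls below a cutoff.

```python
from dataclasses import dataclass, field
from collections import deque

# Hypothetical sketch of analytics-enhanced formative feedback.
# A learner's five most recent attempt outcomes feed a rolling
# mastery estimate; low mastery triggers a remediation prompt.

@dataclass
class LearnerModel:
    # keep only the five most recent attempts (1 = correct, 0 = incorrect)
    attempts: deque = field(default_factory=lambda: deque(maxlen=5))

    def record(self, correct: bool) -> None:
        self.attempts.append(1 if correct else 0)

    def mastery(self) -> float:
        if not self.attempts:
            return 0.0
        return sum(self.attempts) / len(self.attempts)

def formative_feedback(model: LearnerModel, threshold: float = 0.6) -> str:
    """Turn the mastery estimate into a feedback message (wording illustrative)."""
    score = model.mastery()
    if score < threshold:
        return f"Recent accuracy {score:.0%}: review the worked examples before the next task."
    return f"Recent accuracy {score:.0%}: on track; try a harder task."

learner = LearnerModel()
for outcome in [True, False, False, True, False]:
    learner.record(outcome)
print(formative_feedback(learner))  # mastery 0.4 -> remediation prompt
```

A real system would of course replace the frequency count with a calibrated measurement model and route the signal to both learner-facing and educator-facing dashboards, but the sketch shows the near-real-time loop the paragraph describes: collect formative data, update an estimate, and feed it back while learning is still in progress.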
This chapter critically reflects the current state of research in educational assessment and identifies ways to harness data and analytics for assessment. Further, a benefits matrix for analytics-enhanced assessment is suggested, followed by a framework for implementing assessment analytics.

Current State of Educational Assessment

Tracing the history of educational assessment practice is challenging, as there are a number of diverse concepts referring to the idea of assessment. Educational assessment is a systematic method of gathering information or artefacts about a learner and learning processes to draw inferences about the person's dispositions (Baker et al., 2016). Scriven (1967) is often referred to as the original source of the distinction between formative and summative assessment. However, formative and summative assessment are considered to be overlapping concepts, and the function depends on how the inferences are used (Black & Wiliam, 2018).
Newton (2007) notes that the distinction between formative and summative assessment hindered the development of sound assessment practices on a broader level. In this regard, Taras (2005) states that every assessment starts with the summative function of judgment, and by using this information for providing feedback for improvement, the function becomes formative. Bloom et al. (1971) were concerned with the long-lasting idea of assessment separating learners based on a summative perspective of knowledge and behavior—the assessment of learning. In addition, Bloom et al. (1971) supported the idea of developing the individual learner and supporting the learner and teacher towards mastery of a phenomenon—the assessment for learning.
Following this discourse, Sadler (1989) developed a theory of formative assessment and effective feedback. Formative assessment helps students to understand their current state of learning and guides them in taking action to achieve their learning goals. A similar line of argumentation can be found in Black (1998), in which three main types of assessment are defined: (a) formative assessment to aid learning; (b) summative assessment for review, transfer, and certification; and (c) summative assessment for accountability to the public. Pellegrino et al. (2001) extend these definitions with three main purposes of assessment: (a) assessment to assist learning (formative assessment); (b) assessment of individual student achievement (summative assessment); and (c) assessment to evaluate (evaluative assessment).
To facilitate learning through assessment, Carless (2007) emphasizes that assessment tasks should be learning tasks that are related to the defined learning outcomes and distributed across the learning and course period. Furthermore, to foster learners’ responsibility for learning (Bennett, 2011; Wanner & Palmer, 2018) and self-regulation (Panadero et al., 2017), self-assessments are suitable means. In general, self-assessments include students’ judgment and decision making about their work and comprise three steps: definition of the expectations, evaluating the work against the expectations, and revising the work (Andrade, 2010). Consequently, as Sadler (1989) argues, self-monitoring and external feedback are related to formative assessment, with the aim to evolve from using external feedback to self-monitoring to independently identify gaps for improvement. Hence, self-assessments enable learners to develop independence of relying on external feedback (Andrade, 2010).
However, self-assessment both demands and fosters evaluative judgment in learners (Panadero et al., 2019; Tai et al., 2018). Thus, self-assessments might be particularly challenging for learners with lower levels of domain or procedural knowledge (Sitzmann et al., 2010). Hence, the feedback generated internally by the learners could be complemented and further enhanced with external feedback (Butler & Winne, 1995). Such external feedback may help learners to adjust their self-monitoring (Sitzmann et al., 2010). Among other qualities, the feedback provided should clearly define expectations (i.e., criteria, standards, goals); be timely, sufficiently frequent, and detailed; address aspects that are malleable by the students; indicate how to close the gap; and be framed so that learners can act upon it (Gibbs & Simpson, 2005; Nicol & Macfarlane‐Dick, 2006). Furthermore, assessment and feedback processes should actively include the learner as an agent in the process (Boud & Molloy, 2013). However, offering formative assessments and individual feedback is limited in many ways throughout higher education due to resource constraints (Broadbent et al., 2017; Gibbs & Simpson, 2005).
Assessment as learning is a concept that reflects a renewed focus on the nature of the integration of assessment and learning (Webb & Ifenthaler, 2018a). Key aspects of assessment as learning include the centrality of understanding the learning gap and the role of assessment in helping students and teachers explore and regulate this gap (Dann, 2014). Thus, feedback and the way students regulate their response to feedback is critical for assessment as learning, just as it is for assessment for learning (Perrenoud, 1998). Other active research areas focus on peer assessment (Lin et al., 2016; Wanner & Palmer, 2018). Especially the opportunities of technology-enhanced peer in...

Table of contents

  1. Cover Page
  2. Half-Title Page
  3. Title Page
  4. Copyright Page
  5. Trademarks Used in This Book
  6. Dedication
  7. Table of Contents
  8. List of Figures
  9. List of Tables
  10. Foreword
  11. Preface
  12. Contributing Authors
  13. About the Editor
  14. Chapter 1 Leveraging Learning Analytics for Assessment and Feedback
  15. Chapter 2 Desperately Seeking the Impact of Learning Analytics in Education at Scale
  16. Chapter 3 Designing for Insights
  17. Chapter 4 Implementing Learning Analytics at Scale in an Online World
  18. Chapter 5 Realising the Potential of Learning Analytics
  19. Chapter 6 Using Learning Analytics and Instructional Design to Inform, Find, and Scale Quality Online Learning
  20. Chapter 7 Democratizing Data at a Large R1 Institution
  21. Chapter 8 The Benefits of the 'New Normal'
  22. Chapter 9 Learning Information, Knowledge, and Data Analysis in Israel
  23. Chapter 10 Scaling Up Learning Analytics in an Evidence-Informed Way
  24. Chapter 11 The Role of Trust in Online Learning
  25. Chapter 12 Face Detection with Applications in Education
  26. Index