Verifying the Stability and Sensitivity of Learning Analytics Based Prediction Models: An Extended Case Study

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic

Abstract

In this empirical contribution, a follow-up study of previous research [1], we focus on the stability and sensitivity of Learning Analytics based prediction models. Do prediction models stay intact when the instructional context is repeated with a new cohort of students, and do prediction models indeed change when relevant aspects of the instructional context are adapted? Applying Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics, combined with formative assessments and learning management systems, we compare two cohorts of a large module introducing mathematics and statistics. Both module runs were based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials, and shared a similar instructional design, except for an intervention in the design of the quizzes administered in the module. We analyse bivariate and multivariate relationships between module performance and track and disposition data to provide evidence of both the stability and the sensitivity of prediction models.

Details

Original language: English
Title of host publication: Communications in Computer and Information Science, Vol. 583
Subtitle of host publication: Computer Supported Education
Editors: S. Zvacek, M. T. Restivo, J. Uhomoibhi, M. Helfert
Publisher: Springer Verlag
Pages: 256-273
Volume: Communications in Computer and Information Science
Edition: 583
Publication status: Published - 11 Feb 2016

Publication series

Name: Computer Supported Education
Volume: 583
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937