Modelling the pre-assessment learning effects of assessment: evidence in the validity chain

Francois J. Cilliers*, Lambert W. T. Schuwirth, Cees P. M. van der Vleuten

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Medical Education 2012: 46: 1087–1098

OBJECTIVES We previously developed a model of the pre-assessment learning effects of consequential assessment and started to validate it. The model comprises assessment factors, mechanism factors and learning effects. The purpose of this study was to continue the validation process. For stringency, we focused on a subset of assessment factor–learning effect associations that featured least commonly in a baseline qualitative study. Our aims were to determine whether these uncommon associations were operational in a broader but similar population to that in which the model was initially derived.

METHODS A cross-sectional survey of 361 senior medical students at one medical school was undertaken using a purpose-made questionnaire based on a grounded theory and comprising pairs of written situational tests. In each pair, the manifestation of an assessment factor was varied. The frequencies at which learning effects were selected were compared for each item pair, using an adjusted alpha to assign significance. The frequencies at which mechanism factors were selected were calculated.

RESULTS There were significant differences in the learning effect selected between the two scenarios of an item pair for 13 of this subset of 21 uncommon associations, even when a p-value of <0.00625 was considered to indicate significance. Three mechanism factors were operational in most scenarios: agency, response efficacy, and response value.

CONCLUSIONS For a subset of uncommon associations in the model, the role of most assessment factor–learning effect associations and the mechanism factors involved were supported in a broader but similar population to that in which the model was derived. Although model validation is an ongoing process, these results move the model one step closer to the stage of usefully informing interventions. The results illustrate how factors not typically included in studies of the learning effects of assessment could confound the results of interventions aimed at using assessment to influence learning.
Original language: English
Pages (from-to): 1087-1098
Journal: Medical Education
Volume: 46
Issue number: 11
DOIs
Publication status: Published - Nov 2012
