TY - JOUR
T1 - Utility of a multimodal computer-based assessment format for assessment with a higher degree of reliability and validity
AU - Renes, J.
AU - van der Vleuten, C.P.M.
AU - Collares, C.F.
N1 - Funding Information:
The author(s) reported there is no funding associated with the work featured in this article.
Publisher Copyright:
© 2022 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.
PY - 2023/4/3
Y1 - 2023/4/3
N2 - Multiple-choice questions (MCQs) suffer from cueing, variable item quality, and an emphasis on testing factual knowledge. This study presents a novel multimodal test containing alternative item types in a computer-based assessment (CBA) format, designated as Proxy-CBA. The Proxy-CBA was compared to a standard MCQ-CBA with regard to validity, reliability, standard error of measurement (SEM), and cognitive load, using a quasi-experimental crossover design. Biomedical students were randomized into two groups to sit a 65-item formative exam starting with the MCQ-CBA followed by the Proxy-CBA (group 1, n = 38), or the reverse (group 2, n = 35). Subsequently, a questionnaire on perceived cognitive load was administered, which 71 participants answered. Both CBA formats were analyzed using classical test theory and the Rasch model. Compared to the MCQ-CBA, the Proxy-CBA had lower raw scores (p < 0.001, η² = 0.276), higher reliability estimates (p < 0.001, η² = 0.498), lower SEM estimates (p < 0.001, η² = 0.807), and lower theta ability scores (p < 0.001, η² = 0.288). The questionnaire revealed no significant differences between the two CBA formats regarding perceived cognitive load. Compared to the MCQ-CBA, the Proxy-CBA showed increased reliability and a higher degree of validity with similar cognitive load, suggesting its utility as an alternative assessment format.
AB - Multiple-choice questions (MCQs) suffer from cueing, variable item quality, and an emphasis on testing factual knowledge. This study presents a novel multimodal test containing alternative item types in a computer-based assessment (CBA) format, designated as Proxy-CBA. The Proxy-CBA was compared to a standard MCQ-CBA with regard to validity, reliability, standard error of measurement (SEM), and cognitive load, using a quasi-experimental crossover design. Biomedical students were randomized into two groups to sit a 65-item formative exam starting with the MCQ-CBA followed by the Proxy-CBA (group 1, n = 38), or the reverse (group 2, n = 35). Subsequently, a questionnaire on perceived cognitive load was administered, which 71 participants answered. Both CBA formats were analyzed using classical test theory and the Rasch model. Compared to the MCQ-CBA, the Proxy-CBA had lower raw scores (p < 0.001, η² = 0.276), higher reliability estimates (p < 0.001, η² = 0.498), lower SEM estimates (p < 0.001, η² = 0.807), and lower theta ability scores (p < 0.001, η² = 0.288). The questionnaire revealed no significant differences between the two CBA formats regarding perceived cognitive load. Compared to the MCQ-CBA, the Proxy-CBA showed increased reliability and a higher degree of validity with similar cognitive load, suggesting its utility as an alternative assessment format.
KW - Computer-based assessment
KW - design-based experiment
KW - crossover design
KW - psychometrics
KW - Rasch model
KW - MULTIPLE-CHOICE QUESTIONS
KW - CLINICAL DECISION-MAKING
KW - ITEM RESPONSE THEORY
KW - PROFESSIONAL COMPETENCE
KW - MEDICAL-EDUCATION
KW - STRENGTHS
KW - STUDENTS
KW - IMPACT
KW - TESTS
KW - GUIDE
U2 - 10.1080/0142159X.2022.2137011
DO - 10.1080/0142159X.2022.2137011
M3 - Article
C2 - 36306368
SN - 0142-159X
VL - 45
SP - 433
EP - 441
JO - Medical Teacher
JF - Medical Teacher
IS - 4
ER -