Comparison of formula and number-right scoring in undergraduate medical training: a Rasch model analysis

Dario Cecilio-Fernandes*, Harro Medema, Carlos Collares, Lambertus Schuwirth, Janke Cohen-Schotanus, René A. Tio

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Background: Progress testing is an assessment tool used to assess all students periodically at the end-of-curriculum level. Because students cannot know everything, it is important that they recognize their lack of knowledge. For that reason, the formula-scoring method has usually been used. However, when partial knowledge needs to be taken into account, the number-right scoring method is used instead. Research comparing the two methods has yielded conflicting results. As far as we know, all of these studies used Classical Test Theory or Generalizability Theory to analyze the data. In contrast, we explored the use of the Rasch model to compare the two methods.
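As a minimal illustration of the two scoring rules discussed above (not taken from the paper itself; the option count k = 4, the response labels, and the example responses are assumptions), the difference can be sketched as follows:

```python
def number_right_score(responses):
    """Number-right scoring: one point per correct answer;
    wrong and omitted answers both score zero."""
    return sum(1 for r in responses if r == "correct")

def formula_score(responses, k=4):
    """Formula scoring (correction for guessing): wrong answers are
    penalized by 1/(k-1) with k options per item, omissions score
    zero, so blind guessing has an expected score of zero."""
    right = sum(1 for r in responses if r == "correct")
    wrong = sum(1 for r in responses if r == "wrong")
    return right - wrong / (k - 1)

# Hypothetical 10-item test, 4 options each: 6 right, 2 wrong, 2 omitted
responses = ["correct"] * 6 + ["wrong"] * 2 + ["omitted"] * 2
print(number_right_score(responses))  # 6
print(formula_score(responses))       # 6 - 2/3 ≈ 5.33
```

Under formula scoring, omitting is rewarded relative to guessing wrongly, which is why the method is tied to students recognizing their own lack of knowledge.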

Methods: A 2 × 2 crossover design was used in a study in which 298 students from four medical schools participated. A sample of 200 previously used progress-test questions was selected. The data were analyzed using the Rasch model, which provides fit parameters, reliability coefficients, and a response-option analysis.
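For readers unfamiliar with the model, the dichotomous Rasch model expresses the probability of a correct response as a logistic function of the difference between person ability and item difficulty. The sketch below is a generic illustration of that formula, not the authors' analysis code:

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: P(correct response) for a person
    with ability theta facing an item of difficulty b (in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the probability is exactly 0.5
print(rasch_probability(0.0, 0.0))  # 0.5
# A more able person has a higher chance on the same item
print(rasch_probability(1.0, 0.0))  # ≈ 0.73
```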

Results: The fit parameters were in the optimal interval ranging from 0.50 to 1.50, and the means were around 1.00. The person and item reliability coefficients were higher in the number-right condition than in the formula-scoring condition. The response option analysis showed that the majority of dysfunctional items emerged in the formula-scoring condition.
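The fit parameters referred to above are the infit and outfit mean-square statistics, which can be computed from standardized residuals. This sketch uses the standard Rasch formulas with made-up example data, not the study's data:

```python
def fit_mean_squares(observed, expected):
    """Rasch fit statistics for one item across persons.
    observed: 0/1 responses; expected: model probabilities p_i.
    Outfit MnSq = mean of squared standardized residuals;
    Infit MnSq  = information-weighted version.
    Values near 1.00 (conventionally 0.50-1.50) indicate good fit."""
    residuals_sq = [(x - p) ** 2 for x, p in zip(observed, expected)]
    info = [p * (1 - p) for p in expected]
    outfit = sum(r / v for r, v in zip(residuals_sq, info)) / len(observed)
    infit = sum(residuals_sq) / sum(info)
    return infit, outfit

# Hypothetical item: responses roughly in line with model expectations
obs = [1, 0, 1, 0, 1]
exp = [0.8, 0.3, 0.6, 0.5, 0.4]
infit, outfit = fit_mean_squares(obs, exp)
print(round(infit, 2), round(outfit, 2))  # both inside 0.50-1.50
```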

Conclusions: The findings of this study support the use of number-right scoring over formula scoring. Rasch model analyses showed that tests with number-right scoring have better psychometric properties than tests with formula scoring. However, choosing the appropriate scoring method should depend not only on psychometric properties but also on self-directed test-taking strategies and metacognitive skills.

Original language: English
Article number: 192
Number of pages: 9
Journal: BMC Medical Education
Volume: 17
DOIs
Publication status: Published - 9 Nov 2017

Keywords

  • Assessment
  • Multiple choice questions
  • Formula scoring
  • Number-right scoring
  • Rasch model
  • Reliability
  • Validity
  • Construct-irrelevant variance
