Composite reliability of a workplace-based assessment toolbox for postgraduate medical education

J. M. W. Moonen-van Loon*, K. Overeem, H. H. L. M. Donkers, C. P. M. van der Vleuten, E. W. Driessen

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

In recent years, postgraduate assessment programmes around the world have embraced workplace-based assessment (WBA) and its related tools. Despite their widespread use, results of studies on the validity and reliability of these tools have been variable. Although in many countries decisions about residents' continuation of training and certification as a specialist are based on the composite results of different WBAs collected in a portfolio, to our knowledge, the reliability of such a WBA toolbox has never been investigated. Using generalisability theory, we analysed the separate and composite reliability of three WBA tools [mini-Clinical Evaluation Exercise (mini-CEX), direct observation of procedural skills (DOPS), and multisource feedback (MSF)] included in a resident portfolio. G-studies and D-studies of 12,779 WBAs from a total of 953 residents showed that a reliability coefficient of 0.80 was obtained for eight mini-CEXs, nine DOPS, and nine MSF rounds, whilst the same reliability was found for seven mini-CEXs, eight DOPS, and one MSF when combined in a portfolio. At the end of the first year of residency a portfolio with five mini-CEXs, six DOPS, and one MSF afforded reliable judgement. The results support the conclusion that several WBA tools combined in a portfolio can be a feasible and reliable method for high-stakes judgements.
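To illustrate how a D-study turns variance components into the reliability coefficients reported above, the short Python sketch below computes a per-tool reliability and a simplified composite. The variance components, weights, and the assumption of uncorrelated universe scores across tools are hypothetical illustrations only and are not taken from the paper; the authors' actual generalisability analysis may use a fuller multivariate model, so this sketch will not reproduce the 0.80 figure.

    def d_study_reliability(person_var, residual_var, n):
        # E-rho^2 for one tool: universe-score (person) variance divided by
        # itself plus the relative error variance, which shrinks as the
        # number of observations n grows.
        return person_var / (person_var + residual_var / n)

    # Hypothetical variance components (person, residual) per WBA tool --
    # invented for illustration, not estimated from the paper's data.
    tools = {"mini-CEX": (0.10, 0.35), "DOPS": (0.08, 0.30), "MSF": (0.12, 0.15)}

    # Numbers of assessments per tool in the combined portfolio reported above.
    counts = {"mini-CEX": 7, "DOPS": 8, "MSF": 1}

    for tool, (pv, rv) in tools.items():
        print(tool, round(d_study_reliability(pv, rv, counts[tool]), 2))

    # Simplified composite: weight each tool by its share of assessments and
    # assume uncorrelated universe scores across tools (a full multivariate
    # D-study would also model the covariances between tools).
    total = sum(counts.values())
    w = {t: counts[t] / total for t in counts}
    true_var = sum(w[t] ** 2 * tools[t][0] for t in counts)
    err_var = sum(w[t] ** 2 * tools[t][1] / counts[t] for t in counts)
    print("composite E-rho^2 ~", round(true_var / (true_var + err_var), 2))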
Original language: English
Pages (from-to): 1087-1102
Journal: Advances in Health Sciences Education
Volume: 18
Issue number: 5
DOIs
Publication status: Published - Dec 2013

Keywords

  • Generalisability theory
  • Portfolio
  • Workplace-based assessments
  • Graduate medical education
  • Composite reliability
