Where the rubber meets the road - An integrative review of programmatic assessment in health care professions education

Suzanne Schut*, Lauren A. Maggio, Sylvia Heeneman, Jan van Tartwijk, Cees van der Vleuten, Erik Driessen

*Corresponding author for this work

Research output: Contribution to journal › (Systematic) Review article › peer-review

24 Citations (Web of Science)

Abstract

Introduction: Programmatic assessment was introduced as an approach to designing assessment programmes that aims to simultaneously optimize the decision-making and learning functions of assessment. An integrative review was conducted to review and synthesize results from studies investigating programmatic assessment in health care professions education in practice.

Methods: The authors systematically searched PubMed, Web of Science, and ERIC to identify studies published since 2005 that reported empirical data on programmatic assessment. Characteristics of the included studies were extracted and synthesized using descriptive statistics and thematic analysis.

Results: Twenty-seven studies were included, which used quantitative methods (n = 10), qualitative methods (n = 12), or mixed methods (n = 5). Most studies were conducted in clinical settings (77.8%). Programmatic assessment was found to enable meaningful triangulation for robust decision-making and was used as a catalyst for learning. However, several problems were identified, including an overload of assessment information and the associated workload, the counterproductive impact of strict requirements and summative signals, a lack of shared understanding of the nature and purpose of programmatic assessment, and a lack of supportive interpersonal relationships. Thematic analysis revealed that the successes and challenges of programmatic assessment were best understood through the interplay between the quantity and quality of assessment information, and the influence of social and personal aspects on assessment perceptions.

Conclusion: Although some of the evidence may seem compelling in supporting the effectiveness of programmatic assessment in practice, tensions will emerge when simultaneously stimulating the development of competencies and assessing their results. The identified factors and inferred strategies provide guidance for navigating these tensions.

Original language: English
Pages (from-to): 6-13
Number of pages: 8
Journal: Perspectives on Medical Education
Volume: 10
Issue number: 1
Early online date: 21 Oct 2020
DOIs
Publication status: Published - Jan 2021

Keywords

  • Programmatic assessment
  • Knowledge synthesis
  • Health care professions education
  • Assessment system
  • Competence
  • Feedback
  • Students
  • Model
  • Workplace
  • Outcomes
