Abstract
This article presents lessons learnt from experiences with the assessment of professional competence. Based on Miller's pyramid, a distinction is made between established assessment technology for assessing the 'knows', 'knows how' and 'shows how' levels and more recent developments in the assessment of (clinical) performance at the 'does' level. Some general lessons are derived from research on, and experience with, the established assessment technology. Here, many paradoxes are revealed and empirical outcomes are often counterintuitive. Instruments for assessing the 'does' level are classified and described, and additional general lessons for this area of performance assessment are derived. These lessons can also be read as general principles of assessment (programmes) and may provide theoretical building blocks to underpin appropriate and state-of-the-art assessment practices.
| Original language | English |
| --- | --- |
| Pages (from-to) | 703-719 |
| Journal | Best Practice & Research Clinical Obstetrics & Gynaecology |
| Volume | 24 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - Dec 2010 |
Keywords
- assessment
- professional competence
- assessment of clinical performance
- principles of assessment (programmes)