Interrater and intrarater agreement on heart murmurs

S. Andersen*, A.H. Davidsen, H. Schirmer, H. Melbye, M. Spigt, J.C. Aviles-Solis

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

1 Citation (Web of Science)


Objective: To investigate interrater and intrarater agreement between physicians and medical students on heart sound classification from audio recordings, and factors predicting agreement with a reference classification.

Design: Intra- and interrater agreement study.

Subjects: Seventeen GPs and eight cardiologists from Norway and the Netherlands, and eight medical students from Norway.

Main outcome measures: Proportion of agreement and kappa coefficients for intrarater agreement and agreement with a reference classification.

Results: The proportion of intrarater agreement on the presence of any murmur was 83% on average, with a median kappa of 0.64 (range k = 0.09-0.86) for all raters, and 0.65, 0.69, and 0.61 for GPs, cardiologists, and medical students, respectively. The proportion of agreement with the reference on any murmur was 81% on average, with a median kappa of 0.67 (range 0.29-0.90) for all raters, and 0.65, 0.69, and 0.51 for GPs, cardiologists, and medical students, respectively. A distinct murmur, more than five years of clinical practice, and cardiology specialty were most strongly associated with agreement, with ORs of 2.41 (95% CI 1.63-3.58), 2.19 (1.58-3.04), and 2.53 (1.46-4.41), respectively.

Conclusion: We observed fair but variable agreement with a reference on heart murmurs; physician experience and specialty, as well as murmur intensity, were the factors most strongly associated with agreement.
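The kappa coefficients reported above quantify rater agreement corrected for chance. As an illustration only (the paper's exact computation and data are not reproduced here), a minimal sketch of Cohen's kappa for two raters classifying recordings as murmur present (1) or absent (0); the example labels are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items (nominal scale)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    # Kappa: observed agreement beyond chance, scaled to the maximum possible.
    return (p_o - p_e) / (1 - p_e)

# Hypothetical classifications of 10 recordings by two raters
a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
b = [1, 0, 0, 0, 1, 0, 1, 1, 0, 1]
print(round(cohens_kappa(a, b), 2))  # prints 0.6
```

Values near 1 indicate near-perfect agreement beyond chance, values near 0 indicate agreement no better than chance; the abstract's median kappas of 0.64-0.67 fall in the range conventionally read as substantial, though the wide per-rater ranges (down to 0.09) show how variable individual agreement was.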
Original language: English
Pages (from-to): 491-497
Number of pages: 7
Journal: Scandinavian Journal of Primary Health Care
Issue number: 4
Publication status: Published - Nov 2022


  • interrater
  • intrarater
  • agreement
  • heart murmur
  • heart auscultation
  • valvular heart disease
