Hierarchical modeling of agreement

Sophie Vanbelle*, Timothy Mutsvari, Dominique Declerck, Emmanuel Lesaffre

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer review

14 Citations (Web of Science)


Kappa-like agreement indexes are often used to assess the agreement among examiners on a categorical scale. Their particularity is that they correct the observed level of agreement for the effect of chance. In the present paper, we first define two agreement indexes belonging to this family in a hierarchical context. In particular, we consider the cases of a random and of a fixed set of examiners. We then develop a method to evaluate the influence of covariates on these indexes: the agreement indexes are related directly to a set of covariates through a hierarchical model. We obtain the posterior distribution of the model parameters in a Bayesian framework. We apply the proposed approach to dental data and compare it with the generalized estimating equations approach.
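As a minimal illustration of the chance correction that kappa-like indexes perform (this sketch shows plain Cohen's kappa for two fixed raters, not the hierarchical Bayesian extension developed in the paper; the data below are hypothetical):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on a categorical scale.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion
    of agreement and p_e the agreement expected by chance alone.
    """
    n = len(ratings_a)
    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Two examiners scoring ten subjects on a binary scale
# (e.g. caries present = 1 / absent = 0); illustrative data only.
a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(a, b), 3))  # 0.583: 80% raw agreement, 52% expected by chance
```

Note how the raw 80% agreement shrinks to a kappa of about 0.58 once the 52% agreement expected from the raters' marginal frequencies is subtracted out.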
Original language: English
Pages (from-to): 3667-3680
Journal: Statistics in Medicine
Issue number: 28
Publication status: Published - 10 Dec 2012


  • Cohen's kappa
  • intraclass
  • reliability
  • multilevel
  • Markov chain Monte Carlo
  • nested
  • rater
