Abstract
Academic evaluation is a critical component of research, and the interplay between quantitative and qualitative assessment has become a prominent focus. This study examines the relationship between peer review scores and citations in the context of open peer review. Using data from the OpenReview platform for papers submitted to the International Conference on Learning Representations (ICLR), we classified papers into oral presentations, poster presentations, and rejected manuscripts. Weighted review scores were computed using reviewers' confidence scores, and their relationship with citations was analyzed using correlation and regression techniques. The findings reveal significant differences among the three categories in both review scores and citations, and demonstrate a positive correlation between review scores and citations. In addition, papers with greater inconsistency among reviews tended to receive more citations, and reviewers of rejected papers reported significantly higher confidence in their assessments than reviewers of accepted papers. These results highlight the alignment between peer review and traditional bibliometric indicators under open peer review. However, the degree of concordance between the two evaluation methods is modest, suggesting that they are not interchangeable; traditional bibliometric indicators should therefore be regarded as an essential complement to peer review. Furthermore, in terms of the consistency between quantitative and qualitative assessments and the confidence levels of reviewers, open peer review appears more effective than traditional peer review in addressing the problem of "poor selection".
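The confidence-weighting step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the weighting scheme (confidence-weighted mean of review scores) and all paper data below are assumptions for demonstration, and the correlation is a plain Pearson coefficient standing in for the paper's correlation analysis.

```python
def weighted_score(scores, confidences):
    """Confidence-weighted mean of one paper's review scores.

    Each score is weighted by the reviewer's self-reported confidence,
    so high-confidence reviews count more (an assumed scheme).
    """
    return sum(s * c for s, c in zip(scores, confidences)) / sum(confidences)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical papers: (review scores 1-10, reviewer confidences 1-5, citations)
papers = [
    ([8, 7, 9], [4, 3, 5], 120),
    ([5, 6, 5], [5, 4, 4], 30),
    ([3, 4, 3], [5, 5, 4], 10),
]
weighted = [weighted_score(s, c) for s, c, _ in papers]
citations = [cit for _, _, cit in papers]
r = pearson(weighted, citations)  # positive in this toy data
```

In a real analysis the score and confidence scales would come from the ICLR review forms on OpenReview, and the correlation would be computed per acceptance category (oral, poster, rejected).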
Original language | English |
---|---|
Pages (from-to) | 4721-4740 |
Number of pages | 20 |
Journal | Scientometrics |
Volume | 129 |
Issue number | 8 |
Early online date | 1 Jul 2024 |
DOIs | |
Publication status | Published - Aug 2024 |
Keywords
- Reviewer score
- ICLR
- OpenReview
- Citation
- Open peer review
- BIBLIOMETRIC INDICATORS
- INTERRATER RELIABILITY
- PREDICTIVE-VALIDITY
- PEER REVIEWS
- MANUSCRIPT
- RECOMMENDATIONS
- PERFORMANCE
- CHALLENGES
- MATTER