Reproducibility in Management Science

Miloš Fišar*, Ben Greiner, Christoph Huber, Elena Katok, Ali I. Ozkes, Management Science Reproducibility Collaboration

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

With the help of more than 700 reviewers, we assess the reproducibility of nearly 500 articles published in the journal Management Science before and after the introduction of a new Data and Code Disclosure policy in 2019. When considering only articles for which data accessibility and hardware and software requirements were not an obstacle for reviewers, the results of more than 95% of articles under the new disclosure policy could be fully or largely computationally reproduced. However, for 29% of articles, at least part of the data set was not accessible to the reviewer. Considering all articles in our sample reduces the share of reproduced articles to 68%. These figures represent a significant increase compared with the period before the introduction of the disclosure policy, when only 12% of articles voluntarily provided replication materials, of which 55% could be (largely) reproduced. Substantial heterogeneity in reproducibility rates across different fields is mainly driven by differences in data set accessibility. Other reasons for unsuccessful reproduction attempts include missing code, unresolvable code errors, weak or missing documentation, and software and hardware requirements and code complexity. Our findings highlight the importance of journal code and data disclosure policies and suggest potential avenues for enhancing their effectiveness.
Original language: English
Pages (from-to): 1343-1356
Number of pages: 14
Journal: Management Science
Volume: 70
Issue number: 3
Early online date: 22 Dec 2023
DOIs
Publication status: Published - Mar 2024

Keywords

  • reproducibility
  • replication
  • crowd science
