Piloting an Open and Reusable Service of Reproducibility Checks
A fundamental requirement of science is transparency. In the empirical quantitative social sciences, limited transparency implies that incentives for negligent data work or even outright fraud may be unacceptably high. A key element of transparency is the possibility of replicating results. In empirical economics, the core discipline considered here, replication requires access to the data and programming code used in the original study. Reproducibility, that is, the possibility of reproducing the numbers, tables and figures in a publication using the original data and code, is a basic criterion for transparency.

Technologies to ensure reproducibility exist, but the change in skills, workflows, incentives, financing, and cultural norms needed for their implementation and diffusion is slow. Funding organizations have been placing increasing emphasis on research integrity and data management plans, thus endorsing reproducibility, and national and European science policies are developing strategies that support it. Still, to date neither funding agencies nor national science policies mandate fully reproducible practices. Learned societies play an important role in disseminating best practices within disciplines. A best practice in economics is the policy of the American Economic Association (AEA), which imposes systematic reproducibility requirements on all submissions to its journals.

This project pilots a service that allows the reproducibility of research submitted to economics journals to be verified without placing the burden of developing and operating the checks entirely on the journals themselves. It develops technological and organizational solutions for checking and certifying reproducibility in an easy-to-teach, efficient, pragmatic and scalable way. Usability in institutional contexts beyond journals will be investigated as well. The concept of this service represents a central missing element in the development of system-wide reproducibility policies. It will be piloted in cooperation with scientific journals using ZBW’s Journal Data Archive and in cooperation with research institutions. Project results will be made available for reuse under Open Science principles. The main project outcomes will be:
● an easy-to-teach concept for training students to conduct effective reproducibility checks for journals and research institutions, drawing on the unique competency of the AEA and its first Data Editor;
● an efficient, semi-automated infrastructure for reproducibility checks, piloted in the ZBW Journal Data Archive serving journals;
● a pragmatic approach that integrates discipline-specific best practices into coherent workflows, focusing on public-use data and widely used statistical software;
● a strategy for scaling up reuse of the project outputs through (a) external cooperation within the project, (b) workshops for wider stakeholder engagement using ZBW’s large network at all levels of the policy development cycle, and (c) a dedicated strategy for the long-term availability of successful project outcomes through ZBW.
Project start:
1 January 2025
Project end:
31 December 2027
Project management:
Dr. Philipp Breidenbach
Funding:
Volkswagen Foundation (VW-Stiftung)