Reproducibility and Replicability Details


Instructions for computational reproducibility

Computational reproducibility is the ability to duplicate the results of a prior study using the same data and procedures as the original investigator. Reproducibility is typically assessed by running the same computer code (possibly rebuilt from scratch), but it can also be achieved using a different software package.

We encourage researchers interested in reproducing the analytical results of a study to use the Social Science Reproduction Platform (SSRP) and the Guide for Accelerating Computational Reproducibility in the Social Sciences.

Instructions for replication

I4R is actively looking for replicators to replicate published studies listed here. Please reach out to us if you would like to replicate a study or if you replicated a study that is not listed. See here for definitions and types of replication.

In what follows, we provide detailed instructions for replicators. See the Guide for Accelerating Computational Reproducibility in the Social Sciences for more details. Please note that the study assigned to you has already been successfully reproduced by our team of collaborators (i.e., we ran the code), allowing you to focus on conducting sensitivity analyses or replicating the results using the raw data (if available). That said, you may still find coding errors, and if you do, you should report them in your replication paper.

The entire replication may be done using the SSRP. We recommend this platform, especially for less experienced researchers. See below for more details.

Steps:

Scoping: You should first familiarize yourself with the original study and identify the scientific claims (i.e., a single major finding from a published study), methodology, and results that you will be replicating. Note that a claim may be causal or descriptive. How should you select claims to replicate? There are three possibilities: (1) select claims for all hypothesis tests in the original study, (2) select claims mentioned in the abstract, or (3) select claims for what the original author(s) state is the main result of the paper.

Assessment: In this stage, you will open the data and code/programs and identify the available reproduction materials associated with your selected claims. You will also assess the computational reproducibility of the selected claims or the overall paper. Again, our team has already verified computational reproducibility, but you may still find coding errors.

Pre-analysis plan and pre-registration: We strongly recommend that replicators write and pre-register a pre-analysis plan (PAP) before conducting the replication exercise. (This step is done automatically when using the SSRP.) Your replication report should state precisely which of your re-analyses were (and were not) included in your PAP. Your PAP may be pre-registered on any platform, including the Open Science Framework.

Replication/Improvement: Interested researchers may also use our Template for writing up their replication paper.

Please contact us if you have any questions, suggestions, or comments.

Social Science Reproduction Platform (SSRP)

The SSRP streamlines the process of assessing and improving the computational reproducibility of published research. It can also be used to facilitate reproductions as class assignments in applied social science courses, allowing students to learn about fundamental concepts, research methods, and tools for reproducible research. The SSRP is free and open to all social science researchers interested in advancing the reproducibility of research. Beyond the classroom, the SSRP can be used by researchers interested in auditing the reproducibility of their own or others' work.

The SSRP allows users to:

(1) Assess and improve the reproducibility of published work, creating real, citable scientific contributions.

(2) Provide and receive constructive feedback from peers and original authors.

(3) Access and contribute to the creation of meaningful metrics of the reproducibility of social science research.

Guide for Accelerating Computational Reproducibility in the Social Sciences (ACRe Guide)

The ACRe Guide sets out a common approach, terminology, and standards for conducting reproductions on the Social Science Reproduction Platform (SSRP) and is meant to be used in conjunction with the SSRP. Reproducers can find detailed guidance for each of the reproduction stages, including Selecting a Paper, Scoping, Assessment, Improvements, and Robustness, as well as guidance and resources for constructive communication with original authors. Instructors using the SSRP for teaching purposes should reference this chapter for guidance on typical use cases, tips for planning assignments, and sample grading strategies.

We welcome feedback and direct contributions from all readers on all parts and aspects of the ACRe Guide. If you wish to provide feedback on specific chapters or sections, navigate to the page where you'd like to submit your feedback and click the "edit" icon at the top of the page (this will prompt you to sign into or create a GitHub account), after which you'll be able to suggest changes directly in the text. Please submit your suggestions using the "create a new branch and start a pull request" option and provide a summary of your proposed changes in the description of the pull request. The SSRP team will review all suggested changes and decide whether to "push" them to this Guide. We will acknowledge major contributions on the ACRe Guide cover page according to the Contributor Roles Taxonomy (CRediT). Learn more here and contact ACRE@berkeley.edu for further questions.