
Reproducible Research in JASA: 5 Years On

1 August 2021

Readers of the July 2016 issue of Amstat News may have noticed a brief article, titled “Reproducible Research in JASA,” that introduced an initiative to increase the reproducibility of manuscripts published in JASA Applications and Case Studies (ACS). Spearheaded by Montserrat Fuentes, then editor of Applications and Case Studies, the JASA ACS reproducibility initiative responded to what was widely viewed as a reproducibility/replication “crisis” in the scientific literature. Indeed, most statistical papers did not provide the materials necessary to reproduce their results. For example, in the first half of 2016, fewer than 20 percent of papers published in JASA ACS provided any supporting code or data that would enable reproduction of their results.

Success of the initiative was by no means guaranteed, as similar reproducibility-focused efforts in statistics journals had produced mixed results. However, the adoption of reproducibility standards by high-profile scientific and subject-area journals indicated that such efforts could indeed shift expectations for transparency.

Fuentes felt strongly that such an initiative was needed to promote more reproducible analyses in the statistical literature.

Thanks to the effort and dedication of editors, associate editors, reviewers, and journal staff, the JASA reproducibility initiative is celebrating its five-year anniversary, and its future looks promising. In fact, the JASA editorial team recently endorsed a proposal to adapt the initiative for Theory and Methods manuscripts. As a result, starting on September 1, 2021, all original research manuscripts submitted to JASA will undergo reproducibility review, with authors required to provide their reproducibility materials when invited to revise their initial submission.

From its launch as a somewhat amorphous concept, the JASA reproducibility initiative has evolved into an efficient system consisting of two parts: 1) a set of concrete guidelines and resources that support authors in assembling and disseminating the code, data, and documentation needed to ensure the reproducibility of their manuscripts and 2) a formal and structured review process that provides feedback to authors on the materials they submit.

The core of the first part of the system is the author contributions checklist (ACC) form. This form, to be completed by all authors who are invited to revise their initial submission, provides a detailed list of what materials should be included to document reproducibility and asks authors to describe these materials. The final version of the form is published alongside each manuscript, providing a “key” for how to make use of the reproducibility materials. Code and data, in addition to being provided as supplementary materials on the journal website, are archived at JASA-specific GitHub and DataVerse repositories.

ACC forms and accompanying materials are reviewed by an associate editor for reproducibility (AER) in parallel with the usual manuscript review process, ensuring the final set of reproducibility materials is usable and well-documented. The initial group of three AERs who helped launch the initiative (Christopher Paciorek, Victoria Stodden, and Julian Wolfson) was expanded to six in 2018 (Lorin Crawford, Jeff Goldsmith, Michael Kane, Christopher Paciorek, Cheng Yong Tang, and Julian Wolfson). The AER group will welcome two new members, Stephanie Hicks and Julia Wrobel, later this year.

Over the past five years, the reproducibility initiative has gone through several iterations, each of which has been informed by robust discussion within the AER team and with JASA’s editorial team. Along with policy changes, this has resulted in openly available materials, now presented in the JASA Reproducibility Guide, including guidelines for authors and example supplementary materials whose results are well-documented and easily reproducible. Each of these iterations has attempted to increase access to the methods, results, and implementations being presented without creating undue burden on the editors, reviewers, or authors. The resulting reproducibility process increases transparency and addresses a major barrier to the reproducibility and replicability of statistical methods research: the absence of well-documented code and data.

What’s next for the reproducibility initiative? Beyond the expansion to JASA Theory and Methods, the AER team is engaging in discussions with editorial teams from other journals to coordinate efforts around reproducibility. They are also looking for ways to make the reproducibility review process more efficient, transparent, and rigorous. Feedback from the statistical community is welcome and should be sent to jasa.app.cs.aer@gmail.com.
