Pre-publication Code Review
When you submit a working paper or publication, DIME Analytics will support the review and release of the reproduction data and code via the World Bank GitHub. The goal is to ensure that research outputs from DIME are fully reproducible. To request this review, complete the Code Review Checklist and submit it with the replication materials to firstname.lastname@example.org alongside the submission of the working paper to the peer review process (currently organized by Dan Rogger). Please include the completed checklist as an exhibit or appendix to your working paper, since these materials directly attest to the transparency and credibility of your research project.
The Analytics team will get back to you within two weeks of receiving the complete package. We will edit only the top-level directory global in the master do-file, then run it to reproduce the results. We will confirm that your files run and that the raw outputs are re-created exactly as you supplied them; where possible, we will also check the outputs against the publication to confirm there are no transcription errors.
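The reason only one line should need editing is that a well-structured master do-file defines every other path relative to a single top-level directory global. A minimal sketch of this convention is below; the folder and file names are hypothetical and will differ in your project:

```stata
* Master do-file: the only line a reviewer should need to edit
* (illustrative sketch; folder and do-file names are placeholders)
global projectfolder "C:/Users/yourname/Dropbox/ProjectName"

* All other paths are built from the top-level global,
* so the code runs unchanged on any machine
global rawdata   "${projectfolder}/DataWork/Raw"
global cleandata "${projectfolder}/DataWork/Final"
global outputs   "${projectfolder}/Outputs"

* Run the full project from start to finish
do "${projectfolder}/DataWork/Dofiles/cleaning.do"
do "${projectfolder}/DataWork/Dofiles/analysis.do"
```

If any do-file hard-codes an absolute path outside this structure, the reviewer cannot reproduce your results by editing a single line, so this is one of the first things checked.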
We will return a list of any specific reproducibility or execution errors we encounter, along with general suggestions for code improvements and places where existing programs can save time and effort in your future work. We are happy to provide more specific suggestions for improving the function and readability of your code if you have questions.
Once this is complete, we can help you organize a public release repository on GitHub (such as this example). Let us know if you would like help setting this up. Many journals now require data and code to be made publicly available, and the World Bank GitHub and the Microdata Catalog or Development Data Hub are the recommended resources for this.
Peer Code Review
It is also possible to request a review of a Research Assistant or Field Coordinator's code during the development of a project. This is recommended when a project milestone is reached, such as:
- Handing over a project
- Finishing code for sampling and/or randomization
- Finishing the cleaning of a round of data collection
- Finalizing data analysis for a paper or report
- Preparing data for microdata catalog submission (more info on that in the microdata catalog and microdata submission checklist articles from the Wiki)
The project review is conducted by another RA or FC, and people who submit their code are also required to review a peer's code. The goals of this exercise are to (1) make sure all work is reproducible, (2) reduce coding mistakes, (3) encourage adoption of best practices, and (4) create an opportunity to learn new coding skills from other people's code.
Project code review is currently being piloted but is expected to happen once a month. Participants will be asked to share their code with the reviewer on a given date and will have a week to review their assigned project, following guidelines provided by DIME Analytics.