---
layout: default
title: Trainings
---

# Reproducible Research Training Schedule - FY25

The following training courses are planned for FY25. Courses are open to all Bank staff and consultants. Please contact [email protected] for further information or to request a registration link.

## Reproducible Research Fundamentals

This week-long course introduces best practices for reproducible research through lectures and hands-on labs. Participants learn how to implement transparent and reproducible workflows, code effectively in a team environment, and keep personal data secure throughout the lifecycle of a research project. Hands-on lab sessions, which use GitHub and Stata and/or R, give participants practice with the material covered in lectures. By the end of the course, participants will have the tools and knowledge to implement best practices for transparent and reproducible research.

- In person: September 30 - October 3, 2024
- A virtual edition is planned for March 2025

## Using GitHub for Reproducible and Transparent Research

A virtual, hands-on training for research teams interested in transitioning their work to GitHub. An introductory session explains how GitHub increases transparency and reproducibility and introduces GitHub workflows. Targeted follow-up sessions discuss in detail how different team members will interact with the platform, from observer to contributor to repository maintainer. GitHub training sessions are held quarterly. World Bank staff and consultants should email [email protected] to enroll.

## Creating a Reproducibility Package

This 1-hour seminar provides an overview of the necessary components of a reproducibility package. It offers practical guidance for preparing reproducibility packages for World Bank research. Recording: https://osf.io/hgz67

## Technical Seminars on Reproducible Research

Reproducible Research seminars are offered monthly by the Reproducibility Team. To sign up for the mailing list, World Bank staff and consultants can email [email protected]. Recordings of past seminars are linked below. Any of the seminars below can be repeated in person for a unit or department; to arrange a session, please email [email protected].

### Building a Reproducible Environment

This 1-hour seminar discusses how to create a reproducible environment in either Stata or R, focusing on the handling of user-written commands and packages. Recording: https://osf.io/q7rjt
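
The seminar's own material is not reproduced here, but as an illustration of one common approach in R, the sketch below uses the renv package to pin user-written package versions in a project-local library. Treating renv as the seminar's recommended tool is an assumption; the functions shown are the standard renv API.

```r
# Illustrative sketch only (assumes the renv package): record and restore
# exact package versions so collaborators run the same environment.

# install.packages("renv")   # one-time setup

renv::init()        # create a project-local library and an renv.lock file
# ...install and use the user-written packages the analysis needs...
renv::snapshot()    # record the exact package versions in renv.lock

# A collaborator, or a reproducibility check, later runs:
renv::restore()     # reinstall the recorded versions from renv.lock
```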

### Creating Reproducible Tables & Graphs

This 1-hour seminar provides workflows and code suggestions for automating analytical outputs using Stata or R. Recording: https://osf.io/ezmsb
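
For context, a minimal R sketch of the general idea follows: tables and graphs are written to files by the script itself, so re-running the code regenerates every output exactly. The dataset, model, and file paths are placeholders, not examples from the seminar.

```r
# Illustrative sketch only: export a table and a graph from code rather
# than copy-pasting results, so outputs regenerate when the script reruns.

data(mtcars)                                # built-in example dataset
model <- lm(mpg ~ wt, data = mtcars)        # simple placeholder regression
dir.create("outputs", showWarnings = FALSE) # folder for automated outputs

# Coefficient table written directly to a file
write.csv(summary(model)$coefficients, "outputs/regression_table.csv")

# Graph written to a file with fixed dimensions
png("outputs/mpg_vs_weight.png", width = 800, height = 600)
plot(mtcars$wt, mtcars$mpg,
     xlab = "Weight (1,000 lbs)", ylab = "Miles per gallon")
abline(model)
dev.off()
```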

### Coding Reproducible Random Processes

This 1-hour seminar identifies common Stata commands and practices that involve random processes and discusses how to ensure the stability of reproducibility packages that include them. Recording: https://osf.io/2bxn7
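
The seminar focuses on Stata; as a language-agnostic illustration, the R sketch below shows the core principle of fixing the random-number seed before any random step so that re-running the code reproduces the same results. The seed value and sampling step are placeholders.

```r
# Illustrative sketch only: fix the random-number seed so that random
# draws are identical every time the script is run.

set.seed(20240930)                       # set the seed before any random step
sample_ids <- sample(1:1000, size = 50)  # reproducible random draw

# Re-running from the seed onward yields exactly the same draw:
set.seed(20240930)
identical(sample_ids, sample(1:1000, size = 50))  # TRUE
```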

## Reproducible Research Bootcamps

These 1- or 2-day in-person intensive trainings are offered on request to specific teams or units. The objective is to explain reproducible research practices and then run hands-on sessions directly with participants to provide customized technical support and facilitate real-time adoption. Exact content is agreed in advance with the requesting team; possible topics include creating GitHub repositories, setting up project directories, and organizing analysis through a main script.
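
As one illustration of the last point, a main script in R can look like the sketch below; the script names are hypothetical and not part of the bootcamp materials.

```r
# Hypothetical main.R: a single entry point that runs the full analysis in
# order, so all results can be reproduced with one command.
# (The script names below are placeholders, not files from a real project.)

source("code/01_clean_data.R")       # build analysis datasets from raw inputs
source("code/02_main_analysis.R")    # estimate models and run the main analysis
source("code/03_tables_figures.R")   # export all tables and graphs to outputs/
```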

## Peer Code Review

This facilitated, structured code exchange improves the quality and reproducibility of code in real time and is open to all World Bank staff and consultants. Peer review is appropriate for any stage of a research project that has a concrete code output. The diagnostic shared at the end of the process gives staff specific, actionable feedback and a multi-faceted assessment of their research assistant's coding. Participants also learn from each other how to write high-quality, reproducible code and see firsthand how different coding practices are implemented.

- In FY25, peer code reviews will be held in November, February, and May. To participate, please email [email protected].