
Standardize and continuously update quality criteria for research software, including checklists/review guidelines #25

Open
dr-eric-jensen opened this issue Aug 24, 2023 · 0 comments
Labels
  • Activity: Organizing / Action (this involves organizing, capacity building, or other practical actions to advance research software)
  • Activity (this is a policy activity)
  • Effort: Medium (1-3 person-years estimated to deliver this)
  • Effort: Small (less than 1 person-year estimated to deliver this)
  • Topic: Quality (this covers topics relating to software quality)

Comments


dr-eric-jensen commented Aug 24, 2023


Potential Activity Scope
Establish shared standards for peer review of research software, with checklists and review guidelines that help clarify those standards. This is an essential activity for enhancing the evaluation of software within the scientific community. Delivering this activity may involve the following steps:

  • Understanding the Scope: The initiative recognizes that software peer review occurs at various levels and may require different approaches and criteria. The goal is to create a widely used tiered system of checklists and guidelines that reflects these different levels, ensuring consistency, rigor, and relevance across the evaluation process.
  • Tiered Approach to Standards: The tiered approach allows for flexibility and differentiation in shared standards, acknowledging that software may be reviewed for different purposes, audiences, or stages of development. Each tier may have its own checklist, guidelines, and expectations, providing a clear and tailored framework for evaluation (see the illustrative sketch after this list).
  • Leveraging Existing Information and Standards: Recognizing that many journals and other resources may already have information, standards, or practices related to software review, the initiative seeks to leverage this existing knowledge. This involves researching, analyzing, and integrating relevant materials, avoiding duplication, and building on proven methods and insights.
  • Collaboration and Customization of Standards: Agreeing on effective checklists, guidelines, and rating systems requires collaboration with various stakeholders, including software developers, reviewers, editors, and users. It may also involve customizing the tools for specific disciplines, technologies, or platforms, ensuring that they are fit for purpose.
  • Training and Implementation of Shared Standards: Creating the shared standards and supporting tools is just the first step; they need to be implemented, used, and supported effectively. This may include training, resources, mentoring, and ongoing engagement with the review community, building capacity and confidence.
  • Monitoring and Improvement of Shared Standards: If resources are available, it will be important to plan for continuous monitoring, feedback, and improvement, ensuring that the checklists, guidelines, and rating system remain up-to-date, effective, and responsive to the evolving landscape of software development and research.
  • Dissemination and Adoption of Shared Standards: Finally, this activity involves disseminating the tools widely and, ideally, working with journals, conferences, institutions, and other platforms to adopt and endorse them. This step would help to create a shared and recognized standard for software review, enhancing transparency, credibility, and collaboration across the field.
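As a purely illustrative sketch, a tiered checklist could eventually be expressed as structured data that reviewers or tooling evaluate against a submission. The tier names and criteria below are hypothetical placeholders, not proposed standards:

```python
# Hypothetical sketch of a tiered review checklist; tier names and
# criteria are illustrative placeholders, not agreed standards.

TIERED_CHECKLIST = {
    "tier-1-basic": [
        "open licence file present",
        "README describes purpose and basic usage",
        "installation instructions provided",
    ],
    "tier-2-community": [
        "automated tests included",
        "contribution guidelines published",
        "versioned releases with a change log",
    ],
    "tier-3-archival": [
        "software archived with a persistent identifier (e.g. a DOI)",
        "citation metadata (CITATION.cff or similar) provided",
        "independent review documented",
    ],
}


def highest_tier_met(criteria_met: set[str]) -> str | None:
    """Return the highest tier whose criteria are all satisfied.

    Tiers are treated as cumulative: evaluation stops at the first
    tier with an unmet criterion.
    """
    highest = None
    for tier, criteria in TIERED_CHECKLIST.items():
        if all(item in criteria_met for item in criteria):
            highest = tier
        else:
            break
    return highest


if __name__ == "__main__":
    met = {
        "open licence file present",
        "README describes purpose and basic usage",
        "installation instructions provided",
        "automated tests included",
    }
    print(highest_tier_met(met))  # prints "tier-1-basic"
```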

Potential Objectives

  • To understand the various levels and needs of software peer-review, recognizing the complexity and diversity of software within different scientific and technological domains.
  • To develop a tiered system of checklists and guidelines that reflect different levels of software review standards, providing a clear and tailored framework for evaluation.
  • To leverage existing information, standards, or practices related to software review from journals and other resources, building on proven methods and insights.
  • To collaborate with various stakeholders, including software developers, reviewers, editors, and users, to ensure that the tools are fit for purpose, responsive to real needs, and able to uphold shared standards.
  • To provide training, resources, mentoring, and ongoing engagement with the review community, building capacity and confidence in using the shared standards.
  • To monitor, gather feedback, and continuously improve the shared standards and supporting tools, ensuring that they remain up-to-date, effective, and aligned with the evolving landscape of software development and research.
  • To disseminate the shared standards and tools widely and work with journals, conferences, institutions, and other platforms to adopt and endorse them, creating a shared and recognized standard for software review.

Targeted Impacts

  • To elevate the status, rigor, and transparency of shared standards for software review within the scientific and technological community, aligning software review with the principles and practices of professional peer review.
  • To empower software developers, reviewers, and users to engage in a more structured, meaningful, and accountable review process, enhancing the quality, credibility, and impact of software.
  • To foster a more collaborative, inclusive, and innovative ecosystem for software development and research, bridging disciplines, technologies, platforms, and perspectives.

This potential activity was curated as part of "Charting the Course: Policy and Planning for Sustainable Research Software," a Sloan Foundation-funded project within URSSI dedicated to supporting the future of research software through evidence-informed policy work (project contacts: @danielskatz and @dr-eric-jensen). If you are interested in working on this, please add a comment.

@dr-eric-jensen added the Effort: Small, Effort: Medium, and Activity: Organizing / Action labels on Aug 24, 2023
@danielskatz added the Activity label on Aug 30, 2023
@dr-eric-jensen added the Topic: Quality label on Aug 31, 2023