CCGRID 2024

Call For Artifacts

CCGrid 2024 will award three IEEE reproducibility badges to papers accepted to the main Research Tracks: Open Research Objects (ORO), Reusable/Research Objects Reviewed (ROR), and Results Reproduced (ROR-R). More information about each of these badges is available below. Authors may optionally request the badge(s) for their accepted paper by submitting relevant artifact(s) and a 2-page artifact description. If their submission is successfully reviewed, they will be awarded the badge(s). The badges will appear as part of the paper in the conference proceedings.

We suggest depositing your artifacts in permanent archival repositories that follow the findable, accessible, interoperable, and reusable (FAIR) principles, such as Zenodo, Dryad, or figshare. If you are hosting your software and/or data on GitHub, you should additionally assign it a persistent identifier by linking it to Zenodo, figshare, etc.
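For example, Zenodo's GitHub integration can pick up deposit metadata from a .zenodo.json file in the repository root; a minimal sketch (all field values here are illustrative, not prescribed by CCGrid):

    {
      "title": "Example Artifact for a CCGrid 2024 Paper",
      "upload_type": "software",
      "description": "Code and scripts accompanying the paper.",
      "creators": [
        {"name": "Doe, Jane", "affiliation": "Example University"}
      ],
      "license": "MIT",
      "keywords": ["reproducibility", "artifact"]
    }

Once the repository is linked, Zenodo archives each GitHub release and assigns it a DOI that you can cite in the artifact description.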

Open Research Objects (ORO)

This badge signals that the author-created digital objects used in the research (including data and code) are permanently archived in a public repository that assigns a global identifier and guarantees persistence, and that they are made available under standard open licenses that maximize artifact availability.

Reusable/Research Objects Reviewed (ROR)

This badge signals that all relevant author-created digital objects used in the research (including data and software) were reviewed according to the following criteria.

  1. A publication awarded the ROR badge must also be awarded the ORO badge.
  2. This badge corresponds to the previously used IEEE “Code Reviewed” badge.
  3. A review of a software artifact will check the following criteria, which the artifact must satisfy:
    1. Documentation: There should be sufficient documentation for the reviewer to understand the core functionality of the software under review, including a statement of need/function, installation instructions, examples of usage, and API documentation.
    2. Functionality: Reviewers are expected to install the software they are reviewing and verify its core functionality.
    3. Testing: Minimally, the software should include documented manual steps that can be followed to objectively check its expected functionality (e.g., a sample input file to assert behavior); ideally, it should also include an automated test suite hooked up to continuous integration (GitHub Actions, Circle CI, or similar); see the first sketch after this list.
  4. A review of a data artifact will check the following criteria, which the artifact must satisfy:
    1. Documentation: Any dataset used in the paper must be adequately documented, including its context, a description, how it was sourced, and how it is used in the paper.
    2. Functionality: The documentation should be sufficient to promote reuse of the data beyond the current paper. In the case of proprietary data formats, there should be associated code to access the data elements programmatically (see the second sketch after this list).
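To illustrate the testing criterion, a minimal automated check might look like the following pytest-style sketch (a hypothetical example: the tool name, input file, and expected output are placeholders, not CCGrid requirements):

    # test_example.py -- hypothetical smoke test for a software artifact.
    # Runs the tool on a bundled sample input and asserts the expected output.
    import subprocess

    def test_sample_input_produces_expected_output():
        # "mytool.py" and "sample_input.txt" stand in for the artifact's
        # actual entry point and bundled example data.
        result = subprocess.run(
            ["python", "mytool.py", "sample_input.txt"],
            capture_output=True, text=True, check=True,
        )
        # The expected value would come from the paper or documentation.
        assert "answer=42" in result.stdout

A test like this can then be run from a continuous integration service such as GitHub Actions, so reviewers can see the checks passing in a clean environment.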
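For the data criterion, if a dataset ships in a proprietary or custom binary layout, a small accessor script is one way to provide programmatic access; a hypothetical sketch, assuming an invented record layout:

    # read_records.py -- hypothetical reader for a custom binary data format.
    # Assumes each record is a little-endian (uint32 id, float64 value) pair.
    import struct

    RECORD = struct.Struct("<Id")  # 4-byte unsigned id + 8-byte double

    def read_records(path):
        """Yield (id, value) tuples from the packed data file."""
        with open(path, "rb") as f:
            while chunk := f.read(RECORD.size):
                yield RECORD.unpack(chunk)

    if __name__ == "__main__":
        for rec_id, value in read_records("measurements.bin"):  # placeholder file
            print(rec_id, value)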

Results Reproduced (ROR-R)

This badge is awarded when evaluators have successfully reproduced the key computational results of the paper using the author-created research objects, methods, code, and conditions of analysis. The goal is not to recreate the exact results, especially when they are hardware-dependent, but rather to reproduce the reported behavior and validate the central claims of the research.

Beyond the ORO and ROR review criteria above, the ROR-R badge requires that evaluators run the submitted artifacts and obtain results consistent with those reported in the paper.

The ROR-R badge will appear next to the ROR badge in the conference proceedings, signifying that the paper has achieved this high standard of computational reproducibility.

Artifact Preparation and Submission Process

Authors seeking one or more of these badges should submit a 2-page artifact description document that includes a brief description of the artifacts and any details needed for them to be reviewed as part of the CCGrid 2024 artifact review process and then used by future readers.

For software: a link to the artifact and a description that includes the language, compilation and run environment (tools, pre-built binaries, hardware), input dataset (if available), expected output and metrics, estimated time for all compilation and run steps, etc.
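One way to make the run environment, expected output, and metrics concrete is a top-level driver script that runs the experiment and checks the headline result; a hypothetical sketch (the command, metric name, and numbers are placeholders):

    # reproduce.py -- hypothetical end-to-end driver for a software artifact.
    # Encodes the run command, the expected headline metric, and a tolerance,
    # so reviewers can verify the key result in a single step.
    import json
    import subprocess

    EXPECTED_THROUGHPUT = 1250.0  # ops/s reported in the paper (placeholder)
    TOLERANCE = 0.10              # hardware-dependent results may vary by ~10%

    # "run_benchmark.py" stands in for the artifact's actual experiment runner.
    subprocess.run(["python", "run_benchmark.py", "--out", "results.json"],
                   check=True)

    with open("results.json") as f:
        measured = json.load(f)["throughput_ops_per_s"]

    rel_err = abs(measured - EXPECTED_THROUGHPUT) / EXPECTED_THROUGHPUT
    print(f"measured={measured:.1f} ops/s, relative error={rel_err:.2%}")
    assert rel_err <= TOLERANCE, "result outside the expected range"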

For data: a link to the artifact, and a description as mentioned in the review criteria above.

Make connections between the specific artifact(s) and their role and context within the relevant parts of the accepted paper. You must also explicitly reference and cite your artifacts in this document, including a persistent identifier (e.g., a DOI from Zenodo or figshare) and, for software, optionally a link to the URL where it is developed and maintained (e.g., GitHub). Given the 2-page limit, include the key details in the description document and more exhaustive steps in the persistent artifact link.

Format this document using the IEEE 2-column conference proceedings template. If the artifacts are successfully evaluated, the authors will be allowed to add an updated 2-page version of their artifact description as an Appendix to the camera-ready paper. The review of the artifacts will follow a single-blind process.

The artifact badging process will occur between author notifications (expected February 12, 2024) and the camera-ready paper submission deadline (March 4, 2024):

Artifact submission for accepted papers: February 16, 2024
Artifact review assignments made (after bidding): February 19, 2024
Artifact review midpoint check-in (reviewers contact authors initially if needed): February 23, 2024
Artifact review deadline: March 1, 2024
Artifact review results announcements to authors: March 4, 2024

Note: Artifacts should be able to run on commodity workstations/laptops for the evaluation. If an artifact is tightly coupled to a specific class of system or requires more than a generic workstation to be meaningfully evaluated (e.g., an HPC cluster, cloud resources, specialized accelerators), authors should provide access to such an evaluation environment for the artifact reviewers to use. This prerequisite should be made clear in the artifact submission and the EasyChair abstract. The relevant credentials for the specialized resource may be shared by email with the Artifact Evaluation Committee Chairs, to be passed on to the reviewers anonymously. If you require further guidance, please get in touch with the Artifact Evaluation Committee Chairs before the submission deadline.

Please use the following EasyChair link to submit your artifacts: https://easychair.org/conferences/?conf=ccgrid2024

Chairs and contacts

Alessandro Papadopoulos, Mälardalen University, Sweden (alessandro.papadopoulos@mdu.se)
Michael R. Crusoe, Common Workflow Language, Germany
Daniel S. Katz, UIUC, USA