Artifact Evaluation Track
ICPE 2023 welcomes the submission of artifacts to the ICPE Artifact Track. According to ACM’s “Result and Artifact Review and Badging” policy, an “artifact” is “a digital object that was either created by the authors to be used as part of the study or generated by the experiment itself […] software systems, scripts used to run experiments, input datasets, raw data collected in the experiment, or scripts used to analyze results.” A formal review of such artifacts not only ensures that the “study is repeatable” by the same team; if the artifacts are available online, other researchers “can replicate the findings” as well, and the artifacts can be reused by other teams.
In this spirit, the ICPE 2023 Artifacts Track exists to review, promote, share, and catalog the research artifacts of interest to the Performance Engineering community.
Types of artifact submissions:
- Research paper artifacts (accompanying accepted papers in the Research or Industry track)
  - The accompanying research paper has already been accepted for the proceedings.
  - The research paper will acquire a badge upon artifact acceptance.
- Tool artifact
  - A paper of up to 4 pages in ACM format, not including references, that describes the artifact, its utility, its benefits, and success stories of its use. The paper must provide a link from which the artifact can be downloaded. At submission time, a non-permanent link (e.g., to a personal page or GitHub project) is accepted. An extra page is allowed for references.
  - This category includes tools, libraries, scripts, benchmarks, and frameworks. The authors must provide instructions for installing (including software and hardware requirements) and using the artifact, either on the website from which the artifact is downloaded or in the same bundle.
- Data artifact
  - A paper of up to 4 pages in ACM format, not including references, that clearly describes the dataset and its format, its utility and relevance for reuse by the performance engineering research community, the data acquisition process, and guidelines or examples on how to use it, so as to foster trust. The paper must provide a link from which the dataset can be downloaded, along with success stories of its use. At submission time, a non-permanent link (e.g., to a personal page or GitHub project) is accepted. An extra page is allowed for references.
To facilitate the review, the artifacts may include, but are not limited to, source code, executables, datasets, a virtual machine image, and documents (please use open formats for documents). Please make sure that your artifact is self-contained (with the exception of pointers to external tools or libraries, which we will not consider part of the evaluated artifact, but which we will try to use when evaluating it). All submitted artifacts must be open-access. By the camera-ready deadline, the artifacts must be available from a permanent URL or DOI with an archival plan, such as the Zenodo repository (a personal page is not sufficient).
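As an illustration of what a self-contained bundle might look like, the following sketch packages a hypothetical artifact directory into a single archive and records a checksum so reviewers can verify the integrity of their download. All names here are illustrative assumptions, not prescribed by the track:

```shell
# Hypothetical layout: a README with instructions plus a run script,
# bundled into one archive with a checksum for integrity verification.
set -eu
mkdir -p artifact/scripts artifact/data
printf '# Install and run instructions go here\n' > artifact/README.md
printf '#!/bin/sh\necho "running experiments"\n' > artifact/scripts/run.sh
chmod +x artifact/scripts/run.sh
tar czf artifact.tar.gz artifact
sha256sum artifact.tar.gz > artifact.tar.gz.sha256
sha256sum -c artifact.tar.gz.sha256
```

The resulting `artifact.tar.gz` (with its `.sha256` companion) is the kind of single-file bundle that can then be deposited in an archival repository such as Zenodo.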
If you require an exception from the conditions above, please email the chairs before submitting your artifact (email@example.com, firstname.lastname@example.org).
What do you get out of it?
Tool artifact and data artifact types
If your artifact is accepted, your artifact description paper will be published in the ICPE 2023 Companion Proceedings. The paper will receive one of the badges described in “Types of badges.”
Research paper artifact type
If your artifact is accepted, your research paper will receive one of the following badges, both in the text of the paper and in the ACM Digital Library.
Types of badges awarded
The types of badges that papers can receive are:
Artifacts Evaluated - Functional: The artifacts are complete, well-documented, and allow obtaining the same results as the paper.
Artifacts Evaluated - Reusable: As above, and adding that the artifacts are of such a high quality that they can be reused as is on other data sets, by other tools, or for other purposes.
Artifacts Available: For artifacts made permanently available. This will only be awarded in conjunction with one of the Artifacts Evaluated badges.
Regarding archival, all accepted artifacts will be indexed on the conference website.
How to submit?
Submissions are made via EasyChair.
Submission deadlines are listed on the important dates page.
Tool artifact and data artifact types: We request the submission of the paper (max. 4 pages in ACM format) describing the artifact. The paper must include the link from which the artifact can be downloaded. Since the paper will be published in the ICPE 2023 Companion Proceedings, it does not need to include the prerequisites or the instructions to install and execute the artifact. These instructions must be provided on the linked website or in the bundle (e.g., in a README.md file).
Research paper artifact type: We request a PDF submission that includes: a) a first page, in any format, containing a brief summary of the artifact and the link to the artifact; b) the camera-ready version of the accompanying research paper, so that the reviewers can properly judge the artifact. As with the other artifact types, the hardware and software prerequisites as well as the instructions to install and run the artifact must be provided on the linked website or in the bundle (e.g., in a README.md file).
Video: Optionally, authors of all types of artifacts are encouraged to submit a link to a short video (maximum 5 minutes) demonstrating the artifact. For tool and data artifacts, the link to the video is included in the 4-page paper. For research paper artifacts, the link should appear on the first page of the submitted PDF.
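The install-and-run instructions mentioned above could be structured in a README.md along these lines (a hypothetical sketch; the tool name, file names, and section headings are illustrative, not prescribed by the track):

```markdown
# MyTool Artifact (hypothetical example)

## Requirements
- Hardware: Linux x86-64 machine, 8 GB RAM
- Software: Python 3.10+ (or Docker 20.10+)

## Installation
    pip install -r requirements.txt

## Running the experiments
    ./scripts/run_all.sh    # reproduces the experiments from the paper

## Expected results
Output files appear in results/ and can be compared against
the reference outputs in results-reference/.
```

A structure like this lets reviewers find the prerequisites, setup steps, and expected outcomes without reading the source code first.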
To submit an artifact it is important to keep in mind: a) how accessible you are making your artifact to other researchers, and b) that the ICPE artifact evaluators will have very limited time to assess each artifact. Artifacts whose configuration and installation take an undue amount of time may be rejected. If you envision difficulties, please provide your artifact in an easily ported form, such as a virtual machine image (https://www.virtualbox.org) or a container image (https://www.docker.com).
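For the container option, a minimal Dockerfile along the following lines can bundle an artifact with its dependencies so that reviewers avoid manual setup. This is a sketch under assumptions: the base image, file names, and entry script are hypothetical placeholders, not requirements of the track:

```dockerfile
# Hypothetical sketch: package a Python-based artifact as a container
# image so reviewers can run it without installing dependencies.
FROM python:3.11-slim
WORKDIR /artifact
# Install dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the rest of the artifact (scripts, data, README).
COPY . .
# Default command reproduces the experiments.
CMD ["python", "run_experiments.py"]
```

A reviewer could then build and run it with `docker build -t artifact .` followed by `docker run artifact`, keeping the evaluation effort close to zero-configuration.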
Review Process and Selection Criteria
Each artifact will go through an anonymous review. Artifacts associated with a research paper will be evaluated in relation to the expectations set by that paper; standalone tool and data artifacts will be evaluated against their 4-page description paper. Submitted artifacts will go through a two-phase evaluation:
- Kicking the tires: reviewers check the artifact’s integrity and look for any setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, a VM that does not start, immediate crashes, or platform configuration problems). Authors are informed of the outcome of this phase and can help resolve any issues found during a brief author response period.
- Artifact assessment: reviewers evaluate the artifacts, checking whether they live up to the expectations created by the paper.
Since portability bugs are easy to make, the review committee can issue additional requests to authors during the assessment to fix such bugs in the artifact. The resulting version of the artifact is considered “final” and allows reviewers to decide on artifact acceptance and badges. For accepted tool and data artifacts, a camera-ready version of the paper will be requested; it must include a link to a permanent repository.
Artifacts will be scored using the following criteria:
Artifacts Evaluated - Functional:
- Documented: Is it accompanied by relevant documentation making it easy to use?
- Consistent: Is the artifact relevant to the associated paper (the accepted paper or the standalone 4-page artifact paper), and does it contribute in some inherent way to the generation of its main research results or motivating success stories?
- Complete: To the extent possible, are all components relevant to the research or artifact paper in question included? (Proprietary artifacts need not be included. If they are required to exercise the package then this should be documented, along with instructions on how to obtain them. Proxies for proprietary data should be included so as to demonstrate the analysis.)
- Exercisable: If the artifact is executable, is it easy to download, install, and execute? Can the included scripts and/or software used to generate the results in the associated paper be successfully executed, and can the included data be accessed and appropriately manipulated?
Artifacts Evaluated - Reusable:
- The artifacts are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated – Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to.
- Author-created artifacts have been placed on a publicly accessible archival repository. A DOI or link to this repository along with a unique identifier of the object is provided.
Instructions for Authors from ACM:
By submitting your article to an ACM Publication, you are hereby acknowledging that you and your co-authors are subject to all ACM Publications Policies, including ACM’s new Publications Policy on Research Involving Human Participants and Subjects. Alleged violations of this policy or any ACM Publications Policy will be investigated by ACM and may result in a full retraction of your paper, in addition to other potential penalties, as per ACM Publications Policy.
Please ensure that you and your co-authors obtain an ORCID ID, so you can complete the publishing process for your accepted paper.
ACM has been involved in ORCID from the start and we have recently made a commitment to collect ORCID IDs from all of our published authors.
The collection process has started and will roll out as a requirement throughout 2022.
We are committed to improving author discoverability, ensuring proper attribution, and contributing to ongoing community efforts around name normalization; your ORCID ID will help in these efforts.
Artifact Evaluation Track Chairs
- Diego Perez-Palacin, Linnaeus University, Sweden
- Katinka Wolter, Freie Universität Berlin, Germany
Submissions are to be made via EasyChair.