[Om-announce] IEEE ICSA 2022: Artifact Evaluation Track

Dalila Tamzalit dalila.tamzalit at univ-nantes.fr
Wed Jul 28 09:55:25 CEST 2021


The IEEE International Conference on Software Architecture (ICSA) 2022
is on the way! Please note that deadlines are earlier than usual:
https://icsa-conferences.org/2022/

Artifact Evaluation Track – ICSA 2022

----------------------------------------------------

CFP ICSA 2022 - Artifact Evaluation Track

12-15 March | Honolulu, Hawaii (USA)

https://icsa-conferences.org/2022/

----------------------------------------------------

ICSA 2022 will be the second edition of ICSA to feature an Artifact
Evaluation Track, which reviews and promotes the research artifacts of
accepted papers.

Artifacts can be software systems, scripts, or datasets. High-quality
artifacts of published research papers are a foundation for other
researchers to reproduce the results and are thus a desirable part of
the publication itself.

The artifact evaluation system is based on the upcoming NISO Recommended
Practice on Reproducibility Badging and Definitions, which is supported
by our publisher, IEEE. The evaluation criteria are also inspired by
ACM’s artifact review and badging system and by the criteria used at
ICSE 2021.

Who can submit?

------------------------------------

Authors of papers accepted to the following ICSA 2022 tracks are
invited to submit an artifact for evaluation for the Research Object
Reviewed (ROR) – Reusable badge and the Open Research Object (ORO)
badge: Technical Track, Journal-First Track, Software Architecture in
Practice Track, and New and Emerging Ideas Track. All authors of papers
related to the topics mentioned in the call for papers of the ICSA
Technical Track are invited to submit studies for the Results Reproduced
(ROR-R) and Results Replicated (RER) badges.

In addition, authors of any prior software architecture research are
invited to submit an artifact for evaluation for the Results Reproduced
(ROR-R) and Results Replicated (RER) badges.

Please note that we require one author of each artifact submission to 
peer-review 2-3 other artifacts!

Candidate Artifacts

------------------------------------

Artifacts of interest include (but are not limited to) the following:

- Software, i.e., implementations of systems or algorithms that are
potentially useful in other studies.

- Data repositories, i.e., data (e.g., logging data, system traces,
raw survey data) that can be used by multiple software engineering
approaches.

- Frameworks, i.e., tools and services illustrating new approaches to
software engineering that other researchers could use in different
contexts.

This list is not exhaustive. If a proposed artifact is not on it, the
authors are asked to email the chairs before submitting. Data sharing
principles and approaches, together with the general notion of open
science, are introduced in the book chapter “Open Science in Software
Engineering” by Méndez, Graziotin, Wagner, and Seibold:
https://arxiv.org/abs/1904.06499.

The artifact judged best by the reviewers will receive the best
artifact award.

For accepted ICSA 2022 papers, we plan to include the awarded badge on
the paper in the official IEEE proceedings.

Evaluation Criteria and Badges

------------------------------------

The evaluation criteria for badges and additional information are taken
from ICSE 2021; these are based on the ACM policy on Artifact Review
and Badging (Version 1.1) and on the upcoming NISO Recommended Practice
on Reproducibility Badging and Definitions, which is supported by our
publisher, IEEE.

The ICSA artifact evaluation track uses a single-blind review process. 
The reviewers for the artifacts will be a combination of artifact 
authors and ICSA program committee members.

Artifacts will be evaluated against the criteria listed below. The goal
of this track is to encourage reusable research products.

In order to obtain any badge, every artifact must be Functional, that
is, properly documented, consistent, complete, exercisable, and
accompanied by appropriate evidence of verification and validation. The
badges to be awarded and their evaluation criteria are the following:

RESEARCH OBJECT REVIEWED (ROR) – REUSABLE: Functional + very carefully 
documented and well-structured to the extent that reuse and repurposing 
is facilitated. In particular, norms and standards of the research 
community for artifacts of this type are strictly adhered to.

OPEN RESEARCH OBJECT (ORO): Functional + placed on a publicly accessible
archival repository. A DOI or link to this repository along with a
unique identifier for the object is provided. (This matches the ACM
“Available” badge.)

RESULTS REPRODUCED (ROR-R): ROR + ORO + the main results of the paper
have been obtained in a subsequent study by a person or team other than
the authors, using, in part, artifacts provided by the authors.

RESULTS REPLICATED (RER): ROR + ORO + the main results of the paper have 
been independently obtained in a subsequent study by a person or team 
other than the authors, without the use of author-supplied artifacts.

Papers with such badges contain functional and reusable products that 
other researchers can use to bootstrap their own research. Experience 
shows that such papers earn increased citations and greater prestige in 
the research community.

General information and further suggested reading about artifact
evaluation can be found in the public draft standard by Baldassarre,
Ernst, Hermann, and Menzies.

Important Dates

------------------------------------

Artifact Evaluation Registration Deadline: January 11, 2022 (mandatory)

Artifact Evaluation Submissions Deadline: January 14, 2022

Artifact Evaluation Notification: January 31, 2022

Submission and Review

------------------------------------

Please see details at:
https://icsa-conferences.org/2022/conference-tracks/artifact-evaluation-track/

Organization

------------------------------------

Elena Navarro, University of Castilla-La Mancha, Artifact Evaluation Chair

Peng Xin, Fudan University, Artifact Evaluation Chair



-- 

Dalila Tamzalit
Maître de Conférences HC HdR - Associate Professor
Page web <http://dalila-tamzalit.com/fr/accueil/> - Home page <http://dalila-tamzalit.com/>



Laboratoire des Sciences du Numérique de Nantes - CNRS UMR 6004
NaoMod team <https://www.ls2n.fr/equipe/naomod/>
   4 rue Alfred Kastler, 44307 Nantes Cedex 3 - FRANCE
   +33 (0)2.51.85.80.54

<http://www.ls2n.fr>

IUT Nantes - Département Informatique
   3 rue du Maréchal Joffre
   44041 Nantes Cedex 1 - FRANCE
   Tel: +33 (0)2.40.30.60.57

