[FM-India] 1st Intl. Competition of Software for Runtime Verification: call for participation

Madhavan Mukund madhavan at cmi.ac.in
Tue Nov 26 08:24:38 IST 2013

 From: Runtime Verification <rvconference at gmail.com>
 To: Runtime Verification <rvconference at gmail.com>
 Date: Mon, 25 Nov 2013 21:37:11 +0100
 Subject: 1st Intl. Competition of Software for Runtime Verification: call for participation
 [Apologies for duplicates]
 *1st Intl. Competition of Software for Runtime Verification (CSRV-2014)*
 *held with RV 2014 in Toronto, Canada*
 CSRV-2014 is the *1st International Software Runtime Verification
 Competition* as a part of the 14th International Conference on Runtime
 Verification. The event will be held in September 2014, in Toronto, Canada.
 CSRV-2014 will draw attention to the invaluable effort of software
 developers and researchers who contribute to this field by providing the
 community with new or updated tools, libraries and frameworks for the
 instrumentation and runtime verification of software.
 Runtime Verification is a verification technique for the analysis of
 software at execution time based on extracting information from a running
 system and checking if the observed behaviors satisfy or violate the
 properties of interest. During the last decade, many important tools and
 techniques have been developed and successfully employed. However, there is
 a pressing need to compare such tools and techniques, since we currently
 lack a common benchmark suite as well as scientific evaluation methods
 to validate and test new prototype runtime verification tools.
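 As a concrete illustration of the technique described above (this sketch is
 ours, not part of the call; all names are hypothetical), an online monitor
 can be a small finite-state machine that consumes events emitted by the
 running program and flags a violation of a property such as "a file must be
 opened before it is read, and never read after it is closed":

```java
import java.util.List;

// Hypothetical finite-state monitor for the property:
// "read" is only allowed while the file is open.
public class FileMonitor {
    enum State { CLOSED, OPEN, VIOLATED }
    private State state = State.CLOSED;

    // Feed one observed event to the monitor.
    public void step(String event) {
        if (state == State.VIOLATED) return;   // violation is a sink state
        switch (event) {
            case "open":  state = State.OPEN;   break;
            case "close": state = State.CLOSED; break;
            case "read":
                if (state != State.OPEN) state = State.VIOLATED;
                break;
        }
    }

    public boolean violated() { return state == State.VIOLATED; }

    public static void main(String[] args) {
        FileMonitor m = new FileMonitor();
        // A trace with a read after close triggers the violation.
        for (String e : List.of("open", "read", "close", "read")) m.step(e);
        System.out.println(m.violated() ? "violation" : "ok");
    }
}
```

 Offline monitoring (the third track below) runs the same kind of monitor
 over a recorded trace instead of a live event stream.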
 The main aims of CSRV-2014 competition are to:
    - Stimulate the development of new, efficient, and practical runtime
    verification tools and the maintenance of existing ones.
    - Produce a benchmark suite for runtime verification tools, by sharing
    case studies and programs that researchers and developers can use in the
    future to test and to validate their prototypes.
    - Discuss the metrics employed for comparing the tools.
    - Provide a comparison of the tools run on different benchmarks and
    evaluated using different criteria.
    - Enhance the visibility of presented tools among the different
    communities (verification, software engineering, cloud computing and
    security) involved in software monitoring.
 Please direct any enquiries to the competition co-organizers (
 csrv14.chairs at imag.fr):
    - Ezio Bartocci (Vienna University of Technology, Austria),
    ezio.bartocci at tuwien.ac.at;
    - Borzoo Bonakdarpour (University of Waterloo, Canada),
    borzoo at cs.uwaterloo.ca;
    - Yliès Falcone (Université Joseph Fourier, France),
    ylies.falcone at ujf-grenoble.fr.
 *CSRV-2014 Jury*
 The CSRV Jury will include a representative for each participating team
 and some representatives of the Demonstration Tools Committee of the
 Runtime Verification Conference.
 *Call for Participation*
 The main goal of the CSRV-2014 competition is to compare tools for
 runtime verification. We invite and encourage participation with
 benchmarks and tools. The competition will consist of three main
 tracks based on the input language used:
    - Track on monitoring Java programs (online monitoring);
    - Track on monitoring C programs (online monitoring);
    - Track on monitoring of traces (offline monitoring).
 The competition will follow three phases:
    - Benchmarks/Specification collection phase - the participants are
    invited to submit their benchmarks (C or Java programs and/or traces). The
    organizers will collect them in a common repository (publicly available).
    The participants will then train their tools using the shared benchmarks;
    - Monitor collection phase - the participants are invited to submit
    their monitors. Participants whose tools/monitors meet the
    qualification requirements (see the dedicated pages for more
    information) will qualify for the evaluation phase;
    - Evaluation phase - the qualified tools will be evaluated by running
    the benchmarks and will be ranked using different criteria (e.g., memory
    utilization/overhead, CPU utilization/overhead, ...). The final results
    will be presented at the RV 2014 conference.
 Please refer to the dedicated pages for more details on the three phases.
 *Important Dates*
 *Dec. 15, 2013* - Declaration of intent (by email to
 csrv14.chairs at imag.fr).
 *March 1, 2014* - Submission deadline for benchmark programs and the
 properties to be monitored.
 *March 15, 2014* - Tool training starts by participants.
 *June 1, 2014* - Monitor submission.
 *July 1, 2014* - Notifications and reviews.
