JuliaReach/ARCH2022_AFF_RE

Repeatability evaluation package for the ARCH2022 AFF competition
ARCH2022 AFF

This is the JuliaReach repeatability evaluation (RE) package for the ARCH-COMP 2022 category report Continuous and Hybrid Systems with Linear Continuous Dynamics, part of the 6th International Competition on Verifying Continuous and Hybrid Systems (ARCH-COMP '22).

To cite the work, you can use:

@inproceedings{AlthoffFSW22,
  author    = {Matthias Althoff and
               Marcelo Forets and
               Christian Schilling and
               Mark Wetzlinger},
  editor    = {Goran Frehse and
               Matthias Althoff},
  title     = {{ARCH-COMP22} Category Report: Continuous and Hybrid Systems with
               Linear Continuous Dynamics},
  booktitle = {{ARCH}},
  series    = {EPiC Series in Computing},
  volume    = {90},
  pages     = {58--85},
  publisher = {EasyChair},
  year      = {2022},
  url       = {https://doi.org/10.29007/mmzc},
  doi       = {10.29007/mmzc}
}

Installation

Note: Running the full benchmark suite should take no more than three hours with a reasonable internet connection.

There are two ways to install and run this RE: either using the Julia script or using the Docker script. In both cases, first clone this repository.

Using the Julia script. First install the Julia compiler by following the official installation instructions. Once you have installed Julia, execute

$ julia startup.jl

to run all the benchmarks.

Using the Docker container. To build the container, you need the program docker; for installation instructions on different platforms and for general background, consult the Docker documentation. Once you have installed Docker, start the measure_all script:

$ ./measure_all
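The measure_all script itself is not reproduced in this README; the following is a minimal sketch of what such a wrapper typically does, under the assumption that the repository ships a Dockerfile and that results are collected into the result folder mentioned below. The RUN=echo line keeps it a dry run; it is not the actual script from the repository.

```shell
#!/bin/sh
# Sketch of a measure_all-style wrapper; the structure is an assumption, not
# the actual script shipped in this repository.
# RUN=echo keeps this a dry run; clear RUN to execute the docker commands.
set -eu
RUN=echo

IMAGE=juliareach          # image tag also used for the interactive run below
OUTDIR="$PWD/result"      # results folder named in the Outputs section

mkdir -p "$OUTDIR"
# Build the image from the repository's Dockerfile (assumed to exist),
# then run the full suite with the results mounted back to the host.
$RUN docker build -t "$IMAGE" .
$RUN docker run --rm -v "$OUTDIR:/result" "$IMAGE"
```

With RUN cleared, the two docker commands execute for real; the exact mount point inside the container is an assumption.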

The Docker container can also be run interactively:

$ docker run -it juliareach bash

$ julia

julia> include("startup.jl")

Outputs

After the benchmark runs have finished, the results are stored in the folder result, and the plots are generated in your working directory.


How the Julia environment was created

julia> ]

(@v1.7) pkg> activate .
  Activating new environment at `.../ARCH2022_AFF/Project.toml`

pkg> add BenchmarkTools
pkg> add ExponentialUtilities
pkg> add FastExpm
pkg> add IntervalArithmetic
pkg> add JLD2
pkg> add LaTeXStrings
pkg> add LazySets
pkg> add MathematicalSystems
pkg> add Plots
pkg> add ReachabilityAnalysis
pkg> add https://github.com/JuliaReach/SpaceExParser.jl#b6647c9
pkg> add Symbolics
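To reproduce this environment non-interactively, Julia's standard Pkg API can be used instead of the pkg> session above. This one-liner is not part of the repository's scripts; it assumes the Project.toml and Manifest.toml produced by the session above are present in the current directory. The command is stored and printed here as a dry run; execute it directly on a machine with Julia installed.

```shell
# --project=. activates the local environment; Pkg.instantiate() installs the
# exact dependency versions recorded in the manifest.
CMD="julia --project=. -e 'using Pkg; Pkg.instantiate()'"
echo "$CMD"
```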
