
Repository for the Synthetic Time series Evaluation Benchmark (STEB), a benchmark for evaluating the reliability and running time of quality measures for synthetic time series.

Code

The code is located in the src folder. The folder structure mirrors the benchmark architecture as depicted in the paper.

Experiments

The experiments folder contains a workspace for each experiment, together with its configuration file. Note that there are two additional experiments not mentioned in the paper.

Documentation

The project documentation is located in the doc folder. To browse the full HTML documentation, open doc/_build/html/index.html in a web browser of your choice.
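As a convenience, the documentation entry point can also be opened from Python's standard library; this is a minimal sketch that assumes it is run from the repository root.

```python
import webbrowser
from pathlib import Path

# Resolve the documentation entry point to an absolute file:// URI.
# Assumption: the current working directory is the repository root.
index = Path("doc/_build/html/index.html").resolve()
webbrowser.open(index.as_uri())
```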

Reproducibility

Further instructions for reproducing the conducted experiments can be found in the dedicated reproducibility section of the documentation.

Tests

Tests for various parts of the benchmark, including the Measures, Embedder, and Transformations components, are located in the tests folder.
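A minimal sketch for running the suite, assuming the tests are written for pytest (an assumption; if the project uses another runner, invoke it on the tests folder instead):

```python
# Run the benchmark's test suite from the repository root.
# Assumption: the tests in the tests folder are pytest-compatible.
import sys
import pytest

# Equivalent to running `pytest tests -v` on the command line.
sys.exit(pytest.main(["tests", "-v"]))
```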

Experimental data

For transparency, we include a MongoDB database dump, experimental_data.zip, which can be loaded into another MongoDB instance for inspection. It contains the records of the conducted experiments and tests as well as the direct experimental results, i.e., the measures' scores and running times.
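After unpacking experimental_data.zip and restoring the dump with MongoDB's mongorestore tool, the records can be inspected, for example, with pymongo. The sketch below makes no assumptions about the database or collection names in the dump; it discovers them and prints one sample document per collection.

```python
from pymongo import MongoClient

# Connect to the local MongoDB instance into which the dump from
# experimental_data.zip was restored (e.g., via mongorestore).
client = MongoClient("mongodb://localhost:27017/")

# The database and collection names depend on the dump, so list them
# rather than hard-coding any particular name.
for db_name in client.list_database_names():
    db = client[db_name]
    for coll_name in db.list_collection_names():
        # Print one sample record per collection for a quick inspection.
        print(db_name, coll_name, db[coll_name].find_one())
```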
