A short conference paper on benchmarking
# Robust benchmarking in noisy environments

A paper by Jiahao Chen and Jarrett Revels (Julia Labs, MIT CSAIL), to be published in the Proceedings of the 20th Annual IEEE High Performance Extreme Computing Conference (HPEC 2016).


## Abstract

We propose a benchmarking strategy that is robust in the presence of timer error, OS jitter, and other environmental fluctuations, and that is insensitive to the highly nonideal statistics produced by timing measurements. We construct a model that explains how these strongly nonideal statistics can arise from environmental fluctuations, and that also justifies our proposed strategy. We implement this strategy in the BenchmarkTools Julia package, where it is used in production continuous integration (CI) pipelines for developing the Julia language and its ecosystem.

## Code and data

The main benchmarking code is available from the BenchmarkTools Julia package, v0.0.3. The specific code used to run these experiments, together with the data generated on our test machine, is available in the experiments directory of this repository.
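As a minimal usage sketch of the package (assuming BenchmarkTools is installed; the exact API shown here is from public releases and may differ slightly in v0.0.3):

```julia
using BenchmarkTools

# Collect many timing samples of the expression; BenchmarkTools
# automatically tunes the number of evaluations per sample.
# The $ interpolation avoids benchmarking global-variable access.
trial = @benchmark sum($(rand(1000)))

# The minimum sample time is the location estimator the paper argues
# is robust: additive noise (OS jitter, timer error) can only inflate
# timings, so the minimum is least contaminated by it.
minimum(trial)
```

Comparing `minimum(trial)` with `mean(trial)` on a loaded machine illustrates the point of the paper: the mean drifts upward with background noise while the minimum stays comparatively stable.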