Julia performance monitoring
============================
This directory contains tests and related utilities to monitor Julia's performance over time. The results are presented on http://speed.julialang.org/.
Running the performance tests
-----------------------------
In `test/perf`, run `make`. It will run the `perf.jl` script in all
the sub-directories and display each test name with the minimum,
maximum, mean and standard deviation of the wall time of five repeated
test runs, in microseconds.
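The reported statistics can be sketched as follows (an illustrative computation only, not code taken from `perfutil.jl`; the `summarize` function name is made up):

```julia
# Illustrative sketch of the per-test summary: minimum, maximum, mean
# and sample standard deviation of five wall-time measurements, in
# microseconds. The function name `summarize` is hypothetical.
function summarize(times_us)
    n = length(times_us)
    m = sum(times_us) / n
    s = sqrt(sum((t - m)^2 for t in times_us) / (n - 1))
    return (minimum(times_us), maximum(times_us), m, s)
end

times = [105.0, 98.0, 102.0, 99.0, 101.0]  # five repeated runs, in μs
mn, mx, mean_t, std_t = summarize(times)
```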
`make codespeed` generates the results displayed on
http://speed.julialang.org/ and is probably not what you want.
There is also a `perfcomp.jl` script, but it may not currently work
with the rest of the suite.
Adding tests
------------

First decide whether the new tests should go into one of the existing suites:
- `micro`: A set of micro-benchmarks commonly used to compare
  programming languages; these results are shown on http://julialang.org/.
- `lapack`: Performance tests for linear algebra tasks, from low-level
  operations such as matrix multiplies to higher-level operations like
  eigenvalue problems.
- `cat`: Performance tests for concatenation of vectors and matrices.
- `kernel`: Performance tests used to track real-world code examples
  that previously ran slowly.
- `shootout`: Tracks the performance of tests taken from the Debian
  shootout performance tests.
- `sort`: Performance tests of sorting algorithms.
- `spell`: Performance tests of Peter Norvig's spelling corrector.
- `sparse`: Performance tests of sparse matrix operations.
Otherwise, add a new subdirectory containing a `perf.jl` file, and
update the `Makefile` as well.
In `perf.jl`, `include("../perfutil.jl")` and then run the
performance test functions with the `@timeit` macro. For example:
```julia
@timeit(spelltest(tests1), "spell", "Peter Norvig's spell corrector")
```
The arguments are: the test function call, the name of the test, a
description, and, optionally, a group (only used for codespeed).
`@timeit` does a warm-up run and then five timings, calculating the
minimum, maximum, mean and standard deviation of the timings.
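In spirit, the warm-up-plus-five-timings scheme resembles this sketch (a simplified stand-in, not the actual `perfutil.jl` implementation; `time_five` is a hypothetical helper name):

```julia
# Simplified stand-in for a warm-up run followed by five timed runs.
# Not the actual @timeit implementation; `time_five` is hypothetical.
function time_five(f)
    f()                       # warm-up run (triggers compilation)
    times = Float64[]
    for _ in 1:5
        t0 = time_ns()
        f()
        push!(times, (time_ns() - t0) / 1e3)  # wall time in microseconds
    end
    m = sum(times) / 5
    s = sqrt(sum((t - m)^2 for t in times) / 4)
    return (minimum(times), maximum(times), m, s)
end
```

Timing `time_ns()` around the call and discarding the first run keeps Julia's JIT compilation cost out of the reported numbers.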
If possible, aim for the tests to take about 10-100 microseconds.
Using the framework for your own tests
--------------------------------------
Include `perfutil.jl` and use `@timeit` on the functions to be
benchmarked. Alternatively, have a look at the