Criterion benchmarks are incompatible with critcmp #4037

Closed
brson opened this issue Jan 8, 2019 · 4 comments · Fixed by #4169
Assignees: brson
Labels: component/build-time, component/test-bench, help wanted, status/mentor

Comments

@brson
Contributor

brson commented Jan 8, 2019

Bug Report

What version of Rust are you using?

nightly-2018-07-18-x86_64-unknown-linux-gnu
nightly-2018-12-06-x86_64-unknown-linux-gnu

What operating system and CPU are you using?

EC2 t2.xlarge, Ubuntu.

What did you do?

I am trying to automatically generate benchmark comparisons of all TiKV benchmarks. The cargo-benchcmp tool works with the standard libtest benchmark harness, but critcmp does not work on our Criterion benchmarks.

What did you expect to see?

Normal critcmp output comparing criterion 'baselines'
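That is, something like the baseline workflow documented by critcmp (the baseline names here are illustrative):

cargo bench -- --save-baseline before
# ...apply changes...
cargo bench -- --save-baseline after
critcmp before after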

What did you see instead?

Two of the three Criterion benchmark crates don't present the standard Criterion CLI, so they ignore the necessary --save-baseline flag.

In hierarchy, the only benchmark crate that does present the Criterion CLI, there is an error:

~/tikv$ critcmp --list
/home/ubuntu/tikv/target/criterion/table_scan_datum_absent_large_row/new: benchmark.json: No such file or directory (os error 2)

The solution to this is twofold:

  1. Upgrade criterion from 0.2.4 to 0.2.5/0.2.7 (0.2.5 is the version the critcmp documentation claims to support).
  2. Reorganize the two crates so that their main function is generated with criterion_main!, as sketched below.
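A minimal sketch of step 2, assuming Criterion 0.2's macros (the benchmark function is a stand-in, not one of the real TiKV benchmarks):

use criterion::{criterion_group, criterion_main, Criterion};

// Stand-in benchmark; the real crates would register their existing
// benchmark functions here instead.
fn bench_example(c: &mut Criterion) {
    c.bench_function("example", |b| b.iter(|| 2 + 2));
}

// criterion_group!/criterion_main! generate a `main` that parses the
// standard Criterion CLI flags, including `--save-baseline`.
criterion_group!(benches, bench_example);
criterion_main!(benches);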
@brson brson self-assigned this Jan 8, 2019
@breezewish
Member

breezewish commented Jan 9, 2019

I guess a simple alternative is to write

Criterion::default().configure_from_args().sample_size(10);

instead of

Criterion::default().sample_size(10);

In this way, we get both advantages: a low sample size that is fast to run, and support for saving baselines!
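A sketch of how that suggestion could slot into a benchmark crate, assuming Criterion 0.2's criterion_group! config form (the function and benchmark names are illustrative):

use criterion::{criterion_group, criterion_main, Criterion};

// Illustrative config: keep the reduced sample size, but still parse
// the standard CLI flags such as `--save-baseline`.
fn config() -> Criterion {
    Criterion::default().configure_from_args().sample_size(10)
}

fn bench_example(c: &mut Criterion) {
    c.bench_function("example", |b| b.iter(|| 2 + 2));
}

criterion_group! {
    name = benches;
    config = config();
    targets = bench_example
}
criterion_main!(benches);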

@nolouch nolouch added the component/test-bench Component: Unit tests, Integration tests, CI, Benchmarks, etc. label Jan 9, 2019
@brson
Contributor Author

brson commented Jan 11, 2019

Ah yeah that will probably work.

@brson
Contributor Author

brson commented Jan 29, 2019

I may not get to this any time soon, but I could help mentor.

@brson brson added help wanted Denotes an issue that needs help from a contributor. Must meet "help wanted" guidelines. status/mentor This issue is currently mentored component/build-time Component: Compilation time labels Jan 29, 2019
@brson
Contributor Author

brson commented Jan 31, 2019

I'm throwing this into the 'compile time' project because I want to be able to benchmark thoroughly while messing with the build.
