Revised/additional summary page #437
Conversation
This looks awesome, thanks!!

About "what does "recently" mean? 1 month, 10 revisions, ...?": looking at the latest step (regression or improvement) could be enough, and it seems that is what you're currently doing.

Now that we have this "improvement" information, we could provide a "per commit" view that shows all regressions and improvements a particular commit is involved in. This would help our use case (Mercurial revsets), where a single change can affect multiple benchmarks for better or worse; we may want to accept a regression on rarely used revsets in exchange for an improvement on heavily used ones.

This new page does overlap with the regressions page, though. Maybe we could merge them into a single page and add selectors to filter the displayed data.

I really like the colors and the percentages here. What does the data after the ± represent?
Yes, I think it is relatively straightforward to add different views of the data, and the per-commit one could also be useful. The difficult part is deciding what to present and how. (I'm a bit time-constrained currently, so don't let this WIP PR block you if you're thinking of something, since it may take a while to finish.)

I don't see the overlap with the regressions page as necessarily a problem in itself, but it can be useful to import the ideas that can be imported. Potentially for the present page: (added to the top.)

The ± shows the noise level of the value (= average deviation from the median in the last piecewise step). Maybe it should be multiplied by 2 to be more intuitive...

Added, as inspired by speed.pypy.org.
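The "average deviation from the median in the last piecewise step" mentioned above can be sketched as follows. This is only an illustration of the described formula, not asv's actual code; the function name and the assumption that the caller passes exactly the values belonging to the last piecewise-constant step are mine.

```python
import statistics

def noise_level(step_values):
    """Average absolute deviation from the median of the values in the
    most recent piecewise-constant step of the benchmark time series.
    (Sketch: assumes step_values already contains only that segment.)"""
    med = statistics.median(step_values)
    return sum(abs(v - med) for v in step_values) / len(step_values)

# Example: a flat step with a little jitter around 10.0
print(noise_level([9.8, 10.0, 10.1, 10.2, 9.9]))  # small value near 0.12
```

Doubling this value, as suggested above, would bring it closer to a conventional "plus or minus" spread around the reported number.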
Force-pushed from b4919d9 to d03e8d7
Force-pushed from 983f8d0 to 3343255
I think this is done now; further additions can be made later.
Move navigation links to the top bar. Make regressions page appear full width.
If the window is too small, the Bootstrap top bar overlaps the regression list and the popup plot doesn't appear.
The cached test data is cleared when the asv version number changes, or manually via py.test --cache-clear.
The tests assume regressions occur only in specific benchmarks, but results of e.g. time_* benchmarks may vary between runs, so only deterministic benchmarks are included.
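A minimal sketch of what "include only deterministic benchmarks" could look like as a name filter. The prefix list and helper name are assumptions for illustration; they are not asv's actual test code.

```python
# Hypothetical filter: keep benchmarks whose results are reproducible
# across runs, so the regression tests see stable step changes.
# The prefix convention below is an assumption, not asv's actual rule.
NONDETERMINISTIC_PREFIXES = ("time_", "timeraw_", "peakmem_")

def is_deterministic(benchmark_name):
    return not benchmark_name.startswith(NONDETERMINISTIC_PREFIXES)

names = ["time_quicksort", "params_examples.track_value", "mem_list"]
print([n for n in names if is_deterministic(n)])
```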
Merging --- after dogfooding for a while, it seems to work OK.
Add a second summary page, showing a list of benchmarks (instead of a grid).
The list shows benchmarks only for one choice of environment parameters (machine, cpu, packages, ...) at once.
Also show how the benchmark performance has evolved "recently", with some ideas borrowed from http://speed.pypy.org/changes/
This overlaps somewhat with the Regressions display, except that the regressions page shows regressions across all branches and does not show improved results.
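Per the discussion above, "recently" is taken to mean the most recent step change, classified as a regression or an improvement. A rough sketch of that classification, assuming step means are already available (the function name and sign convention are mine, not asv's API):

```python
def latest_change(step_means):
    """Given the mean values of consecutive piecewise-constant steps
    (oldest first), report the most recent relative change.
    Positive = the benchmark got slower (regression);
    negative = it got faster (improvement).
    Hypothetical helper for illustration only."""
    if len(step_means) < 2:
        return None  # no step change observed yet
    prev, last = step_means[-2], step_means[-1]
    return (last - prev) / prev

print(latest_change([1.0, 1.0, 1.5]))  # 0.5, i.e. a 50% regression
```

The summary list can then color and sort benchmarks by this value, which is also the natural input for a per-commit view.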
Demo: https://pv.github.io/numpy-bench/#summarylist
https://pv.github.io/scipy-bench/#summarylist