Detect performance regressions #75

Closed

wilzbach opened this issue Apr 11, 2016 · 2 comments

@wilzbach (Member) commented Apr 11, 2016

Performance is crucial, so we should have a way to detect regressions and evaluate improvements.

Here is an idea that I have:

  • We add some more complicated "performance" unittests and exclude them from compilation by default.
  • They would probably need a special API or mixin to report a unique name and their runtime (see the sketch after this list).
  • On a PR, the CI checks out both the new version and the master branch, runs the "performance" tests for both several times in random order, then calculates the average for every test (that's why we need a unique name for mapping) and the difference between master and the PR / feature branch.
  • Some variance due to differing machine load has to be tolerated and shouldn't be reported.
  • The CI could complain via a git bot (like coverage), email, or the CI status icon.
  • Maybe we then want to use a different CI, so that it's just additional info and doesn't block Travis.
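A minimal sketch of what the mixin-based registration could look like in D. Everything here is hypothetical: `registerPerfTest`, the `PerfTest` version flag, the `perfTests` registry, and the `iota.sum` benchmark are made up for illustration, and the `"name mean-msecs"` output format is just one a CI diff script could parse.

```d
// perftest.d -- hypothetical sketch; build with `dmd -version=PerfTest perftest.d`
// so these benchmarks stay out of normal compilation.
version (PerfTest)
{
    import core.time : MonoTime;
    import std.random : randomShuffle;
    import std.stdio : writefln;

    alias BenchFn = void function();

    // Registry mapping each unique test name to its benchmark function.
    BenchFn[string] perfTests;

    // Mixin that registers a benchmark under a unique name at startup.
    mixin template registerPerfTest(string name, alias fn)
    {
        shared static this()
        {
            perfTests[name] = &fn;
        }
    }

    // Example "performance unittest": sum a million numbers.
    void runSum()
    {
        import std.algorithm.iteration : sum;
        import std.range : iota;
        auto s = iota(1_000_000L).sum;
        assert(s > 0); // keep the result live so the work isn't optimized away
    }

    mixin registerPerfTest!("iota.sum", runSum);

    void main()
    {
        enum runs = 10;
        auto names = perfTests.keys;
        names.randomShuffle(); // random order, as suggested above

        foreach (name; names)
        {
            auto start = MonoTime.currTime;
            foreach (_; 0 .. runs)
                perfTests[name]();
            auto elapsed = MonoTime.currTime - start;

            // One line per test: "<name> <mean runtime in ms>". A CI script
            // would run this binary for both master and the PR branch and
            // flag only tests whose slowdown exceeds some noise tolerance
            // (say 10%), to absorb variance from machine load.
            writefln("%s %.3f", name, elapsed.total!"usecs" / 1000.0 / runs);
        }
    }
}
```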

Btw, this topic also comes up often in Phobos, but afaik it currently always depends on manual benchmarking:

std.algorithm.sort performance: dlang/phobos#3922
std.regex JIT compiling: dlang/phobos#4120
Faster pairwise summation: dlang/phobos#4069
Faster topN: dlang/phobos#3934

9il added the Benchmarks label Apr 11, 2016
@wilzbach (Member Author) commented

@9il I think getting a basic solution to this problem could be quite useful for you when you benchmark your BLAS work. Can you think of a simpler solution?

@9il (Member) commented Sep 9, 2018

wontfix for now

9il closed this as completed Sep 9, 2018