libpython: Add helper library for benchmarking #1670
Conversation
@aaronsms I would be interested in what you think (I can't add you as a reviewer directly).
This is addressing some of the needs from @aaronsms's GSoC 2021, but see also [GRASS-dev] measuring cpu time for each module and estimating modules scalability which was not taken into account at this point.
I think it would be more useful to plot them all in the same plot just like
Agreed. Done.
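Plotting all result series on one shared axes could look like the following sketch. This is a hypothetical illustration, not the PR's actual API: the function names (`plot_all`, `series_labels`) and the result structure (a dict with `x`, `times`, and `label`) are assumptions made for the example. Matplotlib is imported inside the plotting function, matching the deferred-dependency idea discussed in this PR.

```python
def series_labels(results):
    # Labels in plotting order, used for the shared legend.
    return [result["label"] for result in results]


def plot_all(results, filename=None):
    # Matplotlib is imported only when plotting is actually requested,
    # so the non-plotting helpers work without it installed.
    import matplotlib.pyplot as plt

    fig, axes = plt.subplots()
    # Every series goes on the same axes so runs are directly comparable.
    for result in results:
        axes.plot(result["x"], result["times"], label=result["label"])
    axes.set_xlabel("Number of cells")
    axes.set_ylabel("Time [s]")
    axes.legend()
    if filename:
        fig.savefig(filename)
    return fig
```

Putting everything on one axes makes relative performance of different runs visible at a glance, at the cost of a busier figure when many series are compared.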
This now has documentation and a test. Additional benchmarking, outputs, and plots are left for future PRs.
Force-pushed from 9da78eb to 4858c3c
An experimental collection of simple functions to help benchmarking and reduce code duplication between benchmarks. The current design ideas are: be minimalist, pragmatic, with no API promises. The tools are meant for developers tracking the latest development version. Co-authored-by: Aaron Saw Min Sern <aaronsms>
Force-pushed from 4858c3c to 9ea6bc8
I'm not sure if the test will run in the CI, but I did two rounds of dealing with the dependencies, so hopefully it will. I can't confirm that now because addon installation is failing, which fails the tests in CI. I merged this anyway so it is available for @aaronsms and anybody who will be testing his work.
An experimental collection of simple functions to help benchmarking and reduce code duplication between benchmarks. The design ideas are: be minimalist, pragmatic with no API promises, but provide convenient functions for writing benchmarking scripts. The functions are meant for developers tracking the latest development version. The plotting functions can be imported with missing dependencies for convenience, but running them requires Matplotlib. Co-authored-by: Aaron Saw Min Sern <aaronsms@u.nus.edu>
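A minimal timing helper in the spirit of the description above could look like this. The function name, signature, and result attributes here are illustrative assumptions, not the library's actual API; the sketch only shows the core idea of running a callable repeatedly and collecting wall-clock timings so benchmark scripts don't duplicate this loop.

```python
import time
from types import SimpleNamespace


def benchmark(function, label, repeat=5):
    # Run the callable several times and keep each wall-clock timing.
    times = []
    for _ in range(repeat):
        start = time.perf_counter()
        function()
        times.append(time.perf_counter() - start)
    # Report the average; individual timings stay available for plotting.
    return SimpleNamespace(label=label, all_times=times, time=sum(times) / repeat)
```

Returning a simple result object (rather than a bare number) keeps the label and the raw timings together, which is convenient when passing many results to a plotting function later.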
An experimental collection of simple functions to help benchmarking
and reduce code duplication between benchmarks.
The current design ideas are: be minimalist, pragmatic, with no API promises.
The tools are meant for developers tracking the latest development version.
Co-authored-by: Aaron Saw Min Sern (thread/nprocs benchmarking function)
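The thread/nprocs benchmarking function credited above could be sketched roughly as follows. This is a hedged illustration under assumed names and conventions (the callable accepts an `nprocs` keyword, and the best of several repeats is kept to reduce noise); the actual function in the library may differ.

```python
import time


def benchmark_nprocs(function, label, max_nprocs, repeat=3):
    # Time the callable for each thread count from 1 to max_nprocs.
    times = {}
    for nprocs in range(1, max_nprocs + 1):
        runs = []
        for _ in range(repeat):
            start = time.perf_counter()
            function(nprocs=nprocs)
            runs.append(time.perf_counter() - start)
        # Keep the fastest run: the minimum is the least noisy estimate
        # of the achievable time for a given thread count.
        times[nprocs] = min(runs)
    return {"label": label, "times": times}
```

Plotting the resulting times against the thread count shows how well a module scales, which ties back to the scalability discussion referenced earlier in this PR.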