RFC: enhanced benchmark discovery #481

Open
philpep opened this issue Oct 28, 2016 · 3 comments
Labels
idea Low-priority enhancement suggestion

Comments

philpep (Contributor) commented Oct 28, 2016

Currently, benchmark discovery (and categorization in the web UI) is done by introspecting the Python files in benchmark_dir and looking at the __name__ of functions and classes and at the module name relative to benchmark_dir.

This makes it hard to generate benchmark functions programmatically, because we have to take care of __name__, create a module for each category, and keep the entire benchmark code in a specific directory. Also, changing a function name or moving it to another module requires updating all benchmark results.
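For context, a minimal sketch of the current convention (the file name and benchmark below are only an example):

    # benchmarks/strings.py -- an example file under benchmark_dir
    class TimeJoin:
        def setup(self):
            self.parts = ["x"] * 1000

        def time_join(self):
            # asv records this as "strings.TimeJoin.time_join", derived from the
            # module path and the __name__ attributes, so renaming or moving the
            # function changes the benchmark name and orphans the old results.
            "".join(self.parts)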

I wonder if we could use the same logic as https://github.com/celery/celery uses for discovering tasks, e.g.:

  • a benchmark_load config option could be a list of importable modules
  • auto-register functions as <import_name_relative_to_pythonpath>.module.Class.function
  • support named benchmarks (e.g. by adding a name attribute on the function; see the sketch after this list)
  • categorization is handled with dots in the benchmark name, as in the current behavior.
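A rough sketch of the named-benchmark idea (the benchmark_name attribute and its exact semantics are only a proposal at this point):

    # A module-level function whose full, dot-separated name (and thus its
    # category in the web UI) would come from an attribute rather than from the
    # module/class/function __name__.
    def run_query():
        ...

    run_query.benchmark_name = "db.queries.time_run_query"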

What do you think?

After reviewing my list, I think just adding named benchmarks is a good start (and backward compatible); I'll try to write a PR for this.

philpep added a commit to philpep/asv that referenced this issue Oct 28, 2016
This version is less optimized than the previous one, but more readable and
will help implement custom named benchmarks.

Related to airspeed-velocity#481
philpep added a commit to philpep/asv that referenced this issue Oct 28, 2016
  * move the benchmark discovery and instantiation related code into
    disc_benchmark()
  * rename from_name() to get_benchmark_from_name(), outside of the
    Benchmark class
  * use disc_benchmark() in get_benchmark_from_name(); a less optimized
    version, but more readable and no longer reliant on the benchmark name.

Related to airspeed-velocity#481
philpep added a commit to philpep/asv that referenced this issue Oct 28, 2016
This allows controlling the full name of a benchmark function

Closes airspeed-velocity#481
philpep added a commit to philpep/asv that referenced this issue Oct 28, 2016
This allows controlling the full name of a benchmark function

Related to airspeed-velocity#481
philpep added a commit to philpep/asv that referenced this issue Oct 29, 2016
This allows controlling the full name of a benchmark function

Related to airspeed-velocity#481
philpep added a commit to philpep/asv that referenced this issue Oct 29, 2016
This allows controlling the full name of a benchmark function

Related to airspeed-velocity#481
pv (Collaborator) commented Nov 6, 2016

I would like to keep it simple here, and not offer customization options for benchmark discovery unless the use cases are clearly necessary. I think it's fine for asv to have an opinion on how benchmark suites should be structured.

A benchmark_name attribute is perhaps motivated, as the name encodes not only the benchmark but also the benchmark results, which may be useful to keep for backward compatibility. A question, however, is whether the presence of this attribute should be taken to mean that the object in question is a benchmark. Probably yes, so the attribute should be benchmark_name rather than name, which is likely too ambiguous.

I think at least Nose had a feature like this. I don't recall exactly how pytest handles this.
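A hypothetical sketch of that rule (the real discovery code may differ in detail): treat any callable carrying a benchmark_name attribute as a benchmark, in addition to the usual name-prefix checks.

    # Simplified illustration only, not asv's actual implementation.
    def looks_like_benchmark(obj):
        prefixes = ("time_", "mem_", "peakmem_", "track_")
        return callable(obj) and (
            hasattr(obj, "benchmark_name")
            or getattr(obj, "__name__", "").startswith(prefixes)
        )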

pv added the enhancement (Triaged as an enhancement request) label Nov 6, 2016
philpep added a commit to philpep/asv that referenced this issue Dec 16, 2016
  * move the benchmark discovery and instantiation related code into
    disc_benchmark()
  * rename from_name() to get_benchmark_from_name(), outside of the
    Benchmark class
  * use disc_benchmark() in get_benchmark_from_name(); a less optimized
    version, but more readable and no longer reliant on the benchmark name.

Related to airspeed-velocity#481
philpep added a commit to philpep/asv that referenced this issue Dec 16, 2016
This allows controlling the full name of a benchmark function

Related to airspeed-velocity#481
philpep added a commit to philpep/asv that referenced this issue Dec 16, 2016
This allows controlling the full name of a benchmark function

Related to airspeed-velocity#481
philpep added a commit to philpep/asv that referenced this issue Dec 16, 2016
This allows controlling the full name of a benchmark function

Related to airspeed-velocity#481
anntzer (Contributor) commented Jul 8, 2017

It would be great if asv could rely on pytest to perform benchmark discovery/collection, as this would e.g. also allow using pytest parametrization and fixtures, as pytest-benchmark (http://pytest-benchmark.readthedocs.io/en/stable/usage.html) does. See e.g. https://github.com/anntzer/mpl_cairo/blob/master/tests/test_speed.py for a case where parametrization makes sense.

(While I think pytest-benchmark is much better at benchmark collection, its reporting/postprocessing facilities are nowhere near as good as asv's.)
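For illustration, a minimal parametrized example in the style of pytest-benchmark's documented fixture (the test and data here are made up):

    import pytest

    # Each parameter value becomes a separate benchmark entry; the "benchmark"
    # fixture times the callable it is given.
    @pytest.mark.parametrize("n", [10, 100, 1000])
    def test_join(benchmark, n):
        parts = ["x"] * n
        benchmark("".join, parts)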

pv (Collaborator) commented Sep 20, 2018

Current situation, in case someone stumbles upon this issue: https://asv.readthedocs.io/en/stable/benchmarks.html#general
benchmark_name has been added, which is probably sufficient for most benchmark autogeneration, with a suitable amount of work.

pytest-benchmark integration is #567; I'm not planning to work on implementing/maintaining it at the moment, however, and whether its runtime model is compatible is unclear.
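As an illustration of such autogeneration, a hedged sketch using the benchmark_name attribute (the factory and names below are made up; see the linked docs for the exact discovery rules):

    # Generate one timing benchmark per payload size, giving each a stable,
    # explicit name instead of relying on the generated
    # "<module>.<class>.<function>" name.
    def _make_bench(n):
        def time_build():
            list(range(n))
        time_build.benchmark_name = "generated.time_build_%d" % n
        return time_build

    for _n in (10, 100, 1000):
        globals()["time_build_%d" % _n] = _make_bench(_n)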

pv added the idea (Low-priority enhancement suggestion) label Jun 1, 2019
pv removed the enhancement (Triaged as an enhancement request) label Jun 30, 2019