
Comparative Benchmarking #478

Open
MSeifert04 opened this issue Oct 23, 2016 · 7 comments
Labels: enhancement (Triaged as an enhancement request)

Comments

MSeifert04 commented Oct 23, 2016

I'm currently working on a benchmarking project where I want to benchmark different implementations of some function. It's not really hard to write the benchmarks, but it's not so easy (impossible?) to display them as comparative benchmarks in one graph.

I know that it's possible to parametrize the benchmarks, but the parameters get weird names (they use the full name of the function, and if the function is defined in the benchmark file itself, it's not displayed at all). Also, different implementations of the same functionality might differ (argument names, argument order, ...) in a way that makes it really ugly to parametrize the test.

Would it be possible to either:

  • include a section in the documentation on how one could do comparative benchmarks (if it's already possible), or
  • enhance the functionality of asv to support this kind of comparative benchmark?

MSeifert04 commented Oct 23, 2016

Sorry for closing and reopening this.

It is possible to display comparative benchmarks with a combination of params, param_names, and then choosing the appropriate "x-axis" value in the graph.
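
Roughly something like this (a minimal sketch; the SciPy median filters below are just stand-ins for the implementations I actually compare):

```python
import numpy as np
from scipy.ndimage import median_filter
from scipy.signal import medfilt2d


class CompareMedianFilters:
    # One parameter axis selects the implementation, the other the input
    # size; picking "implementation" as the x-axis in the web UI shows the
    # competing functions side by side in one graph.
    params = [["medfilt2d", "median_filter"], [32, 128]]
    param_names = ["implementation", "size"]

    def setup(self, implementation, size):
        self.data = np.random.rand(size, size)

    def time_median_filter(self, implementation, size):
        if implementation == "medfilt2d":
            medfilt2d(self.data, 3)
        else:
            median_filter(self.data, size=3)
```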

However, this suffers (at least in version 0.2) from several drawbacks:

  • If I set the "x-axis" to any parameter, I can't use the "plot_settings" (every option there makes the bars disappear).

  • If I have more complicated benchmarks, I need very extensive machinery to make sure the order of the arguments is right for each function being compared.

Besides fixing the first point (or did I do something wrong?), would it be possible to have a specialized class for comparative benchmarks? Such a class would be displayed like a parametrized benchmark, but its methods would serve as the parametrization instead of parameter values.

I don't know if that's within the scope of asv; if not, feel free to ignore the enhancement request.

pv commented Oct 24, 2016

The "plotting options" thing is a bug (probably best discussed in a separate issue ticket).

I'm not completely sure from the above description what the problem is with using parameterized benchmarks for what you are trying to do. Could you post some example code on gist.github.com of what you are currently doing and what you'd like to do, to clarify this?

If you are trying to have each method in a benchmark class define a parameter value and an associated benchmark, I think that can be achieved generically with a class decorator or a metaclass. I'm not sure it would be a good idea to include such a wrapper in asv itself, as this introduces multiple ways to declare the same thing.
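
For example, a rough sketch of such a class decorator (untested, purely illustrative, and not something asv provides) could be:

```python
import numpy as np
from scipy.ndimage import median_filter
from scipy.signal import medfilt2d


def comparative(cls):
    # Hypothetical helper (not part of asv): collect every ``impl_*`` method
    # and expose the set as a single parameterized ``time_`` benchmark.
    impls = {name[len("impl_"):]: func
             for name, func in vars(cls).items()
             if name.startswith("impl_") and callable(func)}

    def time_implementations(self, implementation):
        impls[implementation](self)

    cls.params = sorted(impls)
    cls.param_names = ["implementation"]
    cls.time_implementations = time_implementations
    return cls


@comparative
class MedianFilters:
    def setup(self, implementation):
        self.data = np.random.rand(128, 128)

    def impl_medfilt2d(self):
        medfilt2d(self.data, 3)

    def impl_median_filter(self):
        median_filter(self.data, size=3)
```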

@MSeifert04

@pv I haven't used class decorators/metaclasses to that extent yet, so I don't know how I would go about implementing this.

But currently I have some implementation like this: https://gist.github.com/MSeifert04/d2d4013093362c9e71c60b16ef53e355.

For example, I have different functions, say scipy.signal.medfilt, scipy.ndimage.medfilt2d, scipy.ndimage.filters.median_filter, and another implementation from another package. The function signatures differ slightly, so I either have to put a lot of if ... elif ... else ... branches inside each test or map the different calls to different arguments (like in the gist).

I hope it's not too much of a mess, and I also think this is not really general enough to be useful for lots of people. 😄
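
To illustrate the mapping idea (simplified compared to the gist, with the SciPy functions only standing in for the real candidates):

```python
import numpy as np
from scipy.ndimage import median_filter
from scipy.signal import medfilt, medfilt2d

# Thin wrappers give every candidate the same (data, width) signature, so the
# benchmark body needs no if/elif chains.
IMPLEMENTATIONS = {
    "signal.medfilt": lambda data, width: medfilt(data, kernel_size=width),
    "signal.medfilt2d": lambda data, width: medfilt2d(data, kernel_size=width),
    "ndimage.median_filter": lambda data, width: median_filter(data, size=width),
}


class MedianFilterSuite:
    params = sorted(IMPLEMENTATIONS)
    param_names = ["implementation"]

    def setup(self, implementation):
        self.data = np.random.rand(128, 128)
        self.func = IMPLEMENTATIONS[implementation]

    def time_filter(self, implementation):
        self.func(self.data, 3)
```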

pv added the enhancement label on Dec 11, 2016