Labelling specific commits #191
Comments
Such a feature doesn't currently exist, but it seems like it would be very useful. asv does support showing git tags on the graphs, and you could probably experiment with that mechanism from the outside. When you do an `asv publish`, the tag annotations end up in the generated data files as pairs: the first element is a string, the second is a JavaScript time stamp of the associated commit. You could add comments in the string part, as long as you know the date/time stamp, and they should appear in the graph. Of course, all of that would need to be wrapped up in something more convenient that took pairs of commits and comments from the user.

One wrinkle is that you may not want these comments to appear for all benchmarks. Certain refactoring changes may only impact certain parts of your library/application. That would require a mapping from benchmark names to these annotations as well, and then some more smarts on the JavaScript side about when to show certain annotations. That's a lot more work than the approach I describe above, but not impossible by any means.
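A minimal sketch of the "from outside" approach described above. The file layout and exact structure of asv's published data are assumptions here (check the generated output of `asv publish` before relying on this); the idea is simply to merge user-supplied `[comment, timestamp]` pairs into the same list of pairs that drives the tag markers:

```python
import json


def add_annotations(tag_pairs, annotations):
    """Merge user annotations into a list of [label, js_timestamp_ms] pairs.

    tag_pairs   -- existing [label, js_timestamp_ms] pairs (as assumed to
                   appear in the published data files)
    annotations -- mapping of js_timestamp_ms -> explanatory comment
    """
    merged = [list(pair) for pair in tag_pairs]
    for ts, comment in annotations.items():
        merged.append([comment, ts])
    # Keep the pairs in chronological order, as the graph presumably expects.
    merged.sort(key=lambda pair: pair[1])
    return merged


# Hypothetical example: one existing tag plus two investigation notes.
tags = [["v1.2", 1400000000000]]
notes = {
    1400100000000: "Added new optimized component XX",
    1400200000000: "Rewrote C component in Python using numpy",
}
print(json.dumps(add_annotations(tags, notes), indent=2))
```

A wrapper script could load the published JSON, apply this merge, and write it back after each `asv publish` run, since regenerating the site would otherwise overwrite the hand-added annotations.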
Ok, thanks. That sounds promising. Yes, the git tag or even the git commit message shows some information, but not necessarily what will help us understand performance changes retroactively, so your proposal of generating such a mapping specifically for annotations seems appropriate. I too thought of the issue of multiple benchmarks for changes that affect only one of them, but that level of specificity seems like a feature that could safely be left for the future. The worst that would happen with a simple benchmark-global annotation is that we'd get an irrelevant annotation on some point of the graph where the performance didn't actually change for that particular benchmark. I think it would be easy enough to ignore such annotations (since no one will be closely investigating parts of a benchmark graph where nothing interesting is happening!), so it's probably not worth the effort to refactor the tags support at both ends as you describe.
asv looks great, and we're currently considering using it for performance monitoring for our simulator instead of our own handcrafted and long-broken scripts. From what I can see, the only feature our scripts had that I don't see in asv is the ability to label specific commits with an explanation of what happened with them. We found that when we investigated the commit history, there were a few commits that caused major changes in performance (whether for good or for bad), and once we had completed our investigation we labelled those points on the performance charts, with a key that explains what happened. (E.g. "Added new optimized component XX", "Switched to slower but more general component XX", "Removed potentially unsafe optimization in XX.x", "Rewrote C component in Python using numpy"). Adding these explanations made it clear that we had understood what happened on those specific commits, so that when we came back to look later we didn't start all over again wondering what the big performance drop (or jump) was from. Does such a facility exist in asv? If not, would it be difficult to add? We certainly found it very useful, and appear to have a similar use case to yours...