Add --subtract option #265
Comments
Thank you for the feedback.
Interesting idea!
Sounds like a great idea to me. The only issue I see is that we would have to deal with negative times in some way or another. This would be quite difficult to communicate to the user, I guess. Setting negative times to 0 could be an option, but that also doesn't feel quite right if we do not print a warning or similar.
Oh, right, that can result in negative times; I hadn't thought of that. Maybe adding a note about this to the man page/--help output is enough? And then printing a warning if the init times are too large (and vary too much) relative to the benchmarked command.
Yeah, I think that could work. I'm inclined to accept this as a new feature request if it can be cleanly implemented within the current codebase (i.e. without requiring very large scale changes to the program logic).
I've worked around this by creating a custom "shell" (which is just a shell script, but can be used as a replacement for the shell that hyperfine spawns, via its --shell option).
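For reference, such a wrapper might look roughly like this (a sketch, assuming hyperfine invokes the shell as `<shell> -c '<command>'` and subtracts the shell's own startup cost via its spawn-time calibration; the file name is made up, and `sleep` stands in for the expensive setup):

```shell
# browser-shell.sh (hypothetical): a custom "shell" whose fixed startup cost
# gets measured and subtracted by hyperfine's shell-spawn calibration.
cat > browser-shell.sh <<'EOF'
#!/bin/sh
sleep 0.2            # stand-in for expensive fixed setup, e.g. headless chromium startup
shift                # drop the leading -c that hyperfine passes
exec /bin/sh -c "$1" # run the actual benchmarked command
EOF
chmod +x browser-shell.sh

# It would then be passed to hyperfine via --shell, e.g.:
#   hyperfine --shell ./browser-shell.sh 'chromium --headless https://example.com'
./browser-shell.sh -c 'echo ok'
```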
This has been suggested as a possible solution in #170, and would also sort of solve #39. My use case for this is that I want to benchmark page loads, for which I'm using headless chromium, but I want to subtract the time that chromium takes to start up. Here's a small example of what I mean; this is how it looks today:
I can do the math myself for now, but adding this to hyperfine would be really cool.
As for how I think this could be implemented: currently, the shell spawn time is measured and removed (#15), and a `--subtract` option could just be a more generic version of that, with an empty command being the default and serving as the replacement for the shell spawn time.

How the subtraction is calculated into the result can just be taken from the shell spawn time subtraction for now, but IMO that should change: right now, you just subtract the mean shell spawn time from every single execution time, but wouldn't it make more sense to measure all the execution times and then calculate the shell spawn times into mean, min and max? The shell spawn time shouldn't vary enough for this to actually be a problem, but it feels a bit off, and when things like chromium startup are measured, it starts to make a small difference.
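A side note on that last point: subtracting the *mean* spawn time from every sample shifts mean, min, and max by the same constant, so the proposal only produces different numbers if the spread of the spawn times is propagated as well. A small sketch of the current behaviour with made-up numbers (awk used just for the arithmetic):

```shell
# Three made-up execution times (s) and a mean spawn time of 0.20 s.
times="1.20 1.35 1.50"
spawn_mean=0.20

# Current behaviour: subtract the mean spawn time from each sample, then
# aggregate. Every statistic simply shifts down by the same 0.20 s.
stats=$(printf '%s\n' $times | awk -v s="$spawn_mean" '
    { t = $1 - s; sum += t }
    NR == 1 || t < min { min = t }
    NR == 1 || t > max { max = t }
    END { printf "mean=%.2f min=%.2f max=%.2f", sum / NR, min, max }')
echo "$stats"
```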