proposal: testing: custom benchmark labels #28398
Currently, Go benchmark output reports some labels:
These are added by
It would be great if the user could add custom labels relevant to the performance of their code. For example:
I have noticed that the
An option to
My concrete proposal is an additional function in the
```go
// SetBenchmarkLabel records a value relevant to benchmark performance. This
// will be included in benchmark output ahead of individual benchmark results.
// Labels "goos", "goarch" and "pkg" are set by default.
SetBenchmarkLabel(key, value string)
```
Not sure about naming, but hopefully this gives the idea.
I recently implemented Meow hash in Go. This hash is designed to take advantage of hardware AES instructions, so the implementation dispatches to one of three backends based on CPUID flags at runtime (pure Go, AES-NI, and VAES-256/512).
To provide useful context in the test/benchmark output, I have two test functions that exist
I would prefer to include these as "official" labels:
```go
testing.SetBenchmarkLabel("backend", implementation)
testing.SetBenchmarkLabel("hasaesni", strconv.FormatBool(cpu.HasAES))
testing.SetBenchmarkLabel("hasavx", strconv.FormatBool(cpu.HasAVX))
...
```
Comparison: Google Benchmark
Note that Google Benchmark also provides context with benchmark runs. For
It would be great if it were also possible to add CPU/cache information in Go benchmark output.
Note that this output has been formalized in the following document:
In particular these "labels" are called "Configuration Lines".
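For illustration, configuration lines in that format are `key: value` pairs printed ahead of the result lines, e.g. (values here are made up):

```
goos: linux
goarch: amd64
pkg: example.com/meowhash
BenchmarkHash-8   	 5000000	       250 ns/op
```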
Moreover, this document makes
The existing implementation (and prototype CL) print the labels inside a
```go
// SetBenchmarkLabel records a value relevant to benchmark performance. This
// will be included in benchmark output ahead of individual benchmark results.
// Any SetBenchmarkLabel calls after the first benchmark is executed will be
// ignored. Labels "goos", "goarch" and "pkg" are set by default.
SetBenchmarkLabel(key, value string)
```
Did you have some other behavior in mind?
Yes it would be great to get @aclements input.
This seems easy enough, but I'd like to understand the use case you have in mind. Are you thinking about
I'd like to understand what @rsc is proposing here a bit better, too.
It does seem a bit odd that
I was thinking that this would be something done inside a Benchmark function, but Austin points out that at that point the "BenchmarkName " prefix has been printed, making it inconvenient to print new key:value pairs. Also, the examples in the initial request are all global to the process, so maybe it makes sense to scope down to something global to the process that gets printed before any benchmarks start.
Then the only question is how do we know benchmarks are about to start. TestMain could print them unconditionally even today. If we give TestMain more access to the list of intended tests and benchmarks (see #28592) then it could avoid the print if no benchmarks will run. But maybe the lines in question would not be so bad to print always, so maybe we don't need to do anything special at all. For that matter,
As far as the case study, that could be done today with:
In my opinion there's value in making this a "first class citizen". Printing works but feels hacky.
Here's a hypothetical (but I believe not far fetched) reason why this could matter down the line. Google Benchmark offers
Now my print statements give me:
I'm not arguing for this specific feature right now. My point is just that if people are printing labels, you've lost the semantics, and are unable to manipulate the labels in any structured way.
Finally, I think adding this as a function encourages people to use it, which I think is a positive too.