Hi 👋
As users of crater, we'd find it helpful if it recorded how long each job took. I don't know if this is already available, but I don't think so.
We run crater on in-progress features, and sometimes a feature introduces big performance issues, but without any way of gathering these durations we can't know about them except by random chance during development, or when the feature lands and users of these crates report issues. Too late, of course.
I'm asking about this in the context of the trait-solver rewrite, where by random chance we noticed big performance discrepancies when testing individual regressions found by crater. We thought it could be interesting to look for these cases when they happen on crater itself.
Now, I know crater is not a good tool for benchmarking, so I'm not asking for that, and I also expect the variance to be high, not least because the order in which jobs are executed is surely non-deterministic. That's OK; it's bound to be a noisy signal.
It's not hugely important for durations to be shown prominently in the report: people will ignore them the vast majority of the time, but they'd be available when needed. When specifically hunting for huge outliers, people could look at the durations in the JSON results and then try to reproduce locally to remove the noise.
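For concreteness, here's a minimal sketch of the kind of measurement I have in mind: take a timestamp around each job and attach the elapsed wall-clock time to the per-crate record. The names here (`JobResult`, `duration_secs`, `run_job`) are made up for illustration and don't reflect crater's actual types.

```rust
use std::time::Instant;

/// Hypothetical per-crate record; crater's real result types will differ.
struct JobResult {
    krate: String,
    outcome: String,
    /// Proposed addition: wall-clock duration of the job, in seconds.
    duration_secs: u64,
}

fn run_job(krate: &str) -> JobResult {
    let start = Instant::now();
    // ... build and test the crate here (elided) ...
    let outcome = "test-pass".to_string();
    JobResult {
        krate: krate.to_string(),
        outcome,
        duration_secs: start.elapsed().as_secs(),
    }
}

fn main() {
    let result = run_job("example-crate");
    println!(
        "{} finished as {} in {}s",
        result.krate, result.outcome, result.duration_secs
    );
}
```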
Do you think this could work? I'd be happy to try to help implement it if so.