Add a page to compare durations of the interpreter and YJIT #220
Just noting a few observations as I understand the current state:
The bar graphs are a snapshot of the latest values, so there may not be a way to visualize the slowdown there.
Yeah, the "YJIT vs CRuby Memory Usage Over Time" page might be the closest to the graph we want. We already have a "Memory Usage" comparison there, but we also need a "Time" comparison.
@k0kubun I had a conversation with @rwstauner about something related to this, and I made a suggestion which I think could solve this problem. Basically, our goal for the end of 2024 is to make YJIT 3.4 perform better than YJIT 3.3. In that respect, I think it would be very helpful if the main page of speed.yjit.org showed bars comparing YJIT 3.4 vs YJIT 3.3 on the headline benchmarks. We could do the same for CRuby 3.4 vs CRuby 3.3.

So, the performance graph on the main page would have 3 bars, all normalized to the speed of the CRuby 3.3 interpreter: one for CRuby 3.4, one for YJIT 3.3, and one for YJIT 3.4. That way, we would immediately notice any performance regressions compared to the previous year's version: it would be obvious if the interpreter's performance dips below 1.0, and it would also be obvious if this year's YJIT dips below the previously released YJIT. We could also highlight bars which show performance regressions.

We could also have a similar bar graph for memory usage, which could help detect GC regressions and such.
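The normalization described above could be sketched roughly like this. All timings and configuration names here are invented for illustration; real numbers would come from the benchmark harness:

```ruby
# Hypothetical mean iteration times (ms) for one benchmark; all values invented.
times = {
  "cruby-3.3" => 100.0, # baseline
  "cruby-3.4" => 105.0,
  "yjit-3.3"  => 60.0,
  "yjit-3.4"  => 55.0,
}

baseline = times["cruby-3.3"]

# Normalize every configuration to the CRuby 3.3 interpreter:
# a bar below 1.0 means that configuration is slower than the 3.3 interpreter.
bars = times.transform_values { |t| baseline / t }

bars.each do |config, ratio|
  flag = ratio < 1.0 ? "  <-- regression vs CRuby 3.3" : ""
  puts format("%-10s %.2fx%s", config, ratio, flag)
end
```

With this normalization, an interpreter regression (CRuby 3.4 below 1.0) and a YJIT regression (YJIT 3.4 below YJIT 3.3) are both visible at a glance on the same chart.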
Having a CRuby 3.3 vs CRuby 3.4 graph would allow you to find an interpreter regression, as you said. When you find a regression using it, however, it would still be useful to have the page I'm talking about, to investigate when the regression started, because we may notice it days, weeks, or months later.
Problem
When the interpreter slows down but YJIT's performance doesn't change (e.g. ruby/ruby#8650 (comment)), the YJIT speedup ratio goes up, but we can't tell whether YJIT itself actually got faster.
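A small numeric sketch of why the ratio alone is misleading (the timings are made up for illustration):

```ruby
# Speedup of YJIT relative to the interpreter on the same benchmark.
def speedup(interp_time, yjit_time)
  interp_time.to_f / yjit_time
end

# Suppose the interpreter regresses from 100ms to 120ms per iteration
# while YJIT stays at 60ms. The reported speedup rises from ~1.67x to
# 2.0x even though YJIT itself did not change at all.
before = speedup(100, 60)
after  = speedup(120, 60)
puts format("before: %.2fx, after: %.2fx", before, after)
```

Only the absolute per-implementation times reveal that the change came from the interpreter side, which is what the page proposed below would show.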
Proposed solution
Make it possible to see the time spent on each benchmark for each implementation (interpreter or YJIT). One idea is to add `yjit_time` and `interp_time` to the "YJIT Stats Over Time" page, but it would be nice if we could show them at the same time. Basically, we want this view (click "Time" on this page) but on speed.yjit.org.