
Add a page to compare durations of the interpreter and YJIT #220

Open
k0kubun opened this issue Jan 23, 2024 · 4 comments
k0kubun commented Jan 23, 2024

Problem

When the interpreter slows down and YJIT's performance doesn't change (e.g. ruby/ruby#8650 (comment)), the reported YJIT speedup goes up, but we can't tell whether YJIT itself actually got faster.
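To make the problem concrete, here is a minimal sketch (with made-up numbers, not real benchmark data) of why the speedup ratio alone is misleading: because speedup is interpreter time divided by YJIT time, an interpreter regression inflates it even when YJIT's own time is unchanged.

```ruby
# Speedup as reported on the dashboard: how many times faster
# YJIT is than the interpreter on the same benchmark.
def speedup(interp_time, yjit_time)
  interp_time / yjit_time
end

# Hypothetical timings in seconds. YJIT takes 1.0s in both runs.
speedup(2.0, 1.0) # => 2.0
# The interpreter regresses from 2.0s to 2.4s; YJIT is untouched,
# yet the reported "speedup" rises from 2.0x to 2.4x.
speedup(2.4, 1.0) # => 2.4
```

Showing the raw per-implementation times, as proposed below, disambiguates the two cases.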

Proposed solution

Make it possible to see the time spent on each benchmark for each implementation (interpreter or YJIT). One idea is to add yjit_time and interp_time to the "YJIT Stats Over Time" page, but it'd be nice if we could show both at the same time.

Basically, we want this view (click "Time" on this page) but on speed.yjit.org.

[Screenshot 2024-01-23 at 14:57:56]
rwstauner commented Jan 23, 2024

Just noting a few observations as I understand the current state:
On speed.yjit.org:

  • The first graph always shows "no JIT" at 1, so a slowdown here would simply make the YJIT bar appear higher
  • On the results over time graph (and the speedup and statistics page) there is only one value per benchmark, presumably for YJIT (there is no "no JIT" series to compare against).
  • There is a "YJIT vs CRuby" graph that shows "no jit" vs "yjit" for each selected benchmark

The bar graphs are a snapshot of the latest values so there may not be a way to visualize the slowdown there.
However for any "over time" graph we should be able to add the values for each executable.


k0kubun commented Jan 23, 2024

There is a "YJIT vs CRuby" graph that shows "no jit" vs "yjit" for each selected benchmark

The bar graphs are a snapshot of the latest values so there may not be a way to visualize the slowdown there.
However for any "over time" graph we should be able to add the values for each executable.

Yeah the "YJIT vs CRuby Memory Usage Over Time" page might be the closest to the graph we want. We already have "Memory Usage" comparison there, but we also need "Time" comparison.

maximecb commented Jan 24, 2024

@k0kubun I had a conversation with @rwstauner about something related to this. I had made a suggestion which I think could solve this problem.

Basically, our goal for the end of 2024 is to make YJIT 3.4 perform better than YJIT 3.3. In that respect, I think it would be very helpful if the main page of speed.yjit.org showed bars comparing YJIT 3.4 vs YJIT 3.3 on the headline benchmarks. We could do the same for CRuby 3.4 vs CRuby 3.3.

So, the performance graph on the main page would have three bars, all normalized to the speed of the CRuby 3.3 interpreter: one for CRuby 3.4, one for YJIT 3.3, and one for YJIT 3.4. That way, we would immediately notice any performance regressions compared to the previous year's version. It would be obvious if the interpreter's performance dips below 1.0, and likewise if this year's YJIT dips below the previously released YJIT. We could also highlight bars that show performance regressions.
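The normalization described above can be sketched as follows. This is an illustrative helper with invented timing numbers, not the actual speed.yjit.org code: every configuration's time is divided into the CRuby 3.3 baseline, so any bar below 1.0 is a regression relative to last year's release.

```ruby
# Normalize benchmark times to the CRuby 3.3 interpreter baseline.
# times is a hash of { configuration name => wall time in seconds };
# the configuration names here are hypothetical labels.
def normalized_speedups(times)
  baseline = times.fetch("cruby-3.3")
  # baseline / t > 1.0 means faster than CRuby 3.3; < 1.0 means slower.
  times.transform_values { |t| baseline / t }
end

times = {
  "cruby-3.3" => 2.0, # baseline, always normalizes to 1.0
  "cruby-3.4" => 2.1, # slightly slower interpreter
  "yjit-3.3"  => 1.0,
  "yjit-3.4"  => 0.9,
}
normalized_speedups(times)
# "cruby-3.4" comes out at ~0.95, i.e. below 1.0 — an immediately
# visible interpreter regression, while both YJIT bars stay above 1.0.
```

A bar chart over these normalized values gives exactly the at-a-glance regression check: the baseline pins 1.0, and anything sinking below it stands out.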

We could also have a similar bar graph for memory usage. This could also help detect GC regressions and such.

k0kubun commented Jan 24, 2024

Having a CRuby 3.3 vs CRuby 3.4 graph would allow you to find an interpreter regression as you said.

When you find a regression that way, however, it'd still be useful to have the page I'm proposing in order to investigate when the regression started, since we may only notice it days, weeks, or months later.
