
Use of opaque types causes an extreme slow down in dialyzer #7835

Open
hazardfn opened this issue Nov 7, 2023 · 6 comments
Labels: bug (Issue is reported as a bug), team:VM (Assigned to OTP team VM), types (The issue is related to types)

hazardfn commented Nov 7, 2023

Describe the bug

We have a fairly large Elixir project with ~500 declared opaque types. With these opaque types in the project, the dialyzer run time can be upwards of 2 hours; the slowdown also seems to affect the creation of the PLTs. A coworker has created a table that compares our dialyzer run times:

| Scenario | Build PLTs | Run Dialyzer |
| --- | --- | --- |
| With Opaque, fresh PLT build | ~1h 50m | timed out |
| With Opaque, cached PLTs | ~2m | ~55m |
| Opaque Removed, fresh PLT build | ~11m | ~6m |
| Opaque Removed, cached PLTs | ~1m | ~6m |

The *Opaque Removed* run times were collected by replacing all of our `@opaque` types with `@type`.

NOTE: Our CI pipeline has a timeout of 2 hours across the two steps, hence the timed-out entry in the first row.
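For reference, the blunt substitution described above (the exact command used isn't given in the report; this sketch assumes GNU sed and uses a hypothetical scratch file) might look like:

```shell
# Create a scratch file standing in for an Elixir source file (hypothetical path):
mkdir -p /tmp/opaque_demo
printf '@opaque t :: integer()\n' > /tmp/opaque_demo/sample.ex

# Blunt textual substitution of @opaque declarations with @type (GNU sed -i):
sed -i 's/@opaque/@type/g' /tmp/opaque_demo/sample.ex

cat /tmp/opaque_demo/sample.ex   # -> @type t :: integer()
```

On a real project the same substitution would be run over every source file (e.g. via `grep -rl '@opaque' lib | xargs sed -i 's/@opaque/@type/g'`), with the diff reviewed before re-running dialyzer.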

To Reproduce
I'm unsure exactly how noticeable the slowdown is per `@opaque` type. However, doing a run with a number of `@opaque` types, then comparing it against a run where all of those are changed to `@type`, should highlight the difference. The time appears to grow as more opaque types are added.

Expected behavior
While I understand that the additional checks required for an opaque type will extend the runtime, I wouldn't have expected the difference to be so drastic. Perhaps there is room for optimization here.

Affected versions
Tested only with 25.3.2.7

Additional context
N/A

@hazardfn hazardfn added the bug Issue is reported as a bug label Nov 7, 2023
@jhogberg jhogberg self-assigned this Nov 7, 2023
@jhogberg jhogberg added team:VM Assigned to OTP team VM types The issue is related to types labels Nov 7, 2023
jhogberg (Contributor) commented Nov 7, 2023

Hi! Can you cherry-pick the top of https://github.com/jhogberg/otp/tree/john/dialyzer/investigate-extreme-opaque-slowdown and see if it helps?

TD5 (Contributor) commented Nov 10, 2023

I have a custom version of Dialyzer which is optimised for the codebase I usually work with. It has a lot of changes, so the impact on the OSS version might be different, but I found that memoising the opaques derived from records in their own ETS table, alongside the records in the codeserver, helped a lot, since this operation is potentially performed quite a few times:

```erlang
%% Memoise the opaque types derived from a module's records, so that the
%% relatively expensive t_opaque_from_records/1 is computed at most once
%% per module.
-spec lookup_mod_record_opaques(module(), codeserver()) -> [erl_types:erl_type()].

lookup_mod_record_opaques(Mod, #codeserver{opaques = OpaqueDict} = CS) when is_atom(Mod) ->
  %% Note: ets:lookup_element/4 with a default value requires OTP 26+.
  case ets:lookup_element(OpaqueDict, Mod, 2, not_memoised_yet) of
    not_memoised_yet ->
      Records = lookup_mod_records(Mod, CS),
      Opaques = erl_types:t_opaque_from_records(Records),
      ets_map_store(Mod, Opaques, OpaqueDict),
      Opaques;
    Memoised ->
      Memoised
  end.
```

I do hope to make a PR with the full change at some point, but I thought I'd share this snippet now since it may be relevant.

@jhogberg jhogberg added the waiting waiting for changes/input from author label Nov 20, 2023
jhogberg (Contributor) commented:
@hazardfn have you had a chance to look at that branch?

hazardfn (Author) commented:

> @hazardfn have you had a chance to look at that branch?

I gave it a quick look before having to move on to other things. It didn't seem to have a massive impact on the timing, but certainly some. I did note that it had a reasonable impact on CPU usage: it stopped my M1 from constantly being thermally throttled during the process.

I have been meaning to swing back and actually collect some timing data for you but just haven't had the chance at work yet 😞.

I'll try to find a moment tomorrow. Since I'd be running it from my machine, I'll need to collect a whole new set of baselines for you rather than reusing the ones above, as those were collected on a GitHub runner.

@jhogberg jhogberg removed the waiting waiting for changes/input from author label Nov 20, 2023
jhogberg (Contributor) commented:

That'd be great. That branch is still expected to be horrifically slow, but if we can confirm that it helps, it shouldn't take too long to bring `-opaque` to par with `-type` performance-wise. There's no hurry on our end though. 🙂

jhogberg (Contributor) commented May 9, 2024

Just as a short update: the work we're doing on nominal types (erlang/eep#60) will fix this problem. It'll most likely make it into OTP 28.
