Impose implicit search limit #13886
Conversation
Impose a configurable limit on the total number of nodes constructed during an implicit search. Fixes scala#13838
When the limit is exceeded:
- show what the root query was
- under -explain, show the trace until the overflow occurred.
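A usage sketch, assuming an sbt build: the flag name and default come from the diff further down and -explain from the description above, but the colon syntax and the value 50000 are illustrative assumptions, not recommendations.

```scala
// Build-definition sketch (sbt): cap the implicit search and ask for the
// trace when the cap is hit. The value below is only an example.
scalacOptions ++= Seq(
  "-explain",                       // show the implicit search trace on overflow
  "-Ximplicit-search-limit:50000"   // bound the number of generated expressions
)
```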
Otherwise LGTM
@@ -218,6 +218,7 @@ private sealed trait XSettings:
 val Xtarget: Setting[String] = ChoiceSetting("-Xtarget", "target", "Emit bytecode for the specified version of the Java platform. This might produce bytecode that will break at runtime. When on JDK 9+, consider -release as a safer alternative.", ScalaSettings.supportedTargetVersions, "", aliases = List("--Xtarget"))
 val XcheckMacros: Setting[Boolean] = BooleanSetting("-Xcheck-macros", "Check some invariants of macro generated code while expanding macros", aliases = List("--Xcheck-macros"))
 val XmainClass: Setting[String] = StringSetting("-Xmain-class", "path", "Class for manifest's Main-Class entry (only useful with -d <jar>)", "")
+val XimplicitSearchLimit: Setting[Int] = IntSetting("-Ximplicit-search-limit", "Maximal number of expressions to be generated in an implicit search", 100000)
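For context, a minimal sketch of how a setting like this is typically consulted inside the compiler; the helper name and the bookkeeping around it are invented here, only the setting access follows the usual dotty pattern.

```scala
import dotty.tools.dotc.core.Contexts.Context

// Invented helper, not the PR's actual code: compare a running count of
// constructed search nodes against the configured -Ximplicit-search-limit.
def searchTooLarge(nodesConstructed: Int)(using ctx: Context): Boolean =
  nodesConstructed > ctx.settings.XimplicitSearchLimit.value
```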
With this default, it takes the original problem 330 seconds (5.5 minutes) to finish. That seems a bit long; was it intended? I also noticed that with 90000 it takes 90 seconds and with 50000 only 50 seconds, so the time at the default does not scale proportionally. I wonder if we hit some memory limitation. Not sure if we should make this limit smaller.
Halve the limit so that we fail in a bit less than a minute instead of more than 5 minutes.
- Use the wildcard approximation instead of the original type, since that is what determines what is eligible, and the goal is to refuse the search if everything is eligible.
- Also refuse underspecified implicit parameters, not just conversions.
- Treat wildcard types as underspecified.

Two tests had to be reclassified, but the original tests were not meant to compile anyway. They were about misleading error messages (no longer the case) and crashers.
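A hypothetical illustration of the distinction, with invented names and not taken from the PR's test suite:

```scala
// A query whose type is pinned down is searched as usual; a query dominated
// by wildcards leaves essentially every given eligible and is the kind of
// underspecified search that is now refused up front.
trait Codec[A]
given intCodec: Codec[Int] = new Codec[Int] {}

val ok = summon[Codec[Int]]       // fully specified: searched normally
// val no = summon[Codec[?]]      // wildcard query: underspecified, refused
```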
Make another test to exercise the original large search behavior
Two improvements for implicit searches involving type variables.

1. We now always add a comment when an implicit search is rejected due to the "too unspecific" criterion of #13886, commit [Refine checking for underspecified implicit queries](db5956b). There have been quite a few regressions that hit that problem, so it is good to know immediately what the issue is.
2. There is now a better wildcard approximation of higher-kinded type applications. This makes several programs (including the original #15998) compile that were classified as not specific enough before.

Fixes #15998
Fixes #15820
Fixes #15670
Fixes #15160
Fixes #13986
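As a rough sketch of the shape of program point 2 is about, with invented names and not one of the PR's actual test cases: the implicit query's expected type involves an application of a higher-kinded type variable that is not yet solved when the search runs.

```scala
// Sketch only: a context-bound search whose expected type mentions F[A]
// while F is still a type variable at the time of the implicit search.
trait Functor[F[_]]:
  extension [A](fa: F[A]) def fmap[B](f: A => B): F[B]

given Functor[List] with
  extension [A](fa: List[A]) def fmap[B](f: A => B): List[B] = fa.map(f)

def lift[F[_]: Functor, A, B](f: A => B): F[A] => F[B] =
  fa => fa.fmap(f)

// With the improved wildcard approximation, searches of this general shape
// are attempted rather than rejected as not specific enough.
val inc: List[Int] => List[Int] = lift((_: Int) + 1)
```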