
fix #42078, improve the idempotency of callsite inlining #42082

Merged: 4 commits, merged into master on Sep 4, 2021

Conversation

aviatesk (Sponsor, Member) commented Sep 1, 2021

After #41328, inference can observe statement flags and try to re-infer
a discarded source if it's going to be inlined.
The re-inferred source will only be cached into the inference-local
cache, and won't be cached globally.

@aviatesk aviatesk added the compiler:optimizer Optimization passes (mostly in base/compiler/ssair/) label Sep 1, 2021
frame.src = finish!(interp, result)
end
run_optimizer && (frame.cached = true)
typeinf(interp, frame)
Review comment (Sponsor, Member):
The previous form here was intended to be an example of running an external optimization pipeline. I think cached = false also has some follow-on effects on how cycles are resolved (and optimized)?

aviatesk (Author) replied:

Yeah, that's why I made this change. Currently the optimization behavior really changes depending on the cache configuration, and that causes typeinf_code to behave slightly differently from actual execution, which is very confusing.

For example,

@test length(code_typed(f_ifelse, (String,))[1][1].code) <= 2

was marked as @test_broken for a long time, but I found it has actually been functional (the problem was that the previous cached = false prevented type_annotate! from doing additional cleanup work).

And this change is necessary for the test case added in this PR, because callsite inlining depends on typeinf_edge, which actually changes its behavior depending on the caller's cache configuration.

External consumers can still plug in their own optimization pipeline by overloading OptimizationState, so I think this change is okay.
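For context, here is a minimal sketch of the kind of check the test above performs. The definition of `f_ifelse` below is hypothetical, reconstructed for illustration only (the real one lives in Julia's test suite); the point is that when the branch condition is decidable from the argument type, optimization should collapse the body to a couple of statements.

```julia
# Hypothetical stand-in for the test-suite function: isa(x, String) is
# known at inference time for the (String,) signature, so ifelse should
# constant-fold and the body should shrink after inlining and cleanup.
f_ifelse(x) = ifelse(isa(x, String), 1, 2)

# code_typed returns the optimized, type-annotated IR for a signature.
ci, rt = only(code_typed(f_ifelse, (String,)))

# With cleanup running after inlining, the statement count stays small;
# the PR's test asserts it is <= 2.
length(ci.code)
```

This is exactly the behavior that the previous cached = false path was silently degrading: the IR returned by code_typed did not match what actual execution would compile.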

test/compiler/inline.jl (review thread outdated, resolved)
@vtjnash vtjnash left a comment


Changing this to a symbol seems like a great idea.

@aviatesk aviatesk closed this Sep 3, 2021
@aviatesk aviatesk reopened this Sep 3, 2021
aviatesk (Author) commented Sep 3, 2021

Test failures on linux 32 seem unrelated to this PR.

@vtjnash vtjnash added the status:merge me PR is reviewed. Merge when all tests are passing label Sep 3, 2021
@aviatesk aviatesk merged commit 876df79 into master Sep 4, 2021
@aviatesk aviatesk deleted the avi/fix42078 branch September 4, 2021 06:36
aviatesk added a commit to JuliaDebug/Cthulhu.jl that referenced this pull request Sep 4, 2021
aviatesk added a commit to aviatesk/JET.jl that referenced this pull request Sep 4, 2021
aviatesk added a commit to aviatesk/JET.jl that referenced this pull request Sep 4, 2021
aviatesk added a commit to aviatesk/JET.jl that referenced this pull request Sep 4, 2021
aviatesk added a commit to JuliaDebug/Cthulhu.jl that referenced this pull request Sep 6, 2021
aviatesk added a commit to JuliaDebug/Cthulhu.jl that referenced this pull request Sep 6, 2021
@DilumAluthge DilumAluthge removed the status:merge me PR is reviewed. Merge when all tests are passing label Sep 6, 2021
LilithHafner pushed a commit to LilithHafner/julia that referenced this pull request Feb 22, 2022
LilithHafner pushed a commit to LilithHafner/julia that referenced this pull request Mar 8, 2022
Labels: compiler:optimizer (Optimization passes, mostly in base/compiler/ssair/)

3 participants