simple inference graph profiler #34613
Conversation
base/compiler/typeinfer.jl
Outdated
@@ -453,8 +453,15 @@ function resolve_call_cycle!(linfo::MethodInstance, parent::InferenceState)
    return false
end

const inference_graph = IdDict{Any,IdSet{Any}}()
const record_edges = Ref(false)
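The full patch isn't shown in this excerpt, but the two constants above suggest the recording mechanism: a global toggle plus a caller → callee edge table. A minimal, self-contained sketch of how such a table might be populated (the `record_edge!` helper and the symbol arguments are hypothetical, not the PR's actual code):

```julia
# Hypothetical sketch of recording an inference edge; the constants
# mirror the two declarations from the diff above.
const inference_graph = IdDict{Any,Base.IdSet{Any}}()
const record_edges = Ref(false)

function record_edge!(caller, callee)
    record_edges[] || return nothing          # recording is opt-in
    edges = get!(() -> Base.IdSet{Any}(), inference_graph, caller)
    push!(edges, callee)                      # IdSet deduplicates edges
    return nothing
end

record_edges[] = true
record_edge!(:foo, :bar)
record_edge!(:foo, :baz)
record_edge!(:foo, :bar)   # duplicate edge, absorbed by the IdSet
```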
Suggested change:
- const record_edges = Ref(false)
+ const record_edges = Bool[false]
maybe (for bootstrap)
Is `Ref` not defined early enough? Maybe a comment to that effect, since that was the first thing I wondered when seeing the current state of the PR.
`Ref` is available, but no constructors. Use `RefValue`.
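To illustrate the three options raised in this thread (names here are illustrative, not from the PR): the convenient `Ref(false)` form needs `Base`'s `Ref` constructors, a one-element `Bool` vector works very early in bootstrap, and `Base.RefValue` is the concrete `Ref` type constructed directly, without the convenience constructor.

```julia
# Three ways to hold a mutable global Bool flag, as discussed above.
flag_ref = Ref(false)                  # needs Ref's convenience constructor
flag_vec = Bool[false]                 # plain 1-element array; works earliest
flag_rv  = Base.RefValue{Bool}(false)  # concrete Ref type, constructed directly

# All three read and write the same way modulo indexing syntax.
flag_ref[] = true
flag_vec[1] = true
flag_rv[] = true
```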
Force-pushed from 02ff9d9 to 8366e8b
I'm not sure if this is worth merging. On one hand, it was useful information for you. On the other hand, it's inactive code and might be unstable as an API over time (it's a very specific subset of information, and we will probably want more general result reports some day).
I did not really intend to merge this; it's just here in case anybody wants to use this method again.
I think we can effectively declare this present now
This is the method I used in #34596.
Here is a gist with the analysis code and an example run:
https://gist.github.com/JeffBezanson/e381ea9cbe790d6d8f9bcf473a664f8c
Basically, this is a way to find which calls lead to the most inference work, roughly speaking. It's not time-based and might not be helpful in every case, but in this case it gave a useful profile.
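The actual analysis code lives in the gist linked above; as a rough, hypothetical sketch of the idea, one can rank callers by how many distinct inference edges they triggered, using the number of recorded edges as a proxy for inference work (the graph contents below are made up for illustration):

```julia
# Build a small example caller → callee edge table, then rank callers
# by out-degree as a crude proxy for how much inference they caused.
graph = IdDict{Any,Base.IdSet{Any}}()
for (caller, callee) in ((:f, :g), (:f, :h), (:f, :i), (:g, :h))
    push!(get!(() -> Base.IdSet{Any}(), graph, caller), callee)
end

counts = [caller => length(callees) for (caller, callees) in graph]
sort!(counts; by = last, rev = true)   # heaviest caller first
```

This is not time-based either: it counts edges, not seconds, so a caller that triggers many cheap inferences can outrank one that triggers a single expensive one.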