simple inference graph profiler #34613

Closed · wants to merge 1 commit

Conversation

JeffBezanson
Member

This is the method I used in #34596.
Here is a gist with the analysis code and an example run:
https://gist.github.com/JeffBezanson/e381ea9cbe790d6d8f9bcf473a664f8c

Basically, this is a way to find which calls are leading to the most inference work, roughly speaking. It's not time-based and might not be helpful in every case, but here it gave a useful profile.
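
For reference, here is a minimal usage sketch of how this could be driven, assuming a julia built with this patch; `Core.Compiler.record_edges` and `Core.Compiler.inference_graph` are the names added in the diff below, the workload and the final loop are only illustrative, and the actual ranking/analysis code is in the gist above:

```julia
# Minimal usage sketch, assuming a build that includes this patch.
# Turn edge recording on, run the workload whose inference cost is of
# interest, then look at the recorded graph.
Core.Compiler.record_edges[] = true
using SomePackage                      # hypothetical workload to profile
Core.Compiler.record_edges[] = false

# inference_graph is an IdDict mapping a MethodInstance to an IdSet of
# MethodInstances connected to it by inference edges; the gist's analysis
# code turns this into a ranking of the calls that cause the most work.
for (mi, edges) in Core.Compiler.inference_graph
    println(length(edges), "\t", mi)
end
```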

@JeffBezanson added the compiler:inference (Type inference) and compiler:latency (Compiler latency) labels on Jan 31, 2020
@@ -453,8 +453,15 @@ function resolve_call_cycle!(linfo::MethodInstance, parent::InferenceState)
return false
end

const inference_graph = IdDict{Any,IdSet{Any}}()
const record_edges = Ref(false)
Suggested change
const record_edges = Ref(false)
const record_edges = Bool[false]

maybe (for bootstrap)
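
For context, a one-element vector is one way to get a mutable global flag without calling any Ref constructor during bootstrap; a rough sketch of how the suggested spelling would be read and written (note the `[1]` indexing rather than the `[]` syntax of a Ref):

```julia
# Sketch of the Bool[false] alternative: a 1-element Vector{Bool} used as a
# mutable cell, so no Ref constructor is needed this early in bootstrap.
const record_edges = Bool[false]

record_edges[1] = true     # enable edge recording
record_edges[1]            # read the flag back
```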

Is Ref not defined early enough? Maybe add a comment to that effect, since that was the first thing I wondered when seeing the current state of the PR.

Ref is available, but its constructors are not. Use RefValue.
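
For context, a rough sketch of the spelling suggested here: `RefValue` is the concrete struct behind `Ref`, so constructing it directly avoids the `Ref(...)` convenience constructor that is not yet defined at this point in bootstrap, while keeping the existing `record_edges[]` access syntax:

```julia
# Inside Core.Compiler the name RefValue is already in scope; at the REPL it
# lives (unexported) in Base, so qualify it to make this snippet runnable.
using Base: RefValue

# Same shape as Ref(false), but no Ref constructor is needed.
const record_edges = RefValue{Bool}(false)

record_edges[] = true      # enable edge recording
record_edges[]             # read the flag back
```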

@vtjnash

vtjnash commented Feb 17, 2020

I'm not sure if this is worth merging. On one hand, it was useful information for you. On the other hand, it's inactive code and might be unstable as an API over time (it's a very specific subset of information, and we will probably want more general result reports some day).

@JeffBezanson
Author

I did not really intend to merge this; it's just here in case anybody wants to use this method again.

@vtjnash

vtjnash commented Oct 27, 2020

I think we can effectively declare this present now

@DilumAluthge deleted the jb/inferenceedges branch on March 25, 2021
@DilumAluthge removed the status:DO NOT MERGE (Do not merge this PR!) label on Jun 18, 2021