[TTFS] Improve type inference in MadNLP #227

Merged (1 commit) on Oct 6, 2022
Conversation

@frapac (Collaborator) commented on Sep 30, 2022

No description provided.

@frapac frapac requested a review from sshin23 September 30, 2022 14:32
@frapac (Collaborator, Author) commented on Oct 2, 2022

For information, this PR does not improve the precompilation time by much (apparently Julia is quite good at dealing with this kind of non-inferred type).
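For context, the kind of inference problem being targeted can be sketched as follows (a hypothetical example, not code from MadNLP): a field annotated with an abstract type forces the compiler to re-infer at each use site, whereas a type parameter carries the concrete type.

```julia
# Hypothetical illustration (not MadNLP code): an `Any`-typed field is
# non-inferred, so every access produces type-unstable code.
struct LooseOptions
    linear_solver::Any          # concrete type unknown to the compiler
end

# Parameterizing the struct makes field accesses type-stable, which
# reduces the inference work that SnoopCompile measures.
struct TightOptions{S}
    linear_solver::S            # concrete type carried in the parameter S
end
```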

For the record, on MadNLP#master, we get with SnoopCompile:

InferenceTimingNode: 5.120791/8.009915 on Core.Compiler.Timings.ROOT() with 9 direct children

On MadNLP#fp/type_inference:

InferenceTimingNode: 4.226086/6.882473 on Core.Compiler.Timings.ROOT() with 2 direct children
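For reference, numbers like these come from SnoopCompile's `@snoopi_deep` macro; a minimal sketch of how to collect them (using a trivial workload here rather than the MadNLP call, which the comment does not spell out):

```julia
using SnoopCompile  # @snoopi_deep is provided by SnoopCompileCore

# Record all type inference triggered by a first call. The returned
# timing tree's ROOT() node summarizes inference cost and the number
# of fresh inference entry points ("direct children").
tinf = @snoopi_deep sum(rand(100))

# Displays something like:
#   InferenceTimingNode: a/b on Core.Compiler.Timings.ROOT() with N direct children
show(tinf)
```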

x-ref: #186

@frapac frapac changed the title Improve type inference in MadNLP [TTFS] Improve type inference in MadNLP Oct 2, 2022
@sshin23 sshin23 merged commit 968245a into master Oct 6, 2022
@frapac frapac deleted the fp/type_inference branch May 31, 2023 08:20