
Replace opaque return types in optim #1767

Merged · 5 commits · May 14, 2024
Conversation

benbaarber
Contributor

Pull Request Template

Checklist

  • Confirmed that the `run-checks all` script has been executed.
  • Made sure the book is up to date with changes in this PR.

Related Issues/PRs

#1755

Changes

Updated the Adam, AdamW, and AdaGrad config `init` methods to return the concrete type `OptimizerAdaptor<O<B::InnerBackend>, M, B>` instead of the opaque type `impl Optimizer<M, B>`. The result of `OptimizerConfig.init()` can now be named in (and stored as) a struct field, solving the main problem in the linked issue.
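A minimal sketch of the general Rust pattern behind this change (hypothetical trait and struct names, not burn's actual API): a function returning `impl Trait` yields an opaque type the caller cannot name, so it cannot be declared as a struct field; returning the concrete adaptor type makes the field declarable.

```rust
trait Optimizer {
    fn step(&mut self) -> u32;
}

// Concrete type that the config now returns directly
// (stand-in for burn's `OptimizerAdaptor`).
struct OptimizerAdaptor {
    steps: u32,
}

impl Optimizer for OptimizerAdaptor {
    fn step(&mut self) -> u32 {
        self.steps += 1;
        self.steps
    }
}

struct Config;

impl Config {
    // Before: `fn init(&self) -> impl Optimizer` -- the returned type is
    // opaque, so callers cannot write it as a field type.
    // After: return the concrete adaptor type instead.
    fn init(&self) -> OptimizerAdaptor {
        OptimizerAdaptor { steps: 0 }
    }
}

// With the concrete return type, the optimizer can be stored in a struct,
// which is what the linked issue asked for.
struct Trainer {
    optim: OptimizerAdaptor,
}

fn main() {
    let mut trainer = Trainer { optim: Config.init() };
    assert_eq!(trainer.optim.step(), 1);
    println!("step count: {}", trainer.optim.step()); // prints "step count: 2"
}
```

The trade-off is that the signature now exposes the adaptor type, but callers gain the ability to name the type, which `impl Trait` in return position deliberately hides.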

Testing

No additional testing required, just function signature changes.


codecov bot commented May 13, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 86.61%. Comparing base (e4d0cf3) to head (8c841b1).

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #1767   +/-   ##
=======================================
  Coverage   86.61%   86.61%           
=======================================
  Files         700      700           
  Lines       83473    83479    +6     
=======================================
+ Hits        72301    72307    +6     
  Misses      11172    11172           


@nathanielsimard merged commit d3cd6c4 into tracel-ai:main on May 14, 2024.
14 checks passed