
feat: support Pytorch multiple optimizers and LR schedulers #807

Merged (1 commit into master on Jul 1, 2020)

Conversation

shiyuann (Contributor) commented Jul 1, 2020

With the new interface for this support, users initialize the model,
optimizers, and LR schedulers in the constructor of the trial class,
wrapping each object with the corresponding wrapper function on the
context object so it is set up correctly for distributed training.
Users also need to initialize amp (automatic mixed precision) themselves.
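The constructor-initialization pattern described above might look roughly like the sketch below. The wrapper names mirror Determined's PyTorch context API (`wrap_model`, `wrap_optimizer`, `wrap_lr_scheduler`), but the context and trial classes here are simplified, hypothetical stand-ins so the example runs without any dependencies; a real trial would pass actual PyTorch modules, optimizers, and schedulers.

```python
class MockContext:
    """Hypothetical stand-in for the trial context. The real wrappers
    prepare each object for distributed training; this mock only
    records what was wrapped, to illustrate the calling pattern."""

    def __init__(self):
        self.models = []
        self.optimizers = []
        self.lr_schedulers = []

    def wrap_model(self, model):
        self.models.append(model)
        return model

    def wrap_optimizer(self, optimizer):
        self.optimizers.append(optimizer)
        return optimizer

    def wrap_lr_scheduler(self, scheduler):
        self.lr_schedulers.append(scheduler)
        return scheduler


class MyTrial:
    """Illustrative trial: everything is created and wrapped in the
    constructor, including multiple optimizers and LR schedulers."""

    def __init__(self, context):
        self.model = context.wrap_model("model")
        self.opt_a = context.wrap_optimizer("optimizer-a")
        self.opt_b = context.wrap_optimizer("optimizer-b")
        self.sched_a = context.wrap_lr_scheduler("scheduler-a")
        self.sched_b = context.wrap_lr_scheduler("scheduler-b")


ctx = MockContext()
trial = MyTrial(ctx)
print(len(ctx.optimizers), len(ctx.lr_schedulers))  # prints: 2 2
```

The point of the pattern is that the context sees every model, optimizer, and scheduler at construction time, which is what makes supporting multiple optimizers and schedulers per trial tractable under distributed training.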

Description

Fixes a bug for #707.

Test Plan

Use the test suite.

Commentary (optional)

N/A

…DET-3195, DET-3196, DET-3197, DET-3198]

shiyuann (Contributor, Author) commented Jul 1, 2020

See https://github.com/determined-ai/determined/runs/824799253 for details on the tests.

@shiyuann shiyuann merged commit 1836016 into master Jul 1, 2020
@shiyuann shiyuann deleted the pytorch-primitive branch July 1, 2020 15:15
@dannysauer dannysauer added this to the 0.12.11 milestone Feb 6, 2024
3 participants