[create_supervised_trainer] add automatic mixed precision #1589
Conversation
Thanks a lot for the PR @ydcjeff !
Currently, I launched it manually: https://app.circleci.com/pipelines/github/pytorch/ignite/1195/workflows/27d5b840-72bb-41e1-8d1c-84640f1f623c but I think either the next commit or a new PR will run automatically on GPUs.
Thank you!
I need help with the tests, specifically with
I'll try to implement something from my side and we'll see.
…/ydcjeff/ignite into engine/create_supervised_trainer
We discussed this PR and the related issue with the team, and we think we should explore a somewhat different approach with a helper method. Probably, it would be more helpful to provide public methods like:
and inside, the idea is basically something like this:

```python
def get_training_step_1(a):
    def training_step(e, b):
        print(a, e, b)
    return training_step


def get_training_step_2(a):
    def training_step(e, b):
        print(a, e, b, "with amp")
    return training_step


def create_supervised_trainer(a, opt):
    training_step = None
    if opt == 1:
        training_step = get_training_step_1(a)
    elif opt == 2:
        training_step = get_training_step_2(a)
    e = Engine(training_step)
    return e
```

cc @sdesrozis any other ideas or thoughts?
That would be great for users to have these functions; they make it easy to check what happens under the hood. My thoughts on this topic are about the update function. The dream would be to pass a generic function and have an automatic (or nearly automatic) tool to adapt it to features like amp, tpu, etc.
Shall we also accept
Thanks for the update!
@sdesrozis can you review the PR, please?
Thank you @sdesrozis for your review.
Looks good to me as well! Thanks a lot @ydcjeff!
I left a few nit comments about removing TPU mentions where they are inappropriate.
The comment about the warning and adding usage examples of these features could be done in a follow-up PR...
Could we add tests to decrease the codecov warnings?
Let's do that all in a follow-up PR :)
Co-authored-by: vfdev <vfdev.5@gmail.com>
…/ydcjeff/ignite into engine/create_supervised_trainer
…/ydcjeff/ignite into engine/create_supervised_trainer
I think those warnings are from
Will do that.
Found out that
Do we know why it does not work?
I don't know exactly, but it could be that codecov failed to upload the report to its server.
Asked here: codecov/codecov-bash#411. Anyway, if there is no way to fix the uploading, we can remove
@sdesrozis can you please merge this PR once the CI is OK.
Thank you for your help and reviews.
Fixes #1235
Description: Add automatic mixed precision using `torch.cuda.amp` and `apex`.

Usage:
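For context, here is a minimal sketch of what an amp-style training step with `torch.cuda.amp` generally looks like. This is a generic illustration of the technique, not the exact code added in this PR; the toy model, data shapes, and `use_amp` flag are assumptions for the example:

```python
import torch
from torch.cuda.amp import autocast, GradScaler

# Toy model/optimizer/loss, purely for illustration.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

# autocast/GradScaler are no-ops when disabled, so this also runs on CPU.
use_amp = torch.cuda.is_available()
scaler = GradScaler(enabled=use_amp)


def training_step(engine, batch):
    x, y = batch
    optimizer.zero_grad()
    # Run the forward pass in mixed precision where supported.
    with autocast(enabled=use_amp):
        y_pred = model(x)
        loss = loss_fn(y_pred, y)
    # Scale the loss before backward to avoid fp16 gradient underflow.
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss.item()


loss = training_step(None, (torch.randn(8, 4), torch.randn(8, 2)))
```

The `apex` path follows the same shape but wraps the model/optimizer with `apex.amp.initialize` and uses `amp.scale_loss` around `backward()`.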
Check list:
https://deploy-preview-1589--pytorch-ignite-preview.netlify.app/engine.html#