7529 do not disable autocast for cuda devices #7530
Changes from all commits: 688051c, 14a72b8, 20a7c8f, ad3eb80, 381f73d, 59b9b41, 2f9e7c6, 35cdecd, 561bf2d, 55912ed, 455bb6f, 0ce09ec, b1b8bc9, 1483581, d084ad5
```diff
@@ -752,6 +752,10 @@ def main(args):
         project_config=accelerator_project_config,
     )

+    # Disable AMP for MPS.
+    if torch.backends.mps.is_available():
+        accelerator.native_amp = False
+
     # Make one log on every process with the configuration for debugging.
     logging.basicConfig(
         format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
```

Review comments on the `# Disable AMP for MPS.` line:

> @bghira possible to add a more descriptive comment here?

> Also, I see this is not added to some scripts such as the advanced diffusion or consistency distillation scripts.
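The point of the change is that the AMP opt-out is gated on MPS availability instead of being applied unconditionally, so CUDA runs keep autocast. A minimal sketch of that gating logic, using a stand-in object in place of `accelerate.Accelerator` (the stub class and the `mps_available` parameter are illustrative assumptions, not part of the PR):

```python
class DummyAccelerator:
    """Stand-in for accelerate.Accelerator, for illustration only."""

    def __init__(self):
        # Accelerate enables native AMP by default when mixed precision
        # is requested.
        self.native_amp = True


def disable_amp_for_mps(accelerator, mps_available):
    # Mirror of the diff: turn off native AMP only when the MPS backend
    # is available. On CUDA devices the flag is left untouched, which is
    # what "do not disable autocast for cuda devices" refers to.
    if mps_available:
        accelerator.native_amp = False
    return accelerator
```

In a real training script the condition would be `torch.backends.mps.is_available()`, as in the diff above; the sketch factors it out so the two code paths (MPS vs. CUDA) are easy to exercise.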