
Fix default_device to correctly detect + use mps (Apple Silicon) #3858

Merged 1 commit into fastai:master on Jan 16, 2023

Conversation

@wolever (Contributor) commented Jan 14, 2023

This fixes two minor bugs that prevented `fastai.torch_core.default_device` from detecting `mps` (Apple Silicon) GPUs:

  1. The `torch.backends.mps.is_available` attribute does not exist in PyTorch 1.13.1. When this attribute is unavailable, the patch falls back to `torch.has_mps`.
  2. A minor logic bug in `default_device` prevented `_has_mps` from being called.

Additionally, this fixes #3785 by removing the duplicate `default_device` function.
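The logic bug described in point 2 can be sketched roughly as follows. This is a hypothetical simplification, not fastai's actual implementation; the `_has_cuda`/`_has_mps` hooks stand in for the library's real availability checks:

```python
def default_device(_has_cuda=lambda: False, _has_mps=lambda: False):
    # Hypothetical sketch of the selection order: prefer CUDA, then MPS,
    # then fall back to CPU. Before the fix, a logic bug meant the MPS
    # check was never reached.
    if _has_cuda():
        return "cuda"
    if _has_mps():
        return "mps"
    return "cpu"

print(default_device(_has_mps=lambda: True))  # "mps" once the branch is reachable
```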

@wolever wolever requested a review from jph00 as a code owner January 14, 2023 06:19

@wolever wolever changed the title Fix default_device to correctly detect + uses mps (Apple Silicon) Fix default_device to correctly detect + use mps (Apple Silicon) Jan 14, 2023
`torch.backends.mps.is_available` does not exist in PyTorch 1.13.1. If
this attribute is unavailable, try using `torch.has_mps` instead.
@jph00 (Member) commented Jan 16, 2023

Many thanks.

@jph00 jph00 merged commit f109acd into fastai:master Jan 16, 2023
@wolever wolever deleted the fix-default-device-with-mps branch January 18, 2023 09:11
@jph00 jph00 added the bug label Feb 15, 2023
@ypauchard commented Mar 14, 2023

@wolever the new `_has_mps()` function, now in fastai 2.7.11, causes an error on Intel Macs running macOS < 12.3:

RuntimeError: The MPS backend is supported on MacOS 12.3+.Current OS version can be queried using `sw_vers`

I think this is because `torch.backends.mps.is_available()` returns False (the correct value), but `torch.has_mps` is True. In that case `_has_mps()` returns True, the backend is set to `mps`, and using `mps` raises the error above.

I suggest changing `_has_mps()` to return the value of `torch.backends.mps.is_available()` when it exists, and only fall back to `torch.has_mps` when that function is unavailable.

Successfully merging this pull request may close these issues.

"default_device" logic is repeated twice, related to mps / OSX support.