BUG: Cannot convert a MPS Tensor to float64 dtype
on Apple M1 Max
#700
Comments
Copying my reply from the forum:
(I first thought it was because we were loading spectrograms as float64, but in fact we apply a transform to make them float32)
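For context, the transform mentioned here could look something like the sketch below (the name `to_float32` is hypothetical, not vak's actual API): downcasting float64 spectrogram arrays to float32, since torch's 'mps' backend has no float64 support.

```python
import numpy as np

def to_float32(spect: np.ndarray) -> np.ndarray:
    """Hypothetical transform: downcast float64 spectrograms to float32.

    The 'mps' accelerator cannot hold float64 tensors, so any array that
    will become a torch tensor on that device must be float32 (or smaller).
    """
    if spect.dtype == np.float64:
        return spect.astype(np.float32)
    return spect
```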
I will raise a separate issue stating that we need all metrics to return tensors, apply the fix I have in progress on that branch, and then release a new version.
Makes functions in `vak.transforms.distance.functional` return tensors, so that we don't cause errors when Lightning tries to convert from numpy to tensors in order to log. Letting Lightning do the conversion mostly works, but it can cause a fatal error for someone using an Apple M1 with 'mps' as the accelerator; see https://forum.vocalpy.org/t/vak-tweetynet-with-an-apple-m1-max/78/4?u=nicholdav

I can't find any explicit statement in either the Lightning or Torchmetrics docs that metrics should always be tensors, or that this guarantees there won't be weird issues (right now we get a warning on start-up that all logged scalars should be float32, although I would expect one should be able to log integers too). But from various issues I have read, it seems that should be the case (e.g. Lightning-AI/pytorch-lightning#2143), and I notice that torchmetrics classes tend to do things like convert to a float tensor.
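A minimal sketch of the kind of change described, assuming a Levenshtein-style edit distance like the ones in `vak.transforms.distance.functional` (the function name and signature here are illustrative, not the library's actual API): return the metric as a float32 torch tensor instead of a numpy scalar, so Lightning never has to convert a float64 numpy value itself, which is what blows up on 'mps'.

```python
import torch

def levenshtein(source: str, target: str) -> torch.Tensor:
    """Edit distance between two sequences, returned as a float32 tensor.

    Returning a tensor (rather than a numpy scalar) means Lightning does
    not have to do the numpy -> tensor conversion when logging, which by
    default produces float64 -- a dtype the 'mps' backend cannot handle.
    """
    # standard dynamic-programming edit distance, one row at a time
    prev = list(range(len(target) + 1))
    for i, s in enumerate(source, start=1):
        curr = [i]
        for j, t in enumerate(target, start=1):
            curr.append(min(
                prev[j] + 1,          # deletion
                curr[j - 1] + 1,      # insertion
                prev[j - 1] + (s != t),  # substitution (0 if match)
            ))
        prev = curr
    # float32 matches Lightning's "logged scalars should be float32" warning
    return torch.tensor(prev[-1], dtype=torch.float32)
```

This mirrors what torchmetrics classes tend to do internally: compute with whatever types are convenient, then hand back a float tensor.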
@all-contributors please add @VenetianRed for bug
I've put up a pull request to add @VenetianRed! 🎉
Before submitting a bug, please make sure the issue hasn't already been addressed by searching through the past issues.
Describe the bug
This is a bug reported by @VenetianRed in the vocalpy forum here:
https://forum.vocalpy.org/t/vak-tweetynet-with-an-apple-m1-max/78
Environment file attached:
condaList.txt