Torch >= 2.2.0 inference issues on MPS #458
Comments

davmacario changed the title from "Torch >= 2.0.0 inference issues on MPS" to "Torch >= 2.2.0 inference issues on MPS" on Mar 20, 2024.
davmacario added a commit to davmacario/MDI-LLM that referenced this issue on Mar 22, 2024.
I had the same issue on an M1 Pro MacBook with Torch 2.2.0.
I encountered a similar issue as well. I reproduced a Transformer on …
Alright, thanks @sun1638650145, I'll give it a try and possibly update the issue!
When running inference with device = 'mps' set, on my M1 Pro MacBook (macOS 14.4) with Torch 2.2.0 and 2.2.1, the output consists only of the character !, which corresponds to token 0, meaning the model generates nothing but 0s at the output. This does not happen when using Torch 2.1.x.
I know this is probably a Torch bug, but I could use help trying to pinpoint the actual cause (and possibly submit a bug report to Torch).
Let me know if anyone else has had the same issue.
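Until the root cause is isolated, one stopgap is to fall back to CPU whenever the installed Torch version is in the affected range (>= 2.2.0). The sketch below is not from this thread; `parse_version` and `pick_device` are hypothetical helpers, and the version parsing is done by hand so the guard has no dependency on torch being importable:

```python
def parse_version(v: str) -> tuple:
    """Turn a version string like '2.2.1' into (2, 2, 1).

    Local/build suffixes such as '+cpu' or '+cu118' are ignored.
    """
    core = v.split("+")[0]
    return tuple(int(p) for p in core.split(".")[:3])


def pick_device(requested: str, torch_version: str) -> str:
    """Return 'cpu' instead of 'mps' on Torch >= 2.2.0, where generation breaks."""
    if requested == "mps" and parse_version(torch_version) >= (2, 2, 0):
        return "cpu"
    return requested
```

In a sampling script this would be used as `device = pick_device('mps', torch.__version__)`, so machines on Torch 2.1.x keep the MPS speedup while affected versions silently fall back to a correct (if slower) backend.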