
[torch-neuron] Solve NaN issue when using transformers>=4.20 #474

Closed
aws-maens opened this issue Aug 22, 2022 · 1 comment

Comments

@aws-maens (Contributor)

Solve NaN issue when using transformers>=4.20 (#456)

@aws-maens aws-maens created this issue from a note in AWS Neuron roadmap (obsolete) (Working on it) Aug 22, 2022
@aws-maens aws-maens changed the title Solve NaN issue when using transformers>=4.20 (https://github.com/aws/aws-neuron-sdk/issues/456) [torch-neuron] Solve NaN issue when using transformers>=4.20 Sep 26, 2022
@jluntamazon (Contributor)

This should no longer cause problems on Inf1 when using torch-neuron as of the Neuron 2.5.0 release: https://awsdocs-neuron.readthedocs-hosted.com/en/latest/release-notes/prev/rn.html#neuron-2-5-0-11-23-2022

This resolves the behavior seen in #456. The script in the ticket now produces the correct results.
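As a quick way to confirm the fix locally, one could scan the compiled model's outputs for NaNs after recompiling against Neuron >= 2.5.0. The helper below is a hypothetical sketch using only the Python standard library; in practice you would feed it the traced model's outputs (e.g. converted with `.tolist()`), not hard-coded values.

```python
import math

def contains_nan(values):
    """Recursively check a (possibly nested) list of floats for NaN."""
    if isinstance(values, (list, tuple)):
        return any(contains_nan(v) for v in values)
    return isinstance(values, float) and math.isnan(values)

# Example: logits a broken compilation might have produced vs. healthy ones.
print(contains_nan([[0.1, float("nan")], [0.3, 0.4]]))  # True
print(contains_nan([[0.1, 0.2], [0.3, 0.4]]))           # False
```

A healthy recompiled model should yield `False` for all outputs; a `True` result would indicate the NaN regression is still present.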
