
[fix] T5 ONNX test: model.to(torch_device) #5769

Merged · 1 commit merged into master from t5_onnx_export_test_gpu · Jul 15, 2020
Conversation

@mfuntowicz (Member) commented on Jul 15, 2020:

Should fix #5724

Signed-off-by: Morgan Funtowicz <morgan@huggingface.co>

Commit: Ensure model and inputs are on the same device when executing T5 ONNX export test.

Signed-off-by: Morgan Funtowicz <morgan@huggingface.co>
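
For context, the failure in #5724 came from the model staying on CPU while the test's inputs were created on `torch_device`. The sketch below is illustrative only and is not the PR's diff; the `t5-small` checkpoint, the tokenizer call, and the plain forward pass are assumptions used to show the pattern of keeping the model and its inputs on the same device.

```python
# Minimal sketch (not the actual test): align model and inputs on one device,
# which is the gist of the model.to(torch_device) fix in this PR.
import torch
from transformers import T5Model, T5Tokenizer

# Mirrors the torch_device helper used in the transformers test suite.
torch_device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5Model.from_pretrained("t5-small").to(torch_device)  # move the model, not just the inputs
model.eval()

# The tokenizer returns CPU tensors; move them to the same device as the model.
inputs = tokenizer("translate English to German: Hello", return_tensors="pt").to(torch_device)

with torch.no_grad():
    # With model and inputs on different devices, a call like this is what raises
    # the cross-device RuntimeError seen in the ONNX export test on GPU.
    outputs = model(input_ids=inputs["input_ids"], decoder_input_ids=inputs["input_ids"])
```
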
@mfuntowicz (Member, Author) commented:

Code Quality is failing because of files that were not modified in this PR; I'm not addressing it here to avoid having to rebase elsewhere.

@sshleifer changed the title from "Ensure model and inputs are on the same device when executing T5 ONNX export test" to "[fix] T5 ONNX test: model.to(torch_device)" on Jul 15, 2020
@sshleifer sshleifer merged commit d533c7e into master Jul 15, 2020
@sshleifer sshleifer deleted the t5_onnx_export_test_gpu branch July 15, 2020 14:11
@sshleifer (Contributor) commented:

I'll fix code quality.

Development

Successfully merging this pull request may close these issues.

T5 ONNX Export Test Failing on GPU
2 participants