
Transformer/Transformer(Greedy_decoder)-Torch.py on gpu #35

Open
kangkang61 opened this issue Aug 9, 2019 · 2 comments

Comments

@kangkang61

Hello, I want to run the Transformer(Greedy_decoder)-Torch.py code on the GPU. I use model = model.to(device) and move the input data with .to(device) as well, but I still get the error "Expected object of backend CUDA but backend CPU for argument #2 'mat2'".
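
This error usually means that some tensor involved in a matrix multiplication is still on the CPU even though the model and the batch were moved to the GPU. In code like this tutorial's, the likely culprits are tensors created inside the model itself, such as a positional-encoding table or an attention mask. Below is a minimal, self-contained sketch of the device-handling pattern that avoids it; the class and variable names are hypothetical, not the repository's code:

```python
import torch
import torch.nn as nn

class TinyDecoder(nn.Module):
    def __init__(self, d_model=8, seq_len=5):
        super().__init__()
        self.proj = nn.Linear(d_model, d_model)
        # Register the fixed table as a buffer so model.to(device) moves it too.
        # Keeping it as a plain attribute (self.pos_table = torch.randn(...)) would
        # leave it on the CPU and trigger the CUDA/CPU mismatch inside matmul.
        self.register_buffer('pos_table', torch.randn(seq_len, d_model))

    def forward(self, x):
        # Tensors built on the fly should follow the input's device.
        scale = torch.tensor(1.0, device=x.device)
        return torch.matmul(self.proj(x) * scale, self.pos_table.t())

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = TinyDecoder().to(device)          # moves parameters *and* registered buffers
x = torch.randn(2, 5, 8, device=device)   # every input tensor must be moved as well
print(model(x).shape, model(x).device)
```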

@zhangbo2008

I fixed the code in my clone: https://github.com/zhangbo2008/nlp-tutorial/blob/master/1gpu.py. It runs with PyTorch 1.4.

@zhangbo2008

I am learning the Transformer too. I still wonder why we use the target sentence during training?
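
For reference, the target sentence appears during training because of teacher forcing: the decoder is fed the ground-truth target shifted right by one position and learns to predict the next token at every step, whereas at inference the greedy decoder feeds back its own previous predictions. A small sketch with made-up token ids, not the tutorial's actual vocabulary:

```python
import torch

# Suppose the target sentence is "<S> i like beer </S>" with ids [1, 4, 5, 6, 2].
target = torch.tensor([1, 4, 5, 6, 2])

dec_inputs = target[:-1]   # [<S>, i, like, beer]   fed to the decoder (teacher forcing)
dec_labels = target[1:]    # [i, like, beer, </S>]  what it must predict at each position

print(dec_inputs.tolist(), dec_labels.tolist())
```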
