Can't train on a PC with no GPU #7

Closed
MCCTT opened this issue Jun 18, 2021 · 1 comment

Comments


MCCTT commented Jun 18, 2021

With device = cpu, training fails because the code calls self.scale.cuda() in encoder_decoder_layers and transformer_tree_model. Since self.scale has already been moved with to_device, the extra ".cuda()" call shouldn't be needed.
To reproduce, try training on a machine with no GPU:
---> cd baseline_model && python run_similarity_check.py
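
For illustration, here is a minimal sketch of how a scale factor can follow the module's device without a hard-coded .cuda() call. The class name ScaledAttention and its shapes are hypothetical, not the repo's actual code:

```python
import torch
import torch.nn as nn

class ScaledAttention(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        # Registering the scale as a buffer lets module.to(device) move it
        # together with the parameters, so no explicit .cuda() is needed.
        self.register_buffer("scale", torch.tensor(d_model ** -0.5))

    def forward(self, q: torch.Tensor, k: torch.Tensor) -> torch.Tensor:
        # self.scale already lives on whatever device the module was moved to.
        return torch.matmul(q, k.transpose(-2, -1)) * self.scale

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
layer = ScaledAttention(d_model=64).to(device)  # works on CPU-only machines too
q = torch.randn(2, 8, 64, device=device)
k = torch.randn(2, 8, 64, device=device)
out = layer(q, k)
```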

@ChengFu0118

For multi-device training, removing cuda() will result in a node mismatch. If you train on CPU, you can simply remove it or check the training device first. Actually, I have never tried to train the model on a CPU-only machine.
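
A minimal sketch of the "check the training device first" workaround suggested above; the helper move_scale and the tensor value are hypothetical, not code from this repository:

```python
import torch

def move_scale(scale: torch.Tensor, device: torch.device) -> torch.Tensor:
    """Guarded version of the hard-coded .cuda() call: only move the tensor
    to the GPU when one is actually available, otherwise keep it on `device`."""
    if device.type == "cuda" and torch.cuda.is_available():
        return scale.cuda()
    return scale.to(device)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
scale = move_scale(torch.tensor(0.125), device)  # works with or without a GPU
```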
