
I encountered a problem while training the model. #4

Closed
DeepAIDD opened this issue Apr 12, 2022 · 1 comment

Comments

@DeepAIDD

File "train.py", line 308, in
main(args)
File "train.py", line 127, in main
loss, acc = model(batch)
File "/root/anaconda3/envs/g2s/lib/python3.6/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "/Z70177/prog/wzp/g2s/models/graph2seq_series_rel.py", line 114, in forward
padded_memory_bank, memory_lengths = self.encode_and_reshape(reaction_batch)
File "/Z70177/prog/wzp/g2s/models/graph2seq_series_rel.py", line 106, in encode_and_reshape
reaction_batch.distances
File "/root/anaconda3/envs/g2s/lib/python3.6/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "/Z70177/prog/wzp/g2s/models/attention_xl.py", line 263, in forward
out = layer(out, mask, distances)
File "/root/anaconda3/envs/g2s/lib/python3.6/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "/Z70177/prog/wzp/g2s/models/attention_xl.py", line 202, in forward
context, _ = self.self_attn(input_norm, mask=mask, distances=distances)
File "/root/anaconda3/envs/g2s/lib/python3.6/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "/Z70177/prog/wzp/g2s/models/attention_xl.py", line 139, in forward
b_d = torch.matmul(query + v, rel_emb_t
RuntimeError: CUDA error: device-side assert triggered
Please help me solve it!
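
A general note for anyone hitting the same crash (not from the original post): the assert is reported asynchronously, so the frame shown in the traceback (the `torch.matmul` in `attention_xl.py`) is usually not where the failure actually originates. Running with `CUDA_LAUNCH_BLOCKING=1`, or replaying one batch on CPU, makes the real error visible. A minimal sketch, assuming the `model(batch)` call from `train.py` and that the batch object supports `.to()`:

```python
# Sketch only: generic PyTorch debugging for "device-side assert triggered".
# Option 1: force synchronous kernel launches so the traceback points at the op
# that actually failed:
#   CUDA_LAUNCH_BLOCKING=1 python train.py <args>
# Option 2: replay one batch on CPU, where the underlying error (often an
# out-of-range index) is raised eagerly with a readable message.
import torch

def replay_batch_on_cpu(model, batch):
    """Run a single batch on CPU to expose the error hidden behind the CUDA assert."""
    model = model.to("cpu")
    batch = batch.to("cpu")   # assumes the batch object implements .to(); adapt otherwise
    loss, acc = model(batch)  # same call as train.py line 127 in the traceback
    return loss, acc
```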

@zhengkaitu
Collaborator

Addressed in #2
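
As general context (not specific to #2 or to this repository): the most common cause of this particular assert is an out-of-range index into an embedding table, e.g. a relative-distance bucket that exceeds the number of embeddings. A generic illustration of the symptom:

```python
import torch
import torch.nn as nn

# Generic illustration: an out-of-bounds lookup into a CUDA embedding fails inside
# the kernel and is later reported as "CUDA error: device-side assert triggered",
# often at an unrelated downstream operation such as a matmul.
emb = nn.Embedding(num_embeddings=10, embedding_dim=8).to("cuda")
bad_idx = torch.tensor([10], device="cuda")  # valid indices are 0..9
out = emb(bad_idx)                           # triggers the device-side assert
```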
