
Again Device issue #30

Closed
kmr2017 opened this issue May 5, 2022 · 3 comments

kmr2017 commented May 5, 2022

I am trying to run the code, but I face a problem.

When I execute the line below:
output = docformer(v_bar, t_bar, v_bar_s, t_bar_s) # shape (1, 512, 768)

I get this error:

/usr/local/lib/python3.7/dist-packages/torch/functional.py in einsum(*args)
328 return einsum(equation, *_operands)
329
--> 330 return _VF.einsum(equation, operands) # type: ignore[attr-defined]
331
332 # Wrapper around _histogramdd and _histogramdd_bin_edges needed due to (Tensor, Tensor[]) return type
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu! (when checking argument for argument mat2 in method wrapper_bmm)

This time, the issue is in wrapper_bmm.
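The traceback above comes from a matmul between a CUDA tensor and a CPU tensor inside an einsum. A minimal sketch (not DocFormer itself, just an illustration of the same failure mode) and the general fix, moving every operand to the model's device first:

```python
# Sketch of the device mismatch reported above: torch.bmm (which also backs
# many einsum contractions) fails when its operands live on different devices.
# The general fix is to align devices explicitly before the call.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(1, 4, 8, device=device)  # e.g. activations already on the GPU
b = torch.randn(1, 8, 8)                 # e.g. a weight/buffer left on the CPU

b = b.to(a.device)        # align devices before any bmm/einsum
out = torch.bmm(a, b)     # now succeeds on either CPU or GPU
print(out.shape)          # torch.Size([1, 4, 8])
```

On a CPU-only machine both tensors end up on `cpu` and the snippet runs either way; on CUDA, omitting the `.to(a.device)` reproduces the same RuntimeError.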

@kmr2017 kmr2017 changed the title Again Device issues Again Device issue May 5, 2022
uakarsh (Collaborator) commented May 5, 2022

I think this line may be responsible for the issue:

self.scale = torch.sqrt(torch.FloatTensor([embed_dim]))

The lines below it also have a problem with .to(device), especially when the device is cuda. I will modify it shortly and let you know if the problem still persists. If possible, can you try changing the line above from:
self.scale = torch.sqrt(torch.FloatTensor([embed_dim])) to self.scale = embed_dim**0.5, remove all the .to(device) parts in the set of lines below, and let us know?

I will try it from my end as well and let you know. Thanks for pointing this issue out.

This might not be the cause, and it could be something else, but I will let you know soon. Also, can you share the whole relevant part of the code? The single line mentioned above is not enough for me to recreate the bug.

uakarsh (Collaborator) commented May 17, 2022

Can you let me know whether the issue has been resolved? If not, can you help me reproduce the error on Google Colab? Then I can definitely try to solve the bug and update this repo accordingly.

Regards,
Akarsh

uakarsh referenced this issue in uakarsh/docformer Jun 2, 2022
uakarsh (Collaborator) commented Jun 2, 2022

Hi @kmr2017, sorry for the late reply, but I faced this issue just now, and I think I have managed to solve it. You can clone https://github.com/uakarsh/docformer, and that should do the trick (as far as I know). Do let me know whether that solves the issue. I will include the update in the main branch shortly as well.

Regards,

@uakarsh uakarsh self-assigned this Jun 2, 2022
@uakarsh uakarsh closed this as completed Jun 21, 2022