Expected all tensors to be on the same device, but found at least two devices... #13117
-
@hensel-f Could you provide your complete script and the error so that I can see where exactly it is being raised? From a quick skim of your partial script, it looks like you're calling …
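For anyone running into the same thing: this error almost always means a tensor was created inside the module (for example in a custom loss function) without a device, so it lives on the CPU while the model sits on the GPU. Below is a minimal sketch of the usual Lightning-friendly fixes; the names (`DeviceSafeExample`, `custom_loss_term`, `y_hat`) are made up for illustration, not taken from the original script.

```python
import torch
import pytorch_lightning as pl


class DeviceSafeExample(pl.LightningModule):
    """Illustrative only: three ways to keep tensors on the module's device."""

    def __init__(self):
        super().__init__()
        # Option 1: register constants as buffers in __init__ so Lightning
        # moves them to the GPU together with the model parameters.
        self.register_buffer("scale", torch.tensor(0.1))

    def custom_loss_term(self, y_hat):
        # Option 2: create new tensors directly on the module's device.
        bias = torch.tensor(0.5, device=self.device)
        # Option 3: inherit device and dtype from a tensor that is already
        # in the right place, e.g. the network output.
        offset = torch.full((1,), 0.25).type_as(y_hat)
        return self.scale * y_hat.pow(2).mean() + bias + offset.sum()
```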
-
I'm seeing a similar error and would love to know the resolution.
-
I would love to see better documentation on the Lightning website.
-
Hi, I'm very new to PyTorch and PyTorch Lightning, and I'm uncertain how different devices (e.g. CPU/GPU) need to be handled. I want to train a network such that, after a fixed number of epochs, an additional term (defined by a custom function) is added to the loss. However, as soon as training reaches the first epoch where the custom loss term is applied, I get the error from the title (`Expected all tensors to be on the same device, but found at least two devices...`).
What am I doing wrong, and how can I fix this issue?
Any help or insight would be very much appreciated; thanks in advance!
If you have any clarifying questions, please let me know.
Here is the model definition (except for the custom loss functions):
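(Illustrative sketch only, not the original code: it assumes a small regression MLP, a hypothetical `warmup_epochs` hyperparameter controlling when the extra term kicks in, and a `custom_loss_term` built purely from the network output so it always stays on the same device as the model.)

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl


class LitRegressor(pl.LightningModule):
    def __init__(self, in_dim=10, hidden_dim=32, warmup_epochs=5, lr=1e-3):
        super().__init__()
        self.save_hyperparameters()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(in_dim, hidden_dim),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden_dim, 1),
        )

    def forward(self, x):
        return self.net(x)

    def custom_loss_term(self, y_hat):
        # Built entirely from y_hat, so it is always on the same device.
        return 0.1 * y_hat.pow(2).mean()

    def training_step(self, batch, batch_idx):
        x, y = batch
        y_hat = self(x)
        loss = F.mse_loss(y_hat, y)
        # After a fixed number of epochs, add the extra custom term.
        if self.current_epoch >= self.hparams.warmup_epochs:
            loss = loss + self.custom_loss_term(y_hat)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)
```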
The following part is used for loading the data and training the model:
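(Again, an illustrative sketch rather than the original code, reusing the `LitRegressor` above: random tensors stand in for the real dataset and a single-GPU Trainer is assumed. Lightning moves the model and each batch onto the GPU automatically, but not tensors created by hand inside the module, which is where the device mismatch usually comes from.)

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

# Toy data standing in for the real dataset.
x = torch.randn(1000, 10)
y = torch.randn(1000, 1)
train_loader = DataLoader(TensorDataset(x, y), batch_size=64, shuffle=True)

model = LitRegressor(in_dim=10, warmup_epochs=5)

# Single-GPU training; older Lightning versions use `gpus=1` instead.
trainer = pl.Trainer(max_epochs=20, accelerator="gpu", devices=1)
trainer.fit(model, train_dataloaders=train_loader)
```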