Thanks for the open source code! There is a problem confusing me.
When I run the code with `python main.py`, PyTorch raises this error:
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [512, 38]], which is output 0 of AsStridedBackward0, is at version 2; expected version 1 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
The error is located in solver.py, line 188:
# Minimax strategy
loss1.backward(retain_graph=True)
self.optimizer.step()
loss2.backward()
self.optimizer.step()
When I change the code to
loss1.backward(retain_graph=True)
# self.optimizer.step()
loss2.backward()
self.optimizer.step()
It works!
I wonder whether the first `self.optimizer.step()` should indeed be commented out; if not, how can the error be resolved?
Thank you.
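For anyone hitting the same error: `optimizer.step()` updates the parameters in place, which bumps their autograd version counters, so the graph retained for `loss2.backward()` then references stale tensor versions. A minimal sketch of the accumulate-then-step pattern (the model, shapes, and losses here are placeholders, not the repo's actual code):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(38, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(512, 38)

# Broken pattern: step() modifies the weights in place, so the graph
# retained for loss2.backward() sees parameters at a newer version:
#   loss1.backward(retain_graph=True)
#   optimizer.step()   # in-place update -> version counter incremented
#   loss2.backward()   # RuntimeError: ... modified by an inplace operation

# Working pattern: accumulate both gradients first, then step once.
out = model(x)
loss1 = out.pow(2).mean()   # placeholder loss
loss2 = out.abs().mean()    # placeholder loss
optimizer.zero_grad()
(loss1 + loss2).backward()  # same gradients as two backward() calls before step()
optimizer.step()
```

Commenting out the first `step()` works for the same reason: both `backward()` calls accumulate into `.grad` and the single `step()` applies the summed gradient, i.e. it minimizes `loss1 + loss2`. If the minimax strategy genuinely requires two sequential updates, `loss2` has to be recomputed with a fresh forward pass after the first `step()` instead of reusing the retained graph.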