
Deprecation warning when using torch 1.8 #90

Closed
egpbos opened this issue Apr 28, 2021 · 1 comment
egpbos commented Apr 28, 2021

While trying to fix warnings, I stumbled upon a new warning that only comes up since PyTorch 1.8:

    UserWarning: Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed in future versions. This hook will be missing some grad_input. Please use register_full_backward_hook to get the documented behavior.

The warning is triggered in the basic and transformer tests:

platalea/basic.py:39: in cost
    speech_enc, image_enc = self.forward(item['audio'], item['audio_len'], item['image'])
platalea/basic.py:34: in forward
    speech_enc = self.SpeechEncoder(audio, audio_len)
../../../sw/miniconda3/envs/platalea/lib/python3.8/site-packages/torch/nn/modules/module.py:914: in _call_impl
    self._maybe_warn_non_full_backward_hook(input, result, grad_fn)

(Note: I added the forward function to SpeechImage as a test; this call to SpeechEncoder is the actual troublemaker, and it was previously in SpeechImage.cost().)

I frankly have no idea what a backward hook is, let alone a non-full one. Anyone have any clue?

My best guess is that it has something to do with the multiple inputs and outputs, and some missing piece regarding autograd somewhere. I searched the other models/experiments (asr, mtl) for hints in this direction, but couldn't really find anything.
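For reference, a backward hook is a callback that PyTorch invokes during backpropagation with the gradients flowing into and out of a module; the "non-full" variant (register_backward_hook) can report an incomplete grad_input when the module's forward builds more than one autograd node, which is exactly what the warning complains about. A minimal sketch of the replacement API, register_full_backward_hook (available since PyTorch 1.8) — the toy two-layer model here is hypothetical, not platalea's SpeechEncoder:

```python
import torch
import torch.nn as nn

# A module whose forward contains multiple autograd nodes (linear + relu),
# the situation the deprecation warning is about.
model = nn.Sequential(nn.Linear(4, 3), nn.ReLU())

captured = {}

def hook(module, grad_input, grad_output):
    # grad_input / grad_output are tuples of gradients w.r.t. the module's
    # inputs and outputs; the "full" hook guarantees grad_input is complete.
    captured["grad_output_shape"] = grad_output[0].shape

# Old, deprecated form: model.register_backward_hook(hook)
handle = model.register_full_backward_hook(hook)

out = model(torch.randn(2, 4, requires_grad=True))
out.sum().backward()
handle.remove()  # detach the hook once done
```

If nothing in platalea registers a backward hook explicitly, the warning more likely originates from a dependency (or from PyTorch's own internals), not from our code.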

egpbos commented May 17, 2021

It seems that the warning is gone now. I'm testing on Torch 1.8.1, so the issue may have been specific to 1.8.0.

@egpbos egpbos closed this as completed May 17, 2021
@egpbos egpbos mentioned this issue May 17, 2021