
[jit] Support for modules that have hooks when compiling #17571

Closed
rami3e opened this issue Feb 28, 2019 · 11 comments
Labels: low priority (We're unlikely to get around to doing this in the near future), oncall: jit (Add this issue/PR to JIT oncall triage queue)

Comments

@rami3e commented Feb 28, 2019

🚀 Feature

Currently, forward and backward hooks (e.g. for concatenation of features in a U-Net) are not supported in JIT tracing.

I run into this error when attempting it: ValueError: Modules that have hooks assigned can't be compiled

source: torch.jit.TracedModule(ScriptModule)
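A minimal repro of the check (a sketch; the no-op hook here stands in for the real feature-capturing hooks):

import torch
import torch.nn as nn

m = nn.Linear(4, 4)
m.register_forward_hook(lambda mod, inp, out: None)  # any registered hook trips the check
torch.jit.trace(m, torch.rand(2, 4))
# ValueError: Modules that have hooks assigned can't be compiled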

@pytorchbot added the "oncall: jit" label Feb 28, 2019
@driazati self-assigned this Feb 28, 2019
@driazati removed their assignment Feb 28, 2019
@suo self-assigned this Mar 11, 2019
@suo (Member) commented Mar 11, 2019

Hi, could you explain a little bit more about your use case? For example, would it be possible as a workaround to inline the hook into the forward() of your model?
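For reference, a minimal sketch of that workaround (hypothetical SkipBlock module; instead of a forward hook stashing the activation, forward() computes and returns it explicitly):

import torch
import torch.nn as nn

class SkipBlock(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Conv2d(3, 16, 3, padding=1)
        self.decoder = nn.Conv2d(16, 3, 3, padding=1)

    def forward(self, x):
        feats = self.encoder(x)             # formerly captured by a forward hook
        return self.decoder(feats), feats   # pass the activation along explicitly

traced = torch.jit.trace(SkipBlock(), torch.rand(1, 3, 8, 8))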

@suo removed the "jit-triaged" label Mar 11, 2019
@suo removed their assignment Mar 13, 2019
@suo added the "low priority" label Mar 13, 2019
@rami3e (Author) commented Mar 15, 2019

Sure. It is similar to U-Net-style skip connections, but I also pass these features through another conv layer. It is actually in forward() currently, i.e.:

def forward(self, input, enc_feats):
    # ...
    # a conv layer (e.g. an nn.Conv2d defined in __init__) applied to the encoder features
    y2 = torch.cat((y0, self.conv(enc_feats)), dim=1)

@rami3e (Author) commented May 23, 2019

Has this been addressed elsewhere?

@eugeneware

@suo It's for getting U-nets to work where hooks are used to store activations from the forward pass. See code here for the model that many people in the fast.ai course are trying to compile.

It turns out that if you just comment out that error the compilation works.

So it would be great if there was some option that we could pass to bypass that exception, and perhaps just get a warning.

You can follow some of the discussion in this forum thread here for some more context.

Thanks for your great work!
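For readers following along, the hook pattern being discussed looks roughly like this (a sketch with hypothetical names, not the actual fast.ai code):

import torch
import torch.nn as nn

stored = {}

def save_activation(module, inp, out):
    stored['feat'] = out  # stash the encoder output for a later skip connection

layer = nn.Conv2d(3, 16, 3, padding=1)
layer.register_forward_hook(save_activation)

y = layer(torch.rand(1, 3, 8, 8))
assert stored['feat'] is y  # the hook captured the activation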

@eugeneware

FYI this issue is related to this issue also.

@suo (Member) commented Jun 12, 2019

@zdevito do you remember why we explicitly disallow hooks?

@zdevito (Contributor) commented Jun 12, 2019

I can't remember. We should revisit it and see if we can just enable it.

@pshashk commented Jul 19, 2019

nn.utils.spectral_norm is also implemented via a hook:

import torch
import torch.nn as nn

torch.jit.trace(nn.utils.spectral_norm(nn.Linear(8, 8)), torch.rand(4, 8))
# ValueError: Modules that have hooks assigned can't be compiled
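One possible workaround, assuming the reparametrization isn't needed at inference time, is to remove the hook before tracing, which bakes the currently computed weight into the module:

import torch
import torch.nn as nn

m = nn.utils.spectral_norm(nn.Linear(8, 8))
# ... train with spectral norm active ...
nn.utils.remove_spectral_norm(m)  # drops the hook, fixes `weight` at its current value
traced = torch.jit.trace(m, torch.rand(4, 8))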

@fuzic (Contributor) commented Jul 20, 2019

weight_norm is also implemented via a hook. Are models using these just unexportable?

https://pytorch.org/docs/stable/_modules/torch/nn/utils/weight_norm.html
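The same removal trick applies here too (a sketch, assuming the reparametrization can be folded away before export):

import torch
import torch.nn as nn

m = nn.utils.weight_norm(nn.Linear(8, 8))
nn.utils.remove_weight_norm(m)  # drops the pre-forward hook, restores a plain `weight`
traced = torch.jit.trace(m, torch.rand(4, 8))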

@eellison (Contributor)

@zdevito @suo any thoughts?

@eugeneware

Thank you @eellison! 🎉 👏
