Have the ability to disable `__torch_function__` dispatch for `torch.nn.functional` functions
#55440
Labels
module: nn
Related to torch.nn
module: __torch_function__
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
`torch.nn.Module` instances and `torch.*` namespace ops represent a pretty good split between high-level (stateful) blocks and low-level (stateless) operations. `torch.nn.functional` calls seem to just get in the way in the vast majority of cases that I've seen. The fact that these participate in `__torch_function__` makes for a weird middle layer of abstraction that cannot be easily skipped. I can't even figure out how to skip them manually in a `__torch_function__` handler: we can't redispatch into the functional with a layer of tensor-like objects stripped, because then we wouldn't get dispatch for the calls inside the functional (which is what we actually want: dispatch on the inner calls, just not on the functional itself). Can we have some way to turn off dispatching to `__torch_function__` for `torch.nn.functional` calls?

(Additionally, there's a separate issue where `torch.nn.functional` calls have this icky TorchScript `boolean_dispatch` thing.)

cc @hameerabbasi @rgommers @peterbell10 @albanD @mruberry @jbschlosser
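For concreteness, here's roughly what I mean by these calls participating in dispatch. This is a minimal sketch; `LoggingTensor` is a made-up subclass just for illustration:

```python
# A minimal sketch (LoggingTensor is hypothetical, for illustration)
# showing that torch.nn.functional calls participate in
# __torch_function__ dispatch just like low-level torch.* ops do.
import torch
import torch.nn.functional as F

class LoggingTensor(torch.Tensor):
    seen = []  # names of functions that hit __torch_function__

    @classmethod
    def __torch_function__(cls, func, types, args=(), kwargs=None):
        cls.seen.append(func.__name__)
        # Fall back to the default implementation to actually run func.
        return super().__torch_function__(func, types, args, kwargs or {})

x = torch.randn(2, 3).as_subclass(LoggingTensor)
w = torch.randn(4, 3).as_subclass(LoggingTensor)
F.linear(x, w)
print(LoggingTensor.seen)  # the functional itself ('linear') shows up
```

There's no knob to keep the subclass behavior for the low-level ops while making `F.linear` itself transparent; the only context manager I know of (`torch._C.DisableTorchFunction`) turns off dispatch for everything.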