
Don't use plain forward in python #3

Open
justusschock opened this issue Apr 17, 2019 · 3 comments

Comments

@justusschock

Contrary to what you say in the section

A nn.module can be used on input data in two ways whereas the latter one is commonly used for better readability. self.net(input) simply uses the call() method of the object to feed the input through the module.

output = self.net.forward(input)
# or
output = self.net(input)

it is not recommended to call the plain forward in Python.

If these were equivalent, the __call__ method (which is invoked whenever you call an instance of a class) would look something like this:

def __call__(self, *args, **kwargs):
    return self.forward(*args, **kwargs)

But instead it is slightly more complex:

def __call__(self, *input, **kwargs):
    for hook in self._forward_pre_hooks.values():
        hook(self, input)
    if torch._C._get_tracing_state():
        result = self._slow_forward(*input, **kwargs)
    else:
        result = self.forward(*input, **kwargs)
    for hook in self._forward_hooks.values():
        hook_result = hook(self, input, result)
        if hook_result is not None:
            raise RuntimeError(
                "forward hooks should never return any values, but '{}'"
                "didn't return None".format(hook))
    if len(self._backward_hooks) > 0:
        var = result
        while not isinstance(var, torch.Tensor):
            if isinstance(var, dict):
                var = next((v for v in var.values() if isinstance(v, torch.Tensor)))
            else:
                var = var[0]
        grad_fn = var.grad_fn
        if grad_fn is not None:
            for hook in self._backward_hooks.values():
                wrapper = functools.partial(hook, self)
                functools.update_wrapper(wrapper, hook)
                grad_fn.register_hook(wrapper)
    return result

because it also deals with all the registered hooks (which wouldn't be considered when calling the plain forward).
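To make the difference concrete, here is a torch-free sketch of that dispatch logic (a simplified, hypothetical Module class; tracing and backward hooks are omitted). It shows that a registered forward hook fires when the instance is called, but is silently skipped when forward is called directly:

```python
from collections import OrderedDict


class Module:
    """Minimal sketch of nn.Module's hook dispatch (hypothetical names)."""

    def __init__(self):
        self._forward_pre_hooks = OrderedDict()
        self._forward_hooks = OrderedDict()

    def forward(self, x):
        return x * 2

    def __call__(self, *input, **kwargs):
        # pre-hooks run before forward
        for hook in self._forward_pre_hooks.values():
            hook(self, input)
        result = self.forward(*input, **kwargs)
        # forward hooks run after forward and must return None
        for hook in self._forward_hooks.values():
            hook_result = hook(self, input, result)
            if hook_result is not None:
                raise RuntimeError("forward hooks should never return any values")
        return result


calls = []
m = Module()
m._forward_hooks["log"] = lambda mod, inp, out: calls.append(out)

m(3)          # goes through __call__, so the hook fires
m.forward(3)  # bypasses __call__, so the hook is skipped
print(calls)  # [6] -- one entry despite two forward passes
```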

@IgorSusmelj
Owner

Thanks,

I didn't know that. I added the changes. Do you know whether it only affects the hooks, or whether it has an impact on the forward/backward pass?

@justusschock
Author

It does not have an impact on the forward/backward passes in general, but some more complex models use these hooks as part of their logic, because their behavior is highly dynamic.
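A common example of hooks as model logic is feature extraction: a forward hook on an intermediate layer captures its activation during the pass. The sketch below illustrates the pattern with stand-in classes rather than real torch modules (all names are hypothetical):

```python
from collections import OrderedDict


class Layer:
    """Stand-in for an nn.Module layer with forward-hook support (sketch)."""

    def __init__(self, scale):
        self.scale = scale
        self._forward_hooks = OrderedDict()

    def forward(self, x):
        return x * self.scale

    def __call__(self, x):
        result = self.forward(x)
        for hook in self._forward_hooks.values():
            hook(self, x, result)
        return result


# Two-layer "model"; a hook on the first layer grabs its intermediate output.
features = {}
l1, l2 = Layer(2), Layer(5)
l1._forward_hooks["grab"] = lambda mod, inp, out: features.setdefault("l1", out)

out = l2(l1(3))           # l1(3) -> 6 (hook stores it), l2(6) -> 30
print(out, features)      # 30 {'l1': 6}
```

Had the model called l1.forward(3) instead of l1(3), the hook would never run and the feature would be lost, which is exactly why calling the plain forward is discouraged.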

@IgorSusmelj
Owner

Thanks for the explanation!

@IgorSusmelj IgorSusmelj reopened this Apr 30, 2019
IgorSusmelj added a commit that referenced this issue Jan 11, 2021