Make jit compilation optional for function and use nn.Module #314
Conversation
Looks safe to leave it in. See the docs for it: https://pytorch.org/docs/stable/jit.html#torch.jit.ignore.
I don't think so, but let's ask. cc @wanchaol, I remember you (or maybe it was someone else) were looking into no_grad support for the JIT. Does it support it now?
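For reference, `torch.jit.ignore` lets a method stay in eager Python while the rest of the module is scripted, which is one way to keep unsupported constructs out of compilation. A minimal sketch (the module and method names here are made up for illustration):

```python
import torch
from torch import nn

class WithEagerHelper(nn.Module):
    @torch.jit.ignore
    def eager_helper(self, x: torch.Tensor) -> None:
        # Skipped by the TorchScript compiler; runs as ordinary Python,
        # so constructs the JIT does not support can live here.
        pass

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        self.eager_helper(x)
        return x + 1

scripted = torch.jit.script(WithEagerHelper())
```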
Are there tests that check that the behavior of everything is the same in eager mode vs JIT mode?
Yeah I was looking into
Need to add that indeed :)
Change looks safe to me.
Now that we have recursive scripting, it's no longer necessary to `torch.jit.script` every function beforehand, as long as the main entrypoint is scripted. So one could do something like

```python
transforms = T.Compose([...])
if use_script:
    transforms = torch.jit.script(transforms)
```

and this should work.
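As a concrete, runnable sketch of that pattern (`Scale` is a hypothetical minimal transform, not part of the library), recursive scripting compiles the module's `forward` and everything it calls without per-function decorators:

```python
import torch
from torch import nn

class Scale(nn.Module):
    """Hypothetical minimal transform, used only for illustration."""

    def __init__(self, factor: float):
        super().__init__()
        self.factor = factor

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.factor

use_script = True
transform = Scale(2.0)
if use_script:
    # Recursive scripting: forward and any methods it calls are
    # compiled here, with no @torch.jit.script decorators needed.
    transform = torch.jit.script(transform)
```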
JIT scripting has recently become recursive, so we no longer need to decorate every function. Advantages of removing the @torch.jit.script decoration:
We need to add tests for both the jit and non-jit paths. This should have a code generator to either
Things to keep in mind:
Decision from offline discussion: move forward with removing the decorator and adding nn.Module where possible.
The jit tests will need to be updated.
I've added tests checking that the jit version is close to the Python version, and we already have tests checking that the Python version is close to the ground truth. We could decide to also test the jit version against the ground truth, but that would add extra code and resources to run.
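A jit-vs-Python consistency check of that kind can be sketched as follows (`Scale` stands in for a real transform and is purely illustrative):

```python
import torch
from torch import nn

class Scale(nn.Module):
    # Stand-in for a real transform; for illustration only.
    def __init__(self, factor: float):
        super().__init__()
        self.factor = factor

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.factor

eager = Scale(3.0)
scripted = torch.jit.script(Scale(3.0))

x = torch.randn(8)
# The scripted (jit) output should match the eager Python output.
jit_matches_python = torch.allclose(eager(x), scripted(x))
```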
I had to keep one torch.jit.ScriptModule due to python 2 and constants. We'll be able to get rid of that once python 2 is deprecated.
Since we are rolling back torch.jit.ScriptModule and adding nn.Module back, we should explain the reason behind it to our users. It's better to include more details in the text.
I think adding these tests for now is fine, but a more rigorous approach, in my opinion, would be to run all tests once jitted and once not jitted. You could generate tests for that: pytest allows you to parameterize tests, and one of the parameters could be whether the function is to be jitted or not. Of course, that also means the tests will run a lot longer (twice as long, plus the compilation itself).
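A sketch of that parameterization with pytest (`Scale` is a placeholder transform and the test name is made up; each test runs once in eager mode and once jitted):

```python
import pytest
import torch
from torch import nn

class Scale(nn.Module):
    # Placeholder transform, for illustration only.
    def __init__(self, factor: float):
        super().__init__()
        self.factor = factor

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.factor

@pytest.mark.parametrize("jitted", [False, True])
def test_scale_output(jitted):
    m = Scale(2.0)
    if jitted:
        # Same test body, jitted variant.
        m = torch.jit.script(m)
    x = torch.ones(4)
    assert torch.equal(m(x), torch.full((4,), 2.0))
```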
Sounds good. I've updated the description of the PR.
Some functions and modules do not have tests against a known solution, and are just tested for shape or for running successfully. Some are not even tested (e.g. augmentations.py). The main worry with jit is that it compiles something that yields different results than what the original code would give. As such, the consistency test with jit at least ensures that the compilation is done correctly.
Are you suggesting migrating completely to pytest? If the reason is simply test parameterization, unittest can do that too :) It would be nice to remove one of the two frameworks, i.e. keep only pytest or only unittest. Since PyTorch uses unittest, we should probably just stick with that one, though.
If resources are not an issue, I'm ok with having tests re-running with jit. I'd suggest doing that as part of a separate PR. Side note: is there a way with jit to compile a whole python module?
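For completeness, the same jitted/non-jitted parameterization can be done in unittest via `subTest` (a sketch; `Scale` and the test names are made up for illustration):

```python
import unittest
import torch
from torch import nn

class Scale(nn.Module):
    # Placeholder transform, for illustration only.
    def __init__(self, factor: float):
        super().__init__()
        self.factor = factor

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.factor

class TestScale(unittest.TestCase):
    def test_jit_and_eager(self):
        # subTest reports each variant separately on failure.
        for jitted in (False, True):
            with self.subTest(jitted=jitted):
                m = Scale(2.0)
                if jitted:
                    m = torch.jit.script(m)
                self.assertTrue(
                    torch.equal(m(torch.ones(2)), torch.full((2,), 2.0))
                )
```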
I'm not saying this is the best way to parametrize, but it seems fun to do :)
LGTM
Removing the @torch.jit.script decoration lets us use the recursive scripting now available in nn.Module. Since torch.jit.ScriptModule is also being phased out, torchaudio is starting the process of migrating to nn.Module where possible. The deprecation is explained here. Once python 2 is deprecated, we will migrate the remaining code to nn.Module; see here for the currently supported syntax in python 2 and 3.
Additional details
Current changes:
- torch.nn.Module is preferred for transforms and also makes transforms jitable.
Questions:
- torch.no_grad()? see here (Not yet.)
CC #326