[jit] Trace in-place ops #14254
Conversation
LGTM
There's a bunch of clang-tidy issues in script/init.cpp (https://travis-ci.org/pytorch/pytorch/jobs/457803711), but they don't seem related to your changes.
@suo has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
If I understand correctly, we're keeping the out-of-place mode only for the sake of ONNX export. If so, this should really be a private mode, as no user should use it (it results in strictly worse traces).
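For context, a minimal sketch of the behavioral difference at stake, assuming plain eager PyTorch (the function `f` below is invented for illustration):

```python
import torch

def f(x):
    x.add_(1)  # mutates the caller's tensor
    return x

x = torch.zeros(3)
f(x)
print(x)  # tensor([1., 1., 1.]) -- the in-place update is visible outside f

# A trace that rewrote add_ to the out-of-place add would effectively
# record `return x + 1`, leaving the caller's x at zeros, so replaying
# the trace would silently change program behavior.
```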
@apaszke going to land this if you don't have any objection.
Generally LGTM, although I find the new codegen very confusing (and significantly harder to understand than what we had before).
```cpp
inline void ensureUnique(const char* name, const at::Tensor& tensor) {
  auto& state = getTracingState();
  // ...
```
It might be a good idea to rename this function to ensureUniqueIfOutOfPlaced or something like that. It's not really used outside of auto-generated code, but it would clarify its semantics a bit.
```python
output_names_outplace = [arg['name'] for arg in declaration['arguments'] if arg.get('output', False)]
output_names_inplace = [r['name'] for r in declaration['returns']]

if output_names_outplace == output_names_inplace:
```
When does that happen? A comment would be nice
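For context, a toy sketch of the check in question; the `declaration` dict below is invented, standing in for the parsed declarations that the autograd codegen consumes:

```python
declaration = {
    'arguments': [
        {'name': 'self'},
        {'name': 'result', 'output': True},  # an `out=`-style output argument
    ],
    'returns': [{'name': 'result'}],
}

output_names_outplace = [
    arg['name'] for arg in declaration['arguments'] if arg.get('output', False)
]
output_names_inplace = [r['name'] for r in declaration['returns']]

# For out-variants like this, the output argument names and the return
# names coincide, which is presumably what the branch above selects on.
assert output_names_outplace == output_names_inplace == ['result']
```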
tools/autograd/gen_variable_type.py
```python
def rename(trace_name):
    if trace_name in RENAME_TRACE:
        trace_name = RENAME_TRACE[trace_name]
    return trace_name
```
This is just `RENAME_TRACE.get(trace_name, trace_name)`?
python!
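For reference, the suggested one-liner is behavior-equivalent to the helper above; the `RENAME_TRACE` contents below are assumed for the sake of a runnable example:

```python
RENAME_TRACE = {'zero': 'zeros_like', 'fill': 'full_like'}

def rename(trace_name):
    return RENAME_TRACE.get(trace_name, trace_name)

assert rename('zero') == 'zeros_like'  # mapped names are rewritten
assert rename('add') == 'add'          # unmapped names pass through
```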
I'll try to add some comments explaining the high-level flow. The messiness seems unavoidable if we want to avoid duplication in the generated code, though (due to how many weird semantic quirks there are between aten and the JIT schema 😢). I'll address the rest of your comments on the fbcode side, since I have changes there that I don't want to blow away.
Summary: This PR adds a `try_outplace` option to the tracer. When `try_outplace` is true, the tracer will attempt to emit out-of-place variants of in-place ops (similar to how things are done today). When it's false, the correct in-place op is emitted. I made `try_outplace` false by default, but flipped it to true for the ONNX export utils. @zdevito @jamesr66a, anywhere else I should preserve the existing behavior?

Pull Request resolved: #14254
Reviewed By: eellison
Differential Revision: D13166691
Pulled By: suo
fbshipit-source-id: ce39fdf73ac39811c55100e567466d53108e856b
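A quick sanity check of the new default described above, assuming a PyTorch build that includes this PR (`try_outplace` itself is an internal tracer option, not a public `torch.jit.trace` argument):

```python
import torch

def f(x):
    return x.mul_(2)  # in-place op inside the traced function

traced = torch.jit.trace(f, torch.ones(2))
# With try_outplace=False (the new default), the graph should record the
# in-place aten::mul_ rather than an out-of-place aten::mul.
print(traced.graph)
```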