Taught tracer to support concrete, intermediate tensors created within tracee #22100
Conversation
…n tracee.
These tensors become additional input tensors to feed into the generated trace
graph function. For example, if the tracee is:
```swift
func foo(x: TensorPair) -> Tensor {
  let y = Tensor<Float>(1.0)
return x.first + x.second + y
}
```
Then the generated trace graph function has 3 input tensors: `x.first`,
`x.second`, and `y`, where `y` is the additional input.
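
To make the mechanism concrete, here is a minimal pure-Swift sketch (an illustrative analogue, not the tracer API; the `TracedFn` alias and the closure are assumptions): the constant `y` is evaluated once at trace-creation time and then fed to the generated function as an extra argument at execution time.

```swift
// Illustrative analogue only: the closure stands in for the generated trace
// graph function, which in reality is a TensorFlow graph function.
typealias TracedFn = (Float, Float, Float) -> Float  // (x.first, x.second, y) -> result

// The trace body no longer bakes in y; it takes y as a third input.
let traced: TracedFn = { first, second, y in first + second + y }

// y is computed once, when the trace is created...
let y: Float = 1.0
// ...and fed as an additional input each time the trace is executed.
let result = traced(2.0, 3.0, y)  // 6.0
```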
@swift-ci please test tensorflow

Also cc @jekbradbury as FYI

Did you mean …

Good catch. :) Fixed.
```swift
/// The number of additional input tensors to the trace graph function,
/// created from concrete intermediate tensors in the tracee, such as `y` in
/// the code snippet above.
var additionalInputTensorCount: Int32 = -1
```
Is there a reason why the initial value is -1?
This increases the chance of catching bugs if we forget to set it to some legal value, since -1 is never a valid count.
I see where you are coming from; however, there's a safer way to do that in Swift. Since you expect `additionalInputTensorCount` to always be set during class initialization, removing the default value lets you catch initialization bugs at compile time: if you forget to set it in `init`, Swift gives a compile-time error saying the member is not initialized, which is significantly better than catching this at runtime.
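
For illustration, a minimal sketch of the two approaches (the type and initializer below are assumptions, not the PR's actual declarations):

```swift
// Approach in the PR: sentinel default; a forgotten assignment is only
// detectable at runtime (e.g. via an assertion that the count is >= 0).
struct TraceStateWithSentinel {
    var additionalInputTensorCount: Int32 = -1
}

// Suggested alternative: no default value. Every initializer must assign the
// property, so a forgotten assignment becomes the compile-time error
// "Return from initializer without initializing all stored properties".
struct TraceStateChecked {
    var additionalInputTensorCount: Int32

    init(additionalInputTensorCount: Int32) {
        self.additionalInputTensorCount = additionalInputTensorCount
    }
}
```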
```swift
// x) is symbolic. The second one for y is concrete, and is computed at
// trace creation time, not trace execution time.
// Also see the comment block above finalizeAndExecuteTraceFn().
for (i, output) in outputs.enumerated() {
```
Suggested change:
```swift
for (i, output) in outputs.enumerated() where TFE_TensorHandleIsConcrete(output) == 0 {
```
Nice! Done.
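
As a side note on the suggested pattern: a `where` clause on a `for`-`in` loop filters elements, while `enumerated()` still yields the original indices. A toy example with plain booleans standing in for the concreteness check:

```swift
// true ~ "concrete", false ~ "symbolic"; stands in for TFE_TensorHandleIsConcrete(...)
let isConcreteFlags = [false, true, false]
for (i, isConcrete) in isConcreteFlags.enumerated() where !isConcrete {
    print("symbolic output at original index \(i)")
}
// Prints indices 0 and 2 only; index 1 is skipped but the numbering is preserved.
```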
```swift
// trace creation time, not trace execution time.
// Also see the comment block above finalizeAndExecuteTraceFn().
for (i, output) in outputs.enumerated() {
  if TFE_TensorHandleIsConcrete(output) != 0 {
```
The `TF_GraphToFunction` call below sets the number of outputs to `symbolicOutputs.count`. As far as I understand, `symbolicOutputs.count <= outputs.count` after this PR. Should `TF_GraphToFunction` use `outputs.count` for the number of outputs?
I think I understand it now. Basically, the concrete outputs are captured and returned by the `_graph` function and don't have to be part of the TFFunction outputs.
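
A hypothetical pure-Swift sketch of that split (the `isConcrete` flag and the names below are assumptions standing in for `TFE_TensorHandleIsConcrete` and the real handle types): only the symbolic outputs become outputs of the generated graph function, which is why passing `symbolicOutputs.count` to `TF_GraphToFunction` is consistent.

```swift
// Hypothetical analogue of partitioning the tracee outputs.
struct TraceeOutput {
    let isConcrete: Bool  // stands in for TFE_TensorHandleIsConcrete(handle) != 0
}

let outputs = [TraceeOutput(isConcrete: false), TraceeOutput(isConcrete: true)]

// Only these are wired up as outputs of the generated graph function.
let symbolicOutputs = outputs.filter { !$0.isConcrete }

// These were already materialized at trace-creation time; they are returned
// directly rather than routed through the graph function's outputs.
let concreteOutputs = outputs.filter { $0.isConcrete }
```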
```swift
// For example, let the tracee be:
//   func foo(x: Tensor) -> (Tensor, Tensor) {
//     let y = Tensor<Float>(1.0)
//     return (x + x, y)
```
The test added here does not exercise this scenario?
Discussed in person, and TracerTests.testAllBackends("Basic_IntermediateTensors") should cover this case.
@swift-ci please test tensorflow
```swift
    cTraceContext)
for i in 0..<additionalInputTensorCount {
  symbolicInputs.append(TFE_GetInputGraphNodeFromTraceContext(
    cTraceContext, UInt32(i)))
```
Suggested change (whitespace only): adjust the indentation of the `cTraceContext, UInt32(i)))` continuation line.
`cTraceContext` is indented two spaces past the previous line's `TFE_GetInputGraphNodeFromTraceContext(` -- I believe that's the correct indentation?
Oh, I misread.
```swift
debugLog("Running tracee in tracing mode.")
// The tracee output can contain a mixture of symbolic and concrete tensors
// (see the comment block within TraceContext.finalize()).
debugLog("Running tracee in tracing mode.")
```
Suggested change: keep only one of the duplicated `debugLog("Running tracee in tracing mode.")` calls.
Good catch, thanks!
Nit:
```swift
func foo(x: TensorPair<Float, Float>) -> Tensor<Float> {
  let y = Tensor<Float>(1.0)
  return x.first + x.second + y
}
```
@rxwei: On …

Ah, I misread the diff. I thought it was in …

Thanks all for the review! Submitting this patch. Happy to address any follow-up comments.