[AutoDiff] Fix `@differentiable` attribute SILGen and serialization. #21837

Merged

dan-zheng merged 12 commits into apple:tensorflow from dan-zheng:fix-differentiable-attr-silgen on Jan 16, 2019
Conversation
- Propagate `@differentiable` attributes in `SILFunctionBuilder::addFunctionAttributes`.
  - Previously, this logic was missing, causing SIL `[differentiable]` attributes to be missing during SILGen.
  - This unblocks using VJP definitions in the stdlib (with some extra work).
- Serialize `AutoDiffParameterIndices` for `DifferentiableAttr` and use it as the primary source of differentiation parameter info (instead of `ParsedAutoDiffParameter`).
- Fix `AutoDiff` tests.
- There are two `TensorFlowRuntime` test regressions in GPE mode.
  - Both occur because the used device set includes the RUNTIME device in `GraphFunctionDeviceInfo::finalizeUsedDevices()`.
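The first bullet can be sketched with a toy example (function names are hypothetical; assumes the tensorflow-branch toolchain, where the `@differentiable(vjp:)` registration syntax is available):

```swift
// Before this patch, the `@differentiable` AST attribute below was not
// propagated during SILGen, so the lowered SIL function for `square`
// carried no `[differentiable]` attribute and the registered VJP was
// invisible to the differentiation pass.
@differentiable(vjp: squareVJP)
func square(_ x: Float) -> Float {
  return x * x
}

// VJP: returns the original result plus a pullback closure.
func squareVJP(_ x: Float) -> (Float, (Float) -> Float) {
  return (x * x, { v in 2 * x * v })
}
```

With the fix, `SILFunctionBuilder::addFunctionAttributes` lowers the AST attribute to a SIL `[differentiable]` attribute, which is what unblocks VJP definitions in the stdlib.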
rxwei approved these changes on Jan 14, 2019
```diff
@@ -7,7 +7,7 @@ public struct Foo : Differentiable {
 }

 // CHECK-AST-LABEL: public struct Foo : Differentiable {
-// CHECK-AST: @sil_stored @differentiable()
+// CHECK-AST: @sil_stored @differentiable(wrt: (self))
```
I love that things have a canonical printout now when you use checked parameter indices.
dan-zheng force-pushed the fix-differentiable-attr-silgen branch 2 times, most recently from c17b517 to c765cc6 on January 15, 2019 01:42
dan-zheng force-pushed the fix-differentiable-attr-silgen branch from c765cc6 to 30714d8 on January 15, 2019 01:54
@swift-ci Please test tensorflow

@swift-ci Please clean test tensorflow macOS

@swift-ci Please test tensorflow Linux GPU
dan-zheng force-pushed the fix-differentiable-attr-silgen branch 8 times, most recently from 189593d to 7f9effe on January 16, 2019 02:37
rxwei approved these changes on Jan 16, 2019
- Propagate `@differentiable` attributes in `SILFunctionBuilder::addFunctionAttributes`.
  - Previously, this code was missing, causing `@differentiable` AST attributes to not propagate during SILGen.
- Rework differentiation to never load SIL functions.
  - Previously, differentiation relied on explicit loading to get SIL `[differentiable]` attributes. That was a hack.
  - Now that the `@differentiable` attribute is propagated and serialized, such loading is no longer necessary.
- Tighten the differentiation infrastructure to uphold the following invariants:
  - Differentiable functions' primal/adjoint are visible only in the defining module.
  - Differentiable functions' JVP/VJP are always visible to other modules. The differentiation pass guarantees that JVP/VJPs exist.

Todos:
- Replace adjoint definitions in the stdlib with VJP definitions.
- Eliminate primal/adjoint from the `@differentiable` attribute. Use JVP/VJPs everywhere.
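The visibility invariant above can be sketched roughly as follows (SIL syntax approximated from the tensorflow branch at the time; function names are hypothetical):

```swift
// Defining module only: primal/adjoint are internal implementation details.
//   sil hidden @square__primal : $@convention(thin) (Float) -> (...)
//   sil hidden @square__adjoint : $@convention(thin) (Float, ...) -> Float

// Always visible to other modules; the differentiation pass guarantees
// these exist, so clients never need to load the defining module's SIL:
//   sil [differentiable wrt 0 vjp @squareVJP] @square
//       : $@convention(thin) (Float) -> Float
```

Because the JVP/VJP are the only entry points a client module needs, the serialized `[differentiable]` attribute replaces the old explicit-loading hack.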
dan-zheng force-pushed the fix-differentiable-attr-silgen branch 2 times, most recently from 5d8016b to 0aca275 on January 16, 2019 03:20
dan-zheng force-pushed the fix-differentiable-attr-silgen branch from 0aca275 to 61ccf59 on January 16, 2019 03:25
Calling `module.getSILLoader()` here is necessary to prevent crashes related to `lookUpFunctionInWitnessTable`. This might be a bug in the `SILLoader` implementation.
dan-zheng changed the title from "[AutoDiff] Fix SILGen for `@differentiable` attribute." to "[AutoDiff] Fix `@differentiable` attribute SILGen and serialization." on Jan 16, 2019
`test/AutoDiff/simple_model.swift` failed during the argument explosion optimization.
@swift-ci Please test tensorflow

This was joint work with @rxwei. 🤝