Updated Transformer model to use callable. #113

Merged: 2 commits merged into tensorflow:master from lzhao/apply_callable on Apr 20, 2019

Conversation

leoxzhao (Contributor):

I have to use .call explicitly for callables with inout parameters. Once "Static callable does not support inout" (https://bugs.swift.org/browse/TF-443) gets fixed, we can simply remove .call from the code.

```diff
- func applied(to input: AttentionInput, state: inout AttentionContext)
-     -> Tensor<Float> {
+ func call(_ input: AttentionInput, state: inout AttentionContext) -> Tensor<Float> {
```
leoxzhao (Contributor, Author):

The call functions that take state do not have @differentiable; I kept it that way.
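
For illustration, here is a minimal, self-contained sketch of the pattern under discussion. The type and names below are stand-ins, not the actual swift-models AttentionInput/AttentionContext, and it assumes the Swift for TensorFlow toolchain of the time, where a `call` member made a type callable:

```swift
// Stand-in callable type; names are illustrative, not from swift-models.
struct RunningSum {
    // On the pre-callAsFunction Swift for TensorFlow toolchain, a `call`
    // member let instances be invoked with function-call syntax.
    func call(_ step: Int, state: inout Int) -> Int {
        state += step          // mutates the caller's state through `inout`
        return state
    }
}

let sum = RunningSum()
var total = 0

// Workaround while TF-443 is open: the sugared call syntax cannot pass an
// `inout` argument, so `.call` is spelled out explicitly.
let result = sum.call(3, state: &total)   // total == 3, result == 3
```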

@brettkoonce mentioned this pull request on Apr 19, 2019
@brettkoonce (Contributor):

Upstream: swiftlang/swift#24157.

@rxwei (Contributor) commented on Apr 20, 2019:

swiftlang/swift#24157 has been merged. If you could update the call sites that take an inout argument to use the callable syntax, that'd be great!
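
Continuing the illustrative sketch above (stand-in names, not the actual swift-models code), the call site can then drop the explicit .call:

```swift
// With the upstream fix merged, the sugared call syntax accepts the `inout`
// argument directly, so the explicit `.call` is no longer needed.
let updated = sum(3, state: &total)
```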

@rxwei merged commit 7f94321 into tensorflow:master on Apr 20, 2019
@rxwei (Contributor) commented on Apr 20, 2019:

Thanks Leo!

@leoxzhao deleted the lzhao/apply_callable branch on April 21, 2019