
Fixing SequenceLoss Keras incompatibility #503

Merged

Conversation

kazemnejad (Contributor)

Fixes the incompatibility between SequenceLoss and the Keras built-in training loops by adding support for dense targets. For more information, please refer to #375.
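
For context, here is a minimal sketch (not taken from the PR itself) of the usage this change is meant to enable: passing SequenceLoss to model.compile() and training with dense integer targets through the standard Keras loop. The model architecture, layer sizes, reduction defaults, and random data below are placeholder assumptions for illustration only.

```python
import tensorflow as tf
import tensorflow_addons as tfa

vocab_size = 1000  # placeholder vocabulary size

# A toy per-timestep classifier producing logits of shape [batch, time, vocab_size].
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.Dense(vocab_size),
])

# With dense-target support, the loss can be handed directly to compile()
# and receive integer targets of shape [batch, time] from the Keras loop,
# instead of requiring the (logits, targets, weights) call form.
model.compile(optimizer="adam", loss=tfa.seq2seq.SequenceLoss())

# Placeholder data, for illustration only.
x = tf.random.uniform([8, 20], maxval=vocab_size, dtype=tf.int32)
y = tf.random.uniform([8, 20], maxval=vocab_size, dtype=tf.int32)
model.fit(x, y, epochs=1)
```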

@kazemnejad changed the title from "SequenceLoss keras compatibility" to "Fixing SequenceLoss Keras incompatibility" on Sep 11, 2019
@guillaumekln guillaumekln (Contributor) left a comment


Looks great! Just a minor comment.

Review comment on tensorflow_addons/seq2seq/loss.py (outdated, resolved)
@guillaumekln (Contributor)

@qlzh727 Can you take a quick look when you have time?

@qlzh727 (Member) commented Sep 12, 2019

Sure, let me take a look

@kazemnejad (Contributor, Author)

@qlzh727 Have you found any problems with the PR? Thanks.

@qlzh727 qlzh727 (Member) left a comment


Looks good, thanks for the contribution.

@guillaumekln guillaumekln merged commit 4022d3a into tensorflow:master Sep 12, 2019
@kazemnejad kazemnejad deleted the seq2seq_loss_keras_compatibility branch September 15, 2019 10:52
tomerk pushed a commit to tomerk/addons that referenced this pull request Sep 17, 2019
* Fix SequenceLoss incompatibility with Keras built-in loops

* Remove debugging prints

* Change the attribute existence check to use a more Pythonic way
facaiy pushed a commit that referenced this pull request Sep 19, 2019
* Namespaced all of the custom ops

* Updated C++ namespaces to not conflict w/ TF contrib ones

* Ran code reformatting tool

* Port bug fix in TF contrib to addons. (#497)

* Port bug fix in TF contrib to addons.

Original change at
tensorflow/tensorflow@a913689.

* Fix lint warning.

* check pass through and do the expand_dims() only if needed (#464)

* check pass through and do the expand_dims() only if needed
* add indent to the fixed line
* merge return condition to if state

* add hardshrink kernel (#500)

* add hardshrink kernel
* make linter happy

* Fixing SequenceLoss Keras incompatibility (#503)

* Fix SequenceLoss incompatibility with Keras built-in loops

* Remove debugging prints

* Change the attribute existence check to use a more Pythonic way

* Replace some compat.v1 APIs by their v2 equivalent (#507)

* Replace some compat.v1 APIs by their v2 equivalent

* Fix lint error

* Add documentation for LazyAdam (#515)

* Updated hardshrink custom ops & made #ifdef names more consistent.

* Fix to undef