Implement triplet_semihard_loss #28

Merged: 8 commits merged into tensorflow:master from facaiy's ENH/migrate_metric_loss branch on Feb 14, 2019

Conversation

@facaiy (Member) commented Feb 7, 2019

Related to #26.

@seanpmorgan (Member) left a comment

Looks great, thank you again. Triplet loss is a great thing to have included in our first package. I like the registering decorator as well; I'm fine with it living in //tensorflow_addons/util.py. Also, I think we can decorate all the methods with @tf.function without issue.

When time allows, could you decorate the methods with @tf.function and move the custom object register? Then this should be ready to merge.
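
For readers following along, here is a minimal sketch of the pattern being discussed. The decorator name `register_keras_custom_object` and its placement are assumptions for illustration, not necessarily the exact code in this PR:

```python
# Sketch only: the decorator name and module layout are assumptions,
# not the exact code under review in this PR.
import tensorflow as tf

def register_keras_custom_object(obj):
    """Add a class or function to Keras' custom-object registry so it
    can be deserialized by name (e.g. when loading a saved model)."""
    tf.keras.utils.get_custom_objects()[obj.__name__] = obj
    return obj

@register_keras_custom_object
class TripletSemiHardLoss(tf.keras.losses.Loss):
    @tf.function
    def call(self, y_true, y_pred):
        # Placeholder body; the real loss mines semi-hard triplets from
        # the batch before applying the margin-based penalty.
        return tf.reduce_mean(tf.cast(y_pred, tf.float32))
```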

@facaiy (Member, Author) commented Feb 7, 2019

I'd like to create a separate PR later to move the custom object register; what do you think?

And I agree that we can decorate all the methods with @tf.function. I'm not sure whether it's the right time to do it, however, because keras.losses hasn't decorated its functions yet. By the way, if the base class Loss wraps its __call__ method in tf.function in the future, decorating them here would seem unnecessary, wouldn't it?
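
A small sketch of the trade-off raised here (illustrative only, not code from this PR): if the caller is already a tf.function, say a training step or a future graph-compiled Loss.__call__, the loss runs in graph mode even when the standalone function is left undecorated:

```python
import tensorflow as tf

def triplet_semihard_loss(y_true, y_pred, margin=1.0):
    # Stand-in body; the real implementation mines semi-hard triplets.
    return tf.reduce_mean(tf.nn.relu(y_pred - y_true + margin))

@tf.function  # the outer tf.function traces the undecorated loss into the same graph
def train_step(y_true, y_pred):
    return triplet_semihard_loss(y_true, y_pred)

print(train_step(tf.zeros([4, 8]), tf.ones([4, 8])))
```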

@seanpmorgan (Member) commented Feb 7, 2019

No issue with a separate PR for moving the register; it'll need to be applied to all our Keras objects anyway.

For the function decorator, I don't see a reason to hold back just because it's not in tf.keras yet. We're building against tf2-nightly and adapting to TF2 best practices, so I think we should include it. The base class implementing @tf.function on __call__ is a good point... but then, are all of these other methods, such as pairwise_distance, never meant to be called outside of the triplet loss? If so, I would prefer they have an underscore prefix (though I can imagine pairwise_distance being useful outside of this loss function).
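
For context, a rough sketch of the kind of helper being discussed; the leading underscore reflects the naming suggestion above, and the implementation is illustrative rather than the exact code under review:

```python
import tensorflow as tf

def _pairwise_distance(embeddings):
    """Squared Euclidean distance between every pair of embedding rows."""
    dot = tf.matmul(embeddings, embeddings, transpose_b=True)  # [n, n]
    squared_norms = tf.linalg.diag_part(dot)                   # [n]
    distances = (tf.expand_dims(squared_norms, 1)
                 - 2.0 * dot
                 + tf.expand_dims(squared_norms, 0))
    return tf.maximum(distances, 0.0)  # clamp tiny negatives from rounding
```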

@facaiy (Member, Author) commented Feb 7, 2019

> No issue with a separate PR for moving the register; it'll need to be applied to all our Keras objects anyway.

+1

> We're building against tf2-nightly and adapting to TF2 best practices, so I think we should include it.

Makes sense, I agree with you. I'll decorate the methods with tf.function in a few days (the day after tomorrow), as I need to attend a wedding tomorrow :-)

@seanpmorgan (Member) left a comment

Getting a failing test, TripletSemiHardLossTest.test_unweighted, due to this commit: tensorflow/tensorflow@4604d1e

Once that's addressed, this is good to merge.

@facaiy (Member, Author) commented Feb 12, 2019

Fixed. Thank you :-)
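
With the test fixed and the PR about to merge, a usage sketch for reference, assuming the loss is ultimately exposed as tfa.losses.TripletSemiHardLoss (check the released package for the exact import path). Embeddings are typically L2-normalized before the loss is applied:

```python
import tensorflow as tf
import tensorflow_addons as tfa  # assumes a release that includes this PR

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
    tf.keras.layers.Dense(16),  # embedding output
    tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1)),
])
model.compile(optimizer="adam", loss=tfa.losses.TripletSemiHardLoss())
```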

@seanpmorgan merged commit bfa919e into tensorflow:master on Feb 14, 2019
@facaiy deleted the ENH/migrate_metric_loss branch on Feb 14, 2019 at 01:44
tabshaikh pushed a commit referencing this pull request to tabshaikh/addons on Feb 18, 2019 (ENH: Implement triplet_semihard_loss)
Squadrick pushed a commit referencing this pull request to Squadrick/addons on Mar 26, 2019 (ENH: Implement triplet_semihard_loss)