Conversation

@WindQAQ (Member) commented Jul 8, 2020

Closes #591.

@WindQAQ requested a review from facaiy as a code owner July 8, 2020 00:57
@WindQAQ requested a review from a team July 8, 2020 00:57
@facaiy (Member) left a comment

Looks great, left a minor comment :-)

super().__init__(name, **kwargs)
self._set_hyper("learning_rate", kwargs.get("lr", learning_rate))
self.l1_regularization_strength = l1_regularization_strength
self.l2_regularization_strength = l2_regularization_strength
Member

How about creating l1_regularxx and l2_regularxx as hyper-parameters?
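
A minimal sketch of what that suggestion could look like, reusing the argument names from the snippet above (the default values here are illustrative, not from the PR):

def __init__(
    self,
    learning_rate=0.001,
    l1_regularization_strength=0.0,
    l2_regularization_strength=0.0,
    name="ProximalAdagrad",
    **kwargs,
):
    super().__init__(name, **kwargs)
    self._set_hyper("learning_rate", kwargs.get("lr", learning_rate))
    # Register the strengths as hypers rather than plain attributes.
    self._set_hyper("l1_regularization_strength", l1_regularization_strength)
    self._set_hyper("l2_regularization_strength", l2_regularization_strength)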

Member Author

Will do. I just copy-pasted the names from compat.v1. Maybe l1 and l2 would be more suitable?

https://www.tensorflow.org/api_docs/python/tf/keras/regularizers/L1L2
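
For comparison, the Keras regularizer linked above already takes the short names:

import tensorflow as tf

reg = tf.keras.regularizers.L1L2(l1=0.01, l2=0.01)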

Member

I agree, l1 and l2 are more concise :-)

Member

Tzu-Wei, perhaps we could use self._set_hyper for l1 and l2, what do you think?

Member Author

Of course. Let me update it real quick :-)

Member Author

It seems that _set_hyper/_get_hyper support more input types.
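
A short sketch of why that matters: a hyper registered with _set_hyper may be a plain Python float, a tensor, or a learning-rate schedule, and _get_hyper resolves and casts it at apply time. The variable names below are assumptions:

# Inside a _resource_apply_dense-style update step:
var_dtype = var.dtype.base_dtype
lr_t = self._get_hyper("learning_rate", var_dtype)
l1_t = self._get_hyper("l1_regularization_strength", var_dtype)
l2_t = self._get_hyper("l2_regularization_strength", var_dtype)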

@WindQAQ (Member Author) commented Jul 12, 2020

Umm, sorry for the confusion, but tf.keras.optimizers.Ftrl uses l{1,2}_regularization_strength instead :-(
I think it might be better to stay consistent with the other optimizers?

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/optimizer_v2/ftrl.py#L47-L50
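
For reference, constructing the upstream optimizer with the long names looks like this (values illustrative):

import tensorflow as tf

opt = tf.keras.optimizers.Ftrl(
    learning_rate=0.001,
    l1_regularization_strength=0.01,
    l2_regularization_strength=0.01,
)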

@facaiy self-requested a review July 12, 2020 07:19
@facaiy (Member) left a comment

Looks good, Tzu-Wei. Let's merge it after all tests pass :-)

@WindQAQ merged commit 3d1b3a6 into tensorflow:master Jul 12, 2020
ashutosh1919 pushed a commit to ashutosh1919/addons that referenced this pull request Jul 12, 2020
* Port ProximalAdagrad optimizer

* Update code owner

* Address comments

* Fix tests

* run formatter

* Reorder import

* Use set_hyper/get_hyper

* Change back to *_regularization_strength
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020 (same commit list as above)
Linked issue: Add Keras implementation for ProximalAdagradOptimizer (#591)