Implement Maxout layer #19

Merged: 6 commits merged into tensorflow:master from facaiy:ENH/maxout_layer on Jan 26, 2019

Conversation

@facaiy (Member) commented Jan 17, 2019

Fix #9
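For context, maxout (Goodfellow et al., 2013) splits the channel axis into num_units groups and keeps each group's maximum. Below is a minimal reference sketch of the op for a 2-D input, illustrative rather than the PR's actual code:

    import tensorflow as tf

    def maxout(x, num_units):
        # x: [batch, channels]; channels must split evenly into num_units groups.
        channels = x.shape[-1]
        if channels % num_units != 0:
            raise ValueError('channels must be divisible by num_units')
        grouped = tf.reshape(x, [-1, num_units, channels // num_units])
        return tf.reduce_max(grouped, axis=-1)  # shape: [batch, num_units]

    y = maxout(tf.random.normal([8, 6]), num_units=3)  # (8, 3)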

class MaxOutTest(test.TestCase):
    def test_simple(self):
        # TODO: find a simpler way to deserialize the layers in addons.
        with generic_utils.custom_object_scope({'Maxout': Maxout}):
@facaiy (Member, Author) commented Jan 17, 2019

tf.keras cannot recognize custom layers. I think there are at least two ways to mitigate the problem:

  1. We inject our custom layers into Keras's _GLOBAL_CUSTOM_OBJECTS (or use CustomObjectScope.__enter__()) at import time. Users don't need to do anything.
  2. We create a custom_objects dict for addons.layers and ask users to pass it to Model.from_config(config, custom_objects) explicitly, or to wrap deserialization in a scope:

         with tf.keras.utils.custom_object_scope(addons_custom_objects):
             Model.from_config(config)
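For concreteness, here is a minimal sketch of option 2, assuming a hypothetical addons_custom_objects registry and that Maxout is importable from the addons layers package (the names are illustrative, not the PR's API):

    import tensorflow as tf
    from tensorflow_addons.layers import Maxout  # assumed import path

    # Hypothetical registry that addons.layers could export.
    addons_custom_objects = {'Maxout': Maxout}

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(6, input_shape=(4,)),
        Maxout(num_units=3),  # 6 channels -> 3 units (max over groups of 2)
    ])
    config = model.get_config()

    # Variant A: pass the registry explicitly.
    restored = tf.keras.Sequential.from_config(
        config, custom_objects=addons_custom_objects)

    # Variant B: wrap deserialization in a custom-object scope.
    with tf.keras.utils.custom_object_scope(addons_custom_objects):
        restored = tf.keras.Sequential.from_config(config)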

@facaiy (Member, Author) commented

Do we need to create an issue in addons to discuss the problem?

A project member commented

At first glance I lean toward option 1, but I'm perfectly okay with an issue to discuss it. I started #23 so we can formalize this requirement once a decision is made.
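For comparison, a minimal sketch of how option 1 could be wired up: the addons package registers its layers in Keras's global custom-object registry at import time, so users need do nothing (the registration site and names are assumptions, not what the PR ships):

    import tensorflow as tf
    from tensorflow_addons.layers import Maxout  # assumed import path

    # Run once when the addons package is imported; afterwards
    # Model.from_config and load_model resolve 'Maxout' automatically.
    tf.keras.utils.get_custom_objects().update({'Maxout': Maxout})

The trade-off is a global side effect at import time; option 2 keeps the registry explicit at the cost of one extra argument or scope.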

@googlebot commented

So there's good news and bad news.

👍 The good news is that everyone who needs to sign a CLA (the pull request submitter and all commit authors) has done so. Everything is all good there.

😕 The bad news is that it appears that one or more commits were authored or co-authored by someone other than the pull request submitter. We need to confirm that all authors are ok with their commits being contributed to this project. Please have them confirm that here in the pull request.

Note to project maintainer: This is a terminal state, meaning the cla/google commit status will not change from this state. It's up to you to confirm consent of all the commit author(s), set the cla label to yes (if enabled on your project), and then merge this pull request when appropriate.

@seanpmorgan merged commit 98895ca into tensorflow:master on Jan 26, 2019
@facaiy deleted the ENH/maxout_layer branch on January 27, 2019 03:42
Squadrick pushed a commit to Squadrick/addons that referenced this pull request on Mar 26, 2019