Add GeM layer #16747
Conversation
The 4th page of the paper states:
I think we can make the power `p` a learnable parameter instead of treating it as a hyperparameter, as the experiments in the paper itself demonstrate.
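To illustrate the idea behind the learnable power, here is a minimal NumPy sketch of generalized-mean (GeM) pooling. This is not the PR's implementation; the `eps` clamp is an assumption to keep the fractional power well-defined for non-negative activations. The key property is that `p = 1` recovers average pooling and large `p` approaches max pooling, which is why learning `p` lets the layer interpolate between the two:

```python
import numpy as np

def gem(x, p=3.0, eps=1e-6):
    # Generalized mean: ((1/N) * sum(x_i ** p)) ** (1/p).
    # Clamp to eps so the fractional power is well-defined
    # for (near-)zero post-ReLU activations (assumed detail).
    x = np.clip(x, eps, None)
    return np.mean(x ** p) ** (1.0 / p)

x = np.array([0.2, 0.5, 1.0, 4.0])
print(gem(x, p=1.0))    # equals the plain mean (average pooling)
print(gem(x, p=100.0))  # close to the max (max pooling limit)
```

In a trainable layer, `p` would be created as a weight (e.g. via `add_weight`) so the optimizer can tune it per channel or globally, instead of fixing it as a hyperparameter.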
Thanks for the PR! IMO we can introduce a single GeMPooling layer that works for 1D/2D/3D inputs. No need for a base class either.
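A single rank-agnostic layer is straightforward because GeM reduces over all spatial axes regardless of how many there are. A hedged NumPy sketch of that idea (assuming a channels-last layout; the function name and `eps` clamp are illustrative, not the PR's API):

```python
import numpy as np

def gem_pool(x, p=3.0, eps=1e-6):
    # With channels-last input (batch, *spatial, channels), pool over
    # every axis between batch and channels, whatever the rank is.
    # This is why one layer can cover 1D, 2D, and 3D inputs.
    spatial_axes = tuple(range(1, x.ndim - 1))
    x = np.clip(x, eps, None)
    return np.mean(x ** p, axis=spatial_axes) ** (1.0 / p)

x2d = np.random.rand(2, 8, 8, 16)  # (batch, H, W, C)
x1d = np.random.rand(2, 32, 16)    # (batch, steps, C)
print(gem_pool(x2d).shape)  # (2, 16)
print(gem_pool(x1d).shape)  # (2, 16)
```

The same dispatch-on-rank trick is what lets the global pooling layers in Keras share most of their logic, so no base-class hierarchy is needed here.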
@old-school-kid
IMO it seems fine to keep
Thanks for the update!
Closes keras-team/tf-keras#550