How could we use Leaky ReLU and Parametric ReLU as activation functions? #117
Comments
There's a PReLU example in the Kaggle Otto example; it can be used as a template for all of the advanced activations:

from keras.layers.advanced_activations import LeakyReLU, PReLU
..
..
model.add(Dense(512, 512, activation='linear')) # Add any layer, with the default of an identity/linear squashing function (no squashing)
model.add(LeakyReLU(alpha=.001)) # add an advanced activation
...
...
model.add(Dense(512, 123, activation='linear')) # Add any layer, with the default of an identity/linear squashing function (no squashing)
model.add(PReLU((123,))) # add an advanced activation
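For comparison, a minimal sketch of the same pattern against a Keras 2-style API (an assumption; output shapes are inferred there, and the LeakyReLU argument name may differ in the newest versions). The input size is chosen purely for illustration:

from keras.models import Sequential
from keras.layers import Dense
from keras.layers.advanced_activations import LeakyReLU, PReLU

model = Sequential()
model.add(Dense(512, activation='linear', input_shape=(784,)))  # hypothetical input size; linear = no squashing
model.add(LeakyReLU(alpha=0.001))                                # advanced activation applied as its own layer
model.add(Dense(123, activation='linear'))
model.add(PReLU())                                               # PReLU learns its slope, one per unit by default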
Thanks @patyork. But what if I would like to use them as the inner_activation?
Hmm, it doesn't appear to be possible at the moment to use an advanced activation as the inner activation of a recurrent layer. @fchollet may be able to say for sure. This is a good point.
Not currently possible with Keras. It would be possible to modify a recurrent layer to add this capability, though.
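As a hedged illustration of what such a modification could amount to (not something the Keras version discussed in this thread supports): in later, Keras 2-style APIs the activation arguments of recurrent layers accept any callable, so a fixed-slope leaky ReLU can be supplied without touching the layer code; PReLU, which has a learnable parameter, still cannot be injected this way. The layer choice and sizes below are assumptions.

import keras.backend as K
from keras.layers import LSTM

def leaky_relu(x, alpha=0.001):
    # elementwise leaky ReLU built from backend ops: x for x >= 0, alpha * x otherwise
    return K.maximum(alpha * x, x)

# pass the callable as the inner (recurrent) activation; the output activation stays tanh here
layer = LSTM(64, activation='tanh', recurrent_activation=leaky_relu)

Note that replacing the gate activation changes the gating semantics (gates are normally squashed to [0, 1]), so this is an experiment rather than a drop-in improvement.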
@fchollet would a reasonable stopgap approach here be to add a "dummy" layer whose only job is to apply the advanced activation? Finally: when adding the recurrent layer you can set its activation function to 'linear'.
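A sketch of that stopgap, assuming a Keras 2-style API and hypothetical shapes: the recurrent layer's output activation is set to 'linear' so the advanced activation layer can be stacked on top. This only affects the output activation; the inner/recurrent activation keeps its default.

from keras.models import Sequential
from keras.layers import LSTM
from keras.layers.advanced_activations import LeakyReLU

model = Sequential()
# hypothetical input: sequences of length 10 with 32 features per step
model.add(LSTM(64, activation='linear', input_shape=(10, 32)))
model.add(LeakyReLU(alpha=0.001))  # applied to the recurrent layer's (linear) output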
This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.
The problem is still appearing.
I got the following warning:
And eventually …
@iamtodor, I think the problem here is that you didn't define any activation function in the first layer.
Five years later, I have the same question, but it seems to be just a warning now. Can you elaborate, please?
Leaky ReLU and Parametric ReLU are implemented as classes; how could we use them as activation functions?
Thanks