LeakyReLU error when using #6532

inspiralpatterns opened this Issue May 7, 2017 · 2 comments

inspiralpatterns commented May 7, 2017

I have a network built using the Model API, and I'm using LeakyReLU activation functions for my layers. When I try to save the model structure, I get this:

AttributeError: 'LeakyReLU' object has no attribute '__name__'

Here is an example of one of my layers:
x = Conv2D(256, (3, 3), activation=LeakyReLU(alpha=1e-1), padding='same', activity_regularizer=SparseReg(beta=5e-1, p=1e-2), name='lay3')(x)

Is creating a custom non-linearity the only way to use LeakyReLU with the Model API? I read in #3816 and #2272 that you can't use an activation layer as the activation function inside another layer, say a Conv2D, but those refer to models built with the Sequential API.
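On the "custom non-linearity" route the question mentions: the AttributeError arises because a LeakyReLU layer instance has no __name__, whereas a plain Python function does, so a function passed to activation= can be serialized. A minimal sketch of the leaky-ReLU math in NumPy, for illustration only (in real Keras code you would build it from backend ops, e.g. keras.backend.relu(x, alpha=0.1), and pass the function itself to activation=; the name leaky_relu is my own choice, not from this thread):

```python
import numpy as np

def leaky_relu(x, alpha=1e-1):
    """Leaky ReLU: identity for x >= 0, slope alpha for x < 0."""
    return np.where(x >= 0, x, alpha * x)

# Negative inputs are scaled by alpha; non-negative inputs pass through.
print(leaky_relu(np.array([-2.0, 0.0, 3.0])))  # → [-0.2  0.   3. ]
```

Because leaky_relu is a function, leaky_relu.__name__ exists, which is exactly the attribute the serializer was looking for.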


Bisgates commented May 9, 2017

Try putting LeakyReLU on its own line, like this:

from keras.layers.advanced_activations import LeakyReLU

x = Conv2D(256, (3, 3))(x)
x = LeakyReLU(alpha=1e-1)(x)


sven-mayer commented Dec 30, 2018

The problem is still present in Keras 2.2.4 with TF 1.12.0.
