AttributeError: can't set attribute #7736
When I went back to keras-2.0.6, everything works fine. Likewise, when I switch to the tensorflow-1.3.0 binaries from pip, everything works fine.
This was fixed in 619259c, which came after the 2.0.7 release. You'll need to install the latest Keras from GitHub.
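For reference, installing the development version of Keras straight from GitHub looks roughly like this (the `keras-team/keras` repository URL is an assumption; at the time of this thread the repo lived under a different owner):

```shell
# Install Keras from the GitHub master branch, which may include
# fixes not yet published in a PyPI release.
pip install --upgrade git+https://github.com/keras-team/keras.git
```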
This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.
I'm having the same problem with keras-2.1.2, using both the TensorFlow and Theano backends.
Here's the code that generates the error:

```python
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from __future__ import print_function

from keras import backend as K
from keras.engine.topology import Layer
from keras.layers import multiply


class Attention(Layer):
    """
    Probably the simplest attention mechanism. Dots the attention weights
    with the input, does a softmax on the resulting vector, then elementwise
    multiplies the softmax with the input to create the output.
    """

    def __init__(self, **kwargs):
        super(Attention, self).__init__(**kwargs)

    def build(self, input_shape):
        # This assignment is what raises "AttributeError: can't set attribute":
        # `weights` is a read-only property on the base Layer class.
        self.weights = self.add_weight(name='attention_weights',
                                       shape=(input_shape[1], input_shape[1]),
                                       initializer='uniform',
                                       trainable=True)
        super(Attention, self).build(input_shape)

    def call(self, x):
        return multiply([x, K.softmax(K.dot(x, self.weights))])

    def compute_output_shape(self, input_shape):
        return input_shape


def test_attention():
    from keras.models import Sequential
    from keras.layers import GRU, Bidirectional
    kmodel = Sequential()
    kmodel.add(GRU(512, input_shape=(200, 300)))
    kmodel.add(Attention())


if __name__ == "__main__":
    test_attention()
```

and here's the error:
Don't override `weights`: it is a read-only property defined on the base `Layer` class. Store the weight under a different attribute name, e.g. `self.attention_weights`.
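The underlying Python behavior can be shown without Keras at all: a class that defines a property with no setter makes that attribute name read-only on every instance, so a subclass assigning to it fails. The `Layer`, `BadAttention`, and `GoodAttention` classes below are minimal stand-ins for illustration, not the real Keras classes:

```python
class Layer(object):
    """Stand-in for Keras's Layer: `weights` is a getter-only property."""

    def __init__(self):
        self._trainable_weights = []

    @property
    def weights(self):
        # Read-only: no setter is defined, so `self.weights = ...` fails.
        return list(self._trainable_weights)


class BadAttention(Layer):
    def build(self):
        # Clashes with the read-only property -> AttributeError.
        self.weights = ['attention_weights']


class GoodAttention(Layer):
    def build(self):
        # A distinct attribute name avoids the clash.
        self.attention_weights = 'attention_weights'
        self._trainable_weights.append(self.attention_weights)


try:
    BadAttention().build()
except AttributeError as e:
    print(e)  # e.g. "can't set attribute" (exact wording varies by Python version)

good = GoodAttention()
good.build()
print(good.weights)  # ['attention_weights']
```

The same rename fixes the repro above: keep `add_weight(...)` as-is, but bind its result to an attribute that `Layer` does not already claim.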
thank you very much! 🥇
Closing as this is resolved.
I get the same error from the following code:

```python
model_1.summary()
ganModel = simple_gan(model_1, model_2, normal_latent_sampling((100,)))
```

```
Traceback (most recent call last):
```

Any help?
I updated my Keras from 2.0.6 to 2.0.7. Then the code above (which works fine with 2.0.6) produces strange errors:
The backend is tensorflow 1.3 compiled with CUDA (it works fine with keras-2.0.6).