NoneType object has no attribute '_keras_shape' #113
Comments
Based on your minimal working example and https://stats.stackexchange.com/questions/270546/how-does-keras-embedding-layer-work, I have implemented an LRP resolution for Embedding layers (pull request waiting for merge). My understanding is that the Embedding layer serves as a mapping from index-type inputs (numbers, strings) into a vector space accessible to the other layers of the network. The only thing remaining to do here should be aggregating the relevance scores of each mapping vector and attributing them to their corresponding index-type inputs. That being said, I am not sure how analysis methods based on computing derivatives could be supported for this layer in a canonically meaningful way. Did I get anything wrong about the way the Embedding layer operates, or is anything unclear? |
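The aggregation step described above can be sketched in a few lines of numpy (a minimal illustration, not code from iNNvestigate; the shapes and the relevance array are hypothetical stand-ins for real LRP output):

```python
import numpy as np

# Hypothetical LRP output for one sentence of 5 tokens, each mapped
# by the Embedding layer to an 8-dimensional vector: one relevance
# score per embedding dimension.
seq_len, embed_dim = 5, 8
rng = np.random.default_rng(0)
relevance = rng.normal(size=(seq_len, embed_dim))

# Aggregation: the relevance of a token is the sum of the relevances
# of the dimensions of its mapping vector.
token_relevance = relevance.sum(axis=1)

print(token_relevance.shape)  # (5,)
```

This yields one scalar relevance per index-type input, which is what the original tokens can be attributed with.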
The understanding of the Embedding layer is correct and the proposed changes solve the problem. |
Check your Keras version. |
Has the fix for this issue been merged? I ran into the same problem with innvestigate 1.0.8. Thanks. |
It seems that the changes from the feature branch ultimately did not make it into the master branch. |
Hi guys,
Has this been fixed yet? I have the latest innvestigate version (1.0.8) but I get the same error as above. It would be great if you could release the feature branch, even if it is not fully ready to be merged yet! 😉 🙂 My CNN is similar to the one in the link above (summary below).
With your fix, I hope to be able to:
Thank you for your awesome work! :) |
Hi, I understand this post is pretty old and I am new to the iNNvestigate library. I have a model structure somewhat like the one below:
input_text = Input(shape=(1,), dtype="string")
I am using this for NLP. Any help is highly appreciated. Regards, |
As mentioned by Sebastian, the main problem with these embedding layers is that they map a token to a vector, and this mapping has no backpropagation rule at the moment. The most intuitive thing is probably to sum the relevances of each dimension to obtain the relevance of the token. A workaround I would suggest is the following: build a new model that is like your old model but starts at layer dense3 (and with input_shape=(1024,)). Transform your data into the embedding space using the output of the Lambda layer. Then you can use LRP on the transformed data and obtain relevances for each dimension; finally, sum them up per token as mentioned above. |
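The three steps of this workaround can be sketched in plain numpy (a hypothetical lookup table stands in for the Lambda-layer output, and a placeholder array stands in for the LRP result; the actual truncated Keras model and iNNvestigate calls are omitted):

```python
import numpy as np

# Hypothetical embedding table: a vocabulary of 100 tokens mapped to
# 128-dimensional vectors, so a sequence of 8 tokens flattens to
# 1024 values -- matching input_shape=(1024,) of the truncated model.
rng = np.random.default_rng(0)
embedding = rng.normal(size=(100, 128))
tokens = np.array([3, 17, 42, 7, 99, 0, 55, 12])

# Step 1: transform the index inputs into the embedding space,
# mimicking the output of the Lambda layer.
embedded = embedding[tokens]            # shape (8, 128)
model_input = embedded.reshape(1, -1)   # shape (1, 1024)

# Step 2 (not shown): feed model_input to the new model that starts
# at dense3 and run LRP on it. The analyzer returns one relevance
# score per input dimension; a random array stands in for it here.
relevance = rng.normal(size=(1, 1024))

# Step 3: sum the relevances of each token's embedding dimensions
# to recover one relevance score per original token.
token_relevance = relevance.reshape(8, 128).sum(axis=1)
print(token_relevance.shape)  # (8,)
```

The point of the reshape in step 3 is that the flattened 1024-dimensional relevance vector still groups dimensions by token, so summing along the embedding axis attributes relevance back to the tokens.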
Thanks for the quick reply. So if I understood correctly, I should transform the data into the 1024-dimensional embedding space and start the network from there. |
Exactly. |
Closing this as it looks like it should have been automatically closed when merging PR #222. |
The innvestigate analyzer is not able to analyze the following network (created with Keras). I suspect that this is related to the Embedding layer from Keras.
I get the following error when trying to execute the above:
I tested with the newest version 1.0.4 of the innvestigate package.