
How to make DeepLIFT support customized keras layer? #21

Open
KeLanhhh opened this issue Mar 31, 2017 · 10 comments
@KeLanhhh

Hi,

I was trying to use DeepLIFT to interpret my CNN model, but I don't know how to convert it because it contains a customized layer.
The model is built with Keras, and the customized layer is something like global max pooling. My results show it works well for my data, so I'd rather not remove it.
However, I really want to understand my model, and DeepLIFT seems like a great fit. So I was wondering: how can I make DeepLIFT support my customized layer?

Thanks!

@AvantiShri
Collaborator

Hi @KeLanhhh,

Thanks for reaching out. Is it not possible to achieve global maxpooling by specifying a pooling width that covers your entire input?

To make DeepLIFT support your customized layer, you would need to define a DeepLIFT layer object that corresponds to your layer, and then define a conversion function. I can explain how to do these things, but it may be better to wait until after a DeepLIFT update I am planning to put out in the next week or so (the update will be accompanied by a new ArXiv preprint). Would it be possible to wait that long? If not, let me know and I can guide you on how to achieve your desired layer in the current implementation (assuming that a pooling layer with a sufficiently large pooling width does not satisfy your use case).
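As an aside on the first suggestion above (this sketch is not from the thread): global max pooling over a sequence is equivalent to ordinary max pooling whose window covers the entire input length. A minimal numpy illustration of that equivalence:

```python
import numpy as np

# Hypothetical example: a batch of 2 sequences, length 8, 3 channels.
x = np.random.randn(2, 8, 3)

# "Global" max pooling: take the max over the whole length axis.
global_max = x.max(axis=1)                    # shape (2, 3)

# Ordinary max pooling with pool width equal to the input length:
# a single window covering the entire sequence gives the same result.
windowed = x.reshape(2, 1, 8, 3).max(axis=2)  # shape (2, 1, 3)

assert np.allclose(global_max, windowed[:, 0, :])
```

This is why, for plain global max pooling, no custom layer would be needed: a standard pooling layer with `pool_size` equal to the input length suffices.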

@KeLanhhh
Author

KeLanhhh commented Apr 1, 2017

Thank you for the quick reply, @AvantiShri! A week or longer would be fine for me; just do your work first. By the way, my input length is not too long, so selecting the top k max features might perform better than the usual local max pooling.
PS: looking forward to your big update!

@ttgump

ttgump commented Apr 1, 2017

Does DeepLIFT support deep residual networks? I have a model with residual connections, and I want to use DeepLIFT to interpret it too. Thanks!

@AvantiShri
Collaborator

Hi @ttgump - not yet, but it would not be hard to add. Can you tell me more about the layer types you use to implement the resnet (are you using Keras?)

@ttgump

ttgump commented Apr 3, 2017

@AvantiShri Yes, I am using Keras.

@AvantiShri
Collaborator

@ttgump Great, if I understand correctly you are probably using a merge layer with "sum" as the merge mode. Currently I have support for the "concat" merge mode but I can add "sum". Just to be on the safe side, can you let me know any other layer types you might be using?
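For readers unfamiliar with the "sum" merge mode mentioned here (this sketch is not from the thread): a residual block adds the input (the shortcut) to the output of a transformed branch, i.e. out = relu(x + F(x)). A minimal numpy sketch with a hypothetical single-matrix branch:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical residual block: the merge layer sums the shortcut
# and the transformed branch, i.e. out = relu(x + F(x)).
def residual_block(x, w):
    f_x = relu(x @ w)      # transformed branch F(x)
    return relu(x + f_x)   # "sum" merge of shortcut and branch

x = np.random.randn(4, 16)
w = np.random.randn(16, 16) * 0.1
out = residual_block(x, w)
assert out.shape == x.shape  # the shortcut forces matching shapes
```

Supporting this in DeepLIFT amounts to handling a merge layer whose output is the elementwise sum of its inputs, alongside the "concat" mode already supported.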

@KeLanhhh
Author

Hi, @AvantiShri

I read your new DeepLIFT preprint; nice job! I think the separated positive and negative contributions might be very useful for my work.

As we discussed before, I want to apply DeepLIFT to my model, which contains a customized k-max pooling layer. At your convenience, could you show me how to implement my desired layer in the current DeepLIFT implementation? If needed, I can also post the code of my layer here.

Thank you :)

@AvantiShri
Collaborator

@KeLanhhh Glad you liked the paper! Yes, it would be helpful if you could share the implementation of your layer here, and then I can advise you on how to adapt it for DeepLIFT.

@KeLanhhh
Author

KeLanhhh commented Apr 24, 2017

@AvantiShri Here is the code of the k-max pooling layer used in my model.

from keras import backend as K
from keras.engine.topology import Layer
import theano  # needed for theano.tensor below; the original snippet omitted this import

class KMaxPooling(Layer):

    def __init__(self, k, **kwargs):
        # the constructor argument is named k (lowercase) so it does not
        # shadow the keras backend alias K imported above
        super(KMaxPooling, self).__init__(**kwargs)
        self.k = k

    def get_output_shape_for(self, input_shape):
        # the length axis shrinks to the k values that are kept
        shape = list(input_shape)
        shape[1] = self.k
        return tuple(shape)

    def call(self, x, mask=None):
        k = theano.tensor.cast(self.k, dtype="int32")
        # sort along the length axis and keep the k largest values per
        # channel; note this discards the original positions of the maxima
        sorted_x = theano.tensor.sort(x, axis=1)
        return sorted_x[:, -k:, :]

    def get_config(self):
        # the config key must match the __init__ argument name so that
        # from_config can reconstruct the layer
        config = {"k": self.k}
        base_config = super(KMaxPooling, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))

Is it hard to convert? Please let me know if you need any further information!
Thanks!
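To make the layer's semantics concrete (this check is not from the thread), the sort-then-slice pattern keeps, per channel, the k largest values in ascending order and discards their positions. A numpy replica of the `call` logic:

```python
import numpy as np

# Numpy replica of the layer's call(): sort along the length axis
# and keep the last k rows, i.e. the k largest values per channel.
def k_max_pooling(x, k):
    return np.sort(x, axis=1)[:, -k:, :]

x = np.array([[[3.0], [1.0], [5.0], [2.0]]])  # shape (1, 4, 1)
out = k_max_pooling(x, 2)
# the two largest values, in ascending order, positions discarded
assert out.shape == (1, 2, 1)
assert out[0, :, 0].tolist() == [3.0, 5.0]
```

A DeepLIFT rule for this layer would need to route each output's contribution back to the input position that supplied it, much as max pooling does, which is presumably why a dedicated conversion function is needed.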

@nicofarr

nicofarr commented Dec 14, 2017

Hi there,

I also have a customized layer implemented in Keras (code is here). It may be slightly more complicated than @KeLanhhh's case, though, as it is a trainable layer. Could you help support this in DeepLIFT?

Thanks a lot in advance,

Best

Nicolas
