
get mid-model activations is very slow #4006

Closed
IraZarI opened this issue Oct 9, 2016 · 7 comments

Comments

@IraZarI

IraZarI commented Oct 9, 2016

Please make sure that the boxes below are checked before you submit your issue. Thank you!

  • Check that you are up-to-date with the master branch of Keras. You can update with:
    pip install git+git://github.com/fchollet/keras.git --upgrade --no-deps
  • Provide a link to a GitHub Gist of a Python script that can reproduce your issue (or just copy the script here if it is short).

Hey,
I implemented this function to get the activations of a network at a specific layer, as suggested in #41 (and many others):

from keras import backend as K

def get_activations1(model, layer, X_batch):
    if layer == -1:  # the input layer just returns the data itself
        return [X_batch]
    else:  # output of layer `layer`
        get_activations = K.function([model.layers[0].input, K.learning_phase()],
                                     [model.layers[layer].output])
        activations = get_activations([X_batch, 0])  # 0 = test phase
        return activations

Everything works flawlessly and I love this functionality, but it has one game-breaker for me: speed.
For a given model with 3-4 Dense layers (each 20 neurons wide), this calculation takes about half a second.
Since I want to work with the activations and run many such calculations (around 240,000), that won't do.

Is there a way to speed things up?
Thank you

@robertomest
Contributor

You could try building the `get_activations` function once and reusing it, since constructing a `K.function` is the expensive part. Something like this:

def make_get_activations(model, layer):
    return K.function([model.layers[0].input, K.learning_phase()],
                      [model.layers[layer].output])

get_activations = make_get_activations(model, layer)
activations = get_activations([X_batch, 0])  # call this many times
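An alternative that avoids the backend `K.function` API entirely is to build a second `Model` that shares the trained layers and exposes the intermediate output, then call `predict` on whole batches. A minimal sketch with a hypothetical small model like the one described in the issue (layer index and sizes are assumptions, adapt them to your network):

```python
import numpy as np
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense

# Hypothetical 3-layer model standing in for the one in the issue.
inp = Input(shape=(20,))
h1 = Dense(20, activation="relu")(inp)
h2 = Dense(20, activation="relu")(h1)
out = Dense(1)(h2)
model = Model(inp, out)

# Second model sharing the same layers, ending at the hidden layer.
activation_model = Model(inputs=model.input, outputs=model.layers[2].output)

X_batch = np.random.rand(256, 20).astype("float32")
activations = activation_model.predict(X_batch, verbose=0)
print(activations.shape)  # (256, 20)
```

Because `activation_model` shares weights with `model`, it needs no retraining, and one `predict` call processes the whole batch.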

@Abhinav-Duggal

+1, speed is also a problem for me. Also, is there a way to send batched test data (say 100K images) to this function and get the activations of a layer for all 100K images in one list? Would that be faster due to batched matrix multiplication, perhaps (not sure)? I have seen some answers where you pass batched test data but end up iterating over the samples when passing them to `K.function`, which I would think is the same as calling `get_activations` many times with a list of images? Please help..
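To illustrate why one batched call tends to beat a per-sample loop, here is a backend-free numpy sketch (the toy dense "layer" and its sizes are hypothetical; actual speedups depend on the backend, but the idea is the same: one large matrix product instead of many small ones):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((20, 20))      # hypothetical dense-layer weights
X = rng.standard_normal((1000, 20))    # 1000 samples

# Per-sample loop: one small matrix-vector product per image.
looped = np.stack([x @ W for x in X])

# Single batched call: one large matrix-matrix product.
batched = X @ W

assert np.allclose(looped, batched)  # same activations, far fewer calls
```

The same applies to `K.function`: passing the whole array as one input evaluates all rows in a single backend call.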

@stale stale bot added the stale label May 23, 2017
@stale

stale bot commented May 23, 2017

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs, but feel free to re-open it if needed.

@stale stale bot closed this as completed Jun 22, 2017
@vinayakumarr

from keras import backend as K

def get_activations(model, layer, X_batch):
    get_activations = K.function([model.layers[0].input, K.learning_phase()], model.layers[layer].output)
    activations = get_activations([X_batch, 0])
    print(activations)
    return activations

my_featuremaps = get_activations(cnn, 1, ([X_train[:10], 0])[0])
np.savetxt('featuremap.txt', my_featuremaps)

The above code generates the error below with TensorFlow as the backend:

TypeError: outputs of a TensorFlow backend function should be a list or tuple.

It actually works fine with Theano as the backend.
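The message points at the missing list wrapper: the TensorFlow backend requires the `outputs` argument of `K.function` to be a list or tuple. A self-contained sketch of the corrected call (a tiny throwaway model; `K.learning_phase()` is omitted here since a plain Dense layer does not need it):

```python
import numpy as np
from tensorflow.keras import Input, Model
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Dense

inp = Input(shape=(4,))
model = Model(inp, Dense(3)(inp))

# Output wrapped in a list -> no TypeError on the TensorFlow backend.
get_activations = K.function([model.layers[0].input], [model.layers[1].output])
activations = get_activations([np.random.rand(2, 4).astype("float32")])[0]
print(activations.shape)  # (2, 3)
```

Note that the function then returns a list, so index `[0]` to get the array before passing it to `np.savetxt`.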


@jona-sassenhagen

I have the same problem - retrieving hidden layer activations takes as long as training the model. It would be very cool if there was an efficient way of retrieving hidden layer activity for multiple layers and multiple inputs.

@alyato

alyato commented May 15, 2018

@IraZarI @jona-sassenhagen @robertomest
Thanks. I have a model with two inputs and one output, and I want to extract feature maps.
I found that issue #4600 works well for RNNs, but I am using a CNN.
If you guys could check #4600, that would be nice.
The following code does not work:

def getFeatureMap(model, layer, X_batch):
    featureMap = K.function([model.layers[0].input, model.layers[0].input],
                            [model.layers[layer].output])
    gmp = featureMap([X_batch, 0])
    return gmp

How do I rewrite it? Thanks.
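For a two-input model, the `K.function` above passes the first layer's input twice, which is why it fails. One way around the backend API is again a second `Model` that lists both inputs and exposes the hidden layer. A sketch with a hypothetical two-input toy network (the layer sizes and `Concatenate` merge are assumptions; adapt the output tensor to your model):

```python
import numpy as np
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Concatenate, Dense

# Hypothetical two-input, one-output model.
in_a = Input(shape=(8,))
in_b = Input(shape=(8,))
merged = Concatenate()([in_a, in_b])
hidden = Dense(16, activation="relu")(merged)
model = Model([in_a, in_b], Dense(1)(hidden))

# Feature-map extractor: both inputs in, hidden layer out.
feature_map_model = Model(inputs=model.inputs, outputs=hidden)

Xa = np.random.rand(4, 8).astype("float32")
Xb = np.random.rand(4, 8).astype("float32")
feature_maps = feature_map_model.predict([Xa, Xb], verbose=0)
print(feature_maps.shape)  # (4, 16)
```

The extractor takes the inputs as a list in the same order as the original model, so no learning-phase flag is needed.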
