How to change the activation function of the VGG16 model in Keras? #9370
Comments
You can take the model up to the last layer (before the softmax) and attach a dense ReLU layer at the end. Of course, you'll need to retrain the last layer of the model.
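A minimal sketch of that suggestion, assuming the stock `include_top=True` VGG16 (whose penultimate layer is `fc2`); the layer name `new_predictions` and the compile settings are placeholders:

```python
from keras.applications.vgg16 import VGG16
from keras.layers import Dense
from keras.models import Model

base = VGG16(weights='imagenet', include_top=True)

# Branch off the penultimate layer (fc2, 4096 units) and attach a
# fresh Dense head with the desired activation; it starts untrained.
x = base.layers[-2].output
x = Dense(1000, activation='relu', name='new_predictions')(x)
model = Model(inputs=base.input, outputs=x)

# Placeholder compile settings; the new head must be trained before
# its outputs mean anything.
model.compile(optimizer='sgd', loss='mse')
```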
Simply change the activation to None. If you look at the source code of the Dense layer you will see:

```python
def call(self, inputs):
    output = K.dot(inputs, self.kernel)
    if self.use_bias:
        output = K.bias_add(output, self.bias)
    if self.activation is not None:
        output = self.activation(output)
    return output
```
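As that `call` method shows, the activation is read from the layer attribute only when the layer is called, i.e. when the graph is traced. A toy sketch (not from this thread) illustrating why mutating the attribute does not affect an already-built model:

```python
import numpy as np
from keras.layers import Dense, Input
from keras.models import Model

inp = Input(shape=(4,))
dense = Dense(2, activation='softmax')
model = Model(inp, dense(inp))      # graph traced here, with softmax

x = np.ones((1, 4), dtype='float32')
p1 = model.predict(x)

dense.activation = None             # mutate the attribute...
p2 = model.predict(x)               # ...but the old graph still runs: p2 == p1

model2 = Model(inp, dense(inp))     # re-calling the layer traces a new graph
p3 = model2.predict(x)              # raw linear output, no softmax
```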
@23pointsNorth @rex-yue-wu Thanks, I found the problem. The model must be retrained to make it work.
Hi, what do you mean by retrain here? I compiled the model again after resetting the activation like you did, but it doesn't seem to work.
I have the same problem. What does retraining mean here?
I recompiled the model and then it worked. For illustration, I changed every activation to sigmoid; see the sketch below.
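The example itself did not come through above, so the following is only a reconstruction of that recipe, assuming Keras 2.x: swap the activation attribute on each layer, then rebuild the model from its serialized config and restore the weights, since mutating the attribute alone does not re-trace the graph.

```python
from keras.applications.vgg16 import VGG16
from keras.models import model_from_json
from keras.activations import sigmoid

model = VGG16(weights='imagenet', include_top=True)

# Swap the activation on every layer that has one.
for layer in model.layers:
    if hasattr(layer, 'activation'):
        layer.activation = sigmoid

# Serializing the config picks up the new activations; reloading
# rebuilds the graph, and set_weights restores the trained weights.
weights = model.get_weights()
model = model_from_json(model.to_json())
model.set_weights(weights)
```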
I want to change the activation of the last layer from 'softmax' to 'relu'. The following is the code:
```python
from keras.applications.vgg16 import VGG16
from keras.activations import softmax, relu, sigmoid
import numpy as np
import cv2

model_vgg16 = VGG16(weights='imagenet', include_top=True)

img_path = './vision_project/filter-visualization/doberman.png'
img = cv2.imread(img_path)
img = cv2.resize(img, (224, 224))
img = img.astype('float32') - 127.5

output = model_vgg16.predict(img[np.newaxis, :])
print(output[0, np.argmax(output)], np.argmax(output))
```

OUT: 0.822323 236
```python
model_vgg16.layers[-1].activation = relu
output = model_vgg16.predict(img[np.newaxis, :])
print(output[0, np.argmax(output)], np.argmax(output))
```

OUT: 0.822323 236
The result is the same, but when I check the activation via model_vgg16.layers[-1].activation, it has been changed successfully:

```python
model_vgg16.layers[-1].activation
```

OUT: `<function keras.activations.relu>`
Could anyone tell me how to solve this problem?
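For completeness, a minimal sketch applying the advice from the comments above to this exact snippet (the names `new_output` and `model_relu` are introduced here): mutate the attribute, then re-call the last layer so a fresh graph picks it up.

```python
from keras.models import Model

# Mutating the attribute changes nothing in the already-traced graph;
# re-calling the layer on the penultimate output traces a new one
# that uses relu while sharing the trained weights.
model_vgg16.layers[-1].activation = relu
new_output = model_vgg16.layers[-1](model_vgg16.layers[-2].output)
model_relu = Model(model_vgg16.input, new_output)

output = model_relu.predict(img[np.newaxis, :])
print(output[0, np.argmax(output)], np.argmax(output))
```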