Hi, this is really great work!
I just want to share something I found that might be useful to others.
I found that in order to dump a model correctly, we need to build it as a Sequential model and add each Activation layer separately.
For example, the second method below gets dumped correctly, while the model from the first method is dumped without its Activation layers.
from keras.models import Sequential, Model
from keras.layers import Input, Dense, Dropout, Activation

def get_model_by_sequential():
    # Activations passed inline via the `activation` argument:
    # they are stored as attributes of the Dense layers,
    # not as separate layers in model.layers.
    model = Sequential()
    model.add(Dense(64, input_dim=15, init='uniform', activation='relu'))
    model.add(Dense(128, init='uniform', activation='relu'))
    model.add(Dense(256, init='uniform', activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    return model

def get_model_by_sequential_with_separate_activation():
    # Each activation added as its own Activation layer,
    # so it appears in model.layers and gets dumped.
    model = Sequential()
    model.add(Dense(64, input_dim=15, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(128, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(256, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(1))
    model.add(Activation('sigmoid'))
    return model

def get_model_by_functional_API():
    # Functional-API equivalent of the first model (inline activations).
    # input_dim is unnecessary here since Input already fixes the shape.
    a = Input(shape=(15,))
    b = Dense(64, init='uniform', activation='relu')(a)
    b = Dense(128, init='uniform', activation='relu')(b)
    b = Dense(256, init='uniform', activation='relu')(b)
    b = Dense(1, activation='sigmoid')(b)
    model = Model(input=a, output=b)
    return model
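A quick way to see the difference before dumping is to inspect model.layers: with inline activations the list contains only Dense layers (the activation is kept as an attribute of each Dense), while the separate-Activation version lists every Activation explicitly. A minimal sketch of that check follows; the helper name describe_layers is just for illustration and not part of any dump tool.

# Hypothetical helper: list the layer class names a dumper iterating
# over model.layers would actually see.
def describe_layers(model):
    return [type(layer).__name__ for layer in model.layers]

print(describe_layers(get_model_by_sequential()))
# ['Dense', 'Dense', 'Dense', 'Dense']  <- activations hidden inside Dense

print(describe_layers(get_model_by_sequential_with_separate_activation()))
# ['Dense', 'Activation', 'Dense', 'Activation',
#  'Dense', 'Activation', 'Dense', 'Activation']  <- activations visible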