
model with function API #16

Open
blackarrow3542 opened this issue Mar 1, 2017 · 2 comments

Comments

@blackarrow3542
Contributor

blackarrow3542 commented Mar 1, 2017

Hi, this is really great work!
I just want to share something that might be useful to others.
I found that in order to dump a model correctly, we need to build it as a Sequential model and add each Activation layer separately.
For example, the second method below gets dumped correctly, while with the first method the dumped model has no Activation layers.

from keras.models import Sequential, Model
from keras.layers import Input, Dense, Activation

def get_model_by_sequential():
    # Activations fused into the Dense layers -- these are lost in the dump.
    model = Sequential()
    model.add(Dense(64, input_dim=15, init='uniform', activation='relu'))
    model.add(Dense(128, init='uniform', activation='relu'))
    model.add(Dense(256, init='uniform', activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    return model

def get_model_by_sequential_with_separate_activation():
    # Each Activation added as its own layer -- this dumps correctly.
    model = Sequential()
    model.add(Dense(64, input_dim=15, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(128, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(256, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(1))
    model.add(Activation('sigmoid'))
    return model

def get_model_by_functional_API():
    # Functional API with fused activations; input_dim is unnecessary here
    # because the Input layer already defines the shape.
    a = Input(shape=(15,))
    b = Dense(64, init='uniform', activation='relu')(a)
    b = Dense(128, init='uniform', activation='relu')(b)
    b = Dense(256, init='uniform', activation='relu')(b)
    b = Dense(1, activation='sigmoid')(b)
    model = Model(input=a, output=b)
    return model
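
Until that's handled, one workaround on the dump side is to read the activation out of each layer's config, so a fused Dense(..., activation=...) is treated the same as a standalone Activation layer. Below is a minimal sketch of the idea, not this repo's actual dump code; iter_layers_with_activations is a hypothetical helper, and it assumes Keras 1.x, where layer.get_config() returns the activation name as a string.

# Hypothetical helper -- NOT part of this repo. It walks a Keras 1.x
# model and yields (layer, activation_name) pairs, so an activation
# fused into Dense(..., activation='relu') is seen the same way as a
# standalone Activation('relu') layer.
def iter_layers_with_activations(model):
    for layer in model.layers:
        config = layer.get_config()
        # Layers without an activation (e.g. InputLayer) default to
        # 'linear', which a dumper can simply skip.
        yield layer, config.get('activation', 'linear')

# Example: both model-building styles now report the same activations.
for layer, act in iter_layers_with_activations(get_model_by_sequential()):
    print(layer.__class__.__name__, act)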
@pplonski
Owner

pplonski commented Mar 1, 2017

Thanks for that information! I think it should be easy to handle both situations. Would you like to prepare changes for it?

@blackarrow3542
Contributor Author

Hi, I will work on that. I also added a 'sigmoid' Activation layer for binary classification.
