
Custom layers #17

Closed
lenlen opened this issue Apr 5, 2016 · 12 comments

Comments

@lenlen

lenlen commented Apr 5, 2016

It seems that custom layers are not recognized in the training phase of a Spark model, specifically when the model is loaded from YAML (model_from_yaml). I tried with a custom activation function and it didn't work: the get_from_module function raised the exception "Invalid activation function: ...".
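For context, the lookup failure can be reproduced with a plain-Python sketch of a Keras-style name-to-function registry. All names below (ACTIVATIONS, get_activation) are illustrative stand-ins, not the actual Keras internals:

```python
# Minimal sketch of a get_from_module-style lookup (illustrative, not Keras source).
def relu(x):
    return max(0.0, x)

# Only built-in functions are registered; a user-defined function is not here.
ACTIVATIONS = {"relu": relu}

def get_activation(name):
    try:
        return ACTIVATIONS[name]
    except KeyError:
        raise ValueError("Invalid activation function: " + name)

print(get_activation("relu")(-2.0))  # 0.0

try:
    get_activation("my_function")    # custom name unknown to the registry
except ValueError as e:
    print(e)                         # Invalid activation function: my_function
```

Deserialization only has the name string from the YAML, so anything not in the lookup table fails exactly like this.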

@maxpumperla
Owner

Thanks. Is that maybe a problem specific to Keras, i.e. can you call .to_yaml() and model_from_yaml() outside of Elephas? Does that work for your layer with plain Keras?

@maxpumperla
Owner

I'm asking because get_from_module in Keras will probably only look into the activations submodule of core Keras.

@lenlen
Author

lenlen commented Apr 5, 2016

Thanks @maxpumperla, you are right: model_from_yaml with a custom activation function doesn't work in plain Keras either.
This is an example:

    # model is the Keras model
    # my_function is my activation function
    y_string = model.to_yaml()
    new_model = model_from_yaml(y_string, custom_objects={"my_function": my_function})

Maybe this is not the right way to load a model with custom objects... but I can't find a usage example in the Keras documentation.

@maxpumperla
Owner

Sure, no problem. If you really want to test this right now, fork Keras and put your custom code there, i.e. in your case put the new activation next to the other Keras activations. Then install your fork locally (cd your_keras && python setup.py install). This way there is no difference between custom and core.

Currently I see no quick way around this: you have to "register" your custom things somewhere, otherwise functionality like model_from_yaml simply has no way of knowing about them.
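The "register somewhere" idea can be sketched without forking Keras: merge the user's custom objects into the built-in lookup table before resolving names. The names below (BUILTINS, deserialize) are hypothetical, not the Keras or Elephas API:

```python
# Hypothetical registry-merge pattern: custom objects are supplied at
# deserialization time and consulted alongside the built-ins.
import math

BUILTINS = {"sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x))}

def deserialize(name, custom_objects=None):
    table = dict(BUILTINS)
    table.update(custom_objects or {})  # user-supplied entries extend the table
    if name not in table:
        raise ValueError("Unknown object: " + name)
    return table[name]

def my_function(x):
    return x * 2

# Built-in name resolves without extra help; custom name needs custom_objects.
fn = deserialize("my_function", custom_objects={"my_function": my_function})
print(fn(3))  # 6
```

This is essentially what the custom_objects parameter of model_from_yaml does: it widens the name lookup for the duration of one deserialization call.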

@maxpumperla
Owner

Feel free to close, if you like.

@lenlen lenlen closed this as completed Apr 5, 2016
@lenlen
Author

lenlen commented Apr 5, 2016

Ok thanks, I closed this issue.

@lenlen
Author

lenlen commented Apr 6, 2016

Hi @maxpumperla, I'm reopening this ticket because I have some news. With a custom layer instead of a custom activation function, reading from YAML works in Keras but not in Elephas. I think the problem is in spark_model.py, where model_from_yaml(...) is called during training: it needs the custom_objects parameter. Passing custom_objects directly in Keras works, for example:

    model = model_from_yaml(self.yaml, custom_objects={"MyLayer": MyLayer, "myLoss": myLoss})

@lenlen lenlen reopened this Apr 6, 2016
@deli4iled

I have the same problem. I created a custom layer but it doesn't work.

@maxpumperla
Owner

Alright, I will have to take a closer look at this and see how to get the custom_objects parameter properly into an Elephas model. Thanks for this, I'll get back to you.

@FelipeMarcelino

I have the same problem with a custom loss function. Does someone have a quick fix for this?

@danielenricocahall
Collaborator

I will look into this - I think it will just be a matter of ensuring custom_objects can be optionally supplied everywhere we load the model.

@danielenricocahall
Collaborator

This is officially resolved in 1.2.0 - you'll just need to supply custom_objects anywhere the model is loaded, e.g.:

    # define custom activation function
    def custom_activation(x):
        ...

    # define model
    model = Sequential()
    model.add(Dense(1, input_dim=1, activation=custom_activation))
    ...
    model.add(Dense(1, activation='sigmoid'))
    spark_model = SparkModel(model, frequency='epoch', mode=mode,
                             custom_objects={'custom_activation': custom_activation})

The same process applies to a custom layer. In the context of an estimator:

    # define custom activation function
    def custom_activation(x):
        ...

    # define model
    model = Sequential()
    model.add(Dense(1, input_dim=1, activation=custom_activation))

    estimator = ElephasEstimator()
    estimator.set_keras_model_config(model.to_yaml())
    estimator.set_optimizer_config(sgd_conf)
    ...
    estimator.set_epochs(10)
    estimator.set_batch_size(32)
    estimator.set_validation_split(0.01)
    estimator.set_custom_objects({'custom_activation': custom_activation})  # provide custom_objects via the setter
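Either way, the reason custom_objects must travel with the model is that each worker rebuilds the model from its serialized config, which stores only names, not function objects. A minimal stand-in for that round trip (pure Python with JSON in place of YAML; model_from_config and my_layer are illustrative, not the Elephas implementation):

```python
import json

def my_layer(x):
    return x + 1

# Serialization keeps only the name, as to_yaml does; functions don't serialize.
config = json.dumps({"activation": "my_layer"})

def model_from_config(config_str, custom_objects=None):
    objects = custom_objects or {}
    cfg = json.loads(config_str)
    name = cfg["activation"]
    if name not in objects:
        raise ValueError("Invalid activation function: " + name)
    return objects[name]

# On a worker: rebuilding succeeds only if custom_objects is supplied.
fn = model_from_config(config, custom_objects={"my_layer": my_layer})
print(fn(41))  # 42
```

Without the custom_objects mapping the worker-side rebuild hits exactly the "Invalid activation function" error reported at the top of this thread.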
