
relu_limited and risk_estimation definition #16

Open

mg64ve opened this issue Jan 10, 2019 · 1 comment
mg64ve commented Jan 10, 2019

Hello, nice code, congrats!
I have a question regarding the two functions in the title.
Can relu_limited and risk_estimation be defined inside the Python code instead of in $PYTHON_DIR/dist-packages/keras/losses.py and $PYTHON_DIR/dist-packages/keras/activations.py?
I want to run this in a Docker container, and it would be simpler if I could define them, for instance, in gossip.py.
Please let me know what you think.


mr8bit commented Jun 12, 2019

Hi @mg64ve,
You can add a few lines of code to gossip.py so that you don't have to change the Keras source:

from keras import backend as K
from keras.layers import Activation
from keras.utils.generic_utils import get_custom_objects

# Wrapper layer so the custom activation keeps a readable name when the model is serialized.
class ReluLimited(Activation):
    def __init__(self, activation, **kwargs):
        super(ReluLimited, self).__init__(activation, **kwargs)
        self.__name__ = 'ReluLimited'

# ReLU capped at 1, equivalent to the activation otherwise added to activations.py.
def relu_limited(x, alpha=0., max_value=1.):
    return K.relu(x, alpha=alpha, max_value=max_value)

# Register it so Activation('relu_limited') resolves by name.
get_custom_objects().update({'relu_limited': ReluLimited(relu_limited)})


# Custom loss, equivalent to the one otherwise added to losses.py.
def risk_estimation(y_true, y_pred):
    return -100. * K.mean((y_true - 0.0002) * y_pred)
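Once relu_limited is registered via get_custom_objects, it can be referenced by name like any built-in activation, and risk_estimation can be passed directly to compile. A minimal sketch to check the registration works (the Dense layer and input shape are just placeholders, not part of the repository):

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(1, input_shape=(10,)))
model.add(Activation('relu_limited'))   # resolved through get_custom_objects()
model.compile(loss=risk_estimation, optimizer='rmsprop')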

Then change the WindPuller initialization function so it uses the custom loss and activation:

def __init__(self, input_shape, lr=0.01, n_layers=2, n_hidden=8, rate_dropout=0.2,
             loss=risk_estimation):
    print("initializing..., learning rate %s, n_layers %s, n_hidden %s, dropout rate %s."
          % (lr, n_layers, n_hidden, rate_dropout))
    self.model = Sequential()
    self.model.add(Dropout(rate=rate_dropout, input_shape=(input_shape[0], input_shape[1])))
    for i in range(0, n_layers - 1):
        self.model.add(LSTM(n_hidden * 4, return_sequences=True, activation='tanh',
                            recurrent_activation='hard_sigmoid', kernel_initializer='glorot_uniform',
                            recurrent_initializer='orthogonal', bias_initializer='zeros',
                            dropout=rate_dropout, recurrent_dropout=rate_dropout))
    self.model.add(LSTM(n_hidden, return_sequences=False, activation='tanh',
                        recurrent_activation='hard_sigmoid', kernel_initializer='glorot_uniform',
                        recurrent_initializer='orthogonal', bias_initializer='zeros',
                        dropout=rate_dropout, recurrent_dropout=rate_dropout))
    self.model.add(Dense(1, kernel_initializer=initializers.glorot_uniform()))
    # self.model.add(BatchNormalization(axis=-1, moving_mean_initializer=Constant(value=0.5),
    #               moving_variance_initializer=Constant(value=0.25)))
    self.model.add(BatchRenormalization(axis=-1, beta_init=Constant(value=0.5)))
    # 'relu_limited' resolves through the get_custom_objects() registration above.
    self.model.add(Activation('relu_limited'))
    opt = RMSprop(lr=lr)
    self.model.compile(loss=loss,  # risk_estimation defined above
                       optimizer=opt,
                       metrics=['accuracy'])

This approach works in Google Colab.
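If you later reload a model that was saved with these customizations, the same objects have to be available at load time as well; a minimal sketch, assuming the model was saved to a placeholder file model.h5:

from keras.models import load_model

# Pass the custom loss and activation explicitly (or make sure the
# get_custom_objects() registration above has already run in this session).
model = load_model('model.h5',
                   custom_objects={'relu_limited': ReluLimited(relu_limited),
                                   'risk_estimation': risk_estimation})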
