
Updating arguments to Lambda Layer after compiling #8170

Closed
nicksam112 opened this issue Oct 17, 2017 · 4 comments

Comments

@nicksam112

Is there any way to pass new arguments to a Lambda layer after the model is compiled?

I'm looking to pass in a different external argument every time I run the model; however, it looks like the arguments are fixed once the model is compiled.

Any and all help would be appreciated, thank you.

@kgrm

kgrm commented Oct 20, 2017

No, Lambda layers are for applying parameter-free backend functions to input tensors.
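For context, a minimal sketch of that intended use (tf.keras assumed, not from this thread): the Lambda applies a fixed function with no trainable state and no extra arguments.

```python
import numpy as np
from tensorflow.keras import layers, Model

inputs = layers.Input(shape=(4,))
# A parameter-free backend function of the input tensor only
doubled = layers.Lambda(lambda t: t * 2.0)(inputs)
model = Model(inputs, doubled)

out = model.predict(np.ones((1, 4)), verbose=0)  # every element becomes 2.0
```

Anything beyond this (external arguments, state updated between runs) needs another mechanism, as the comments below discuss.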

@akshaychawla
Contributor

akshaychawla commented Oct 23, 2017

Deprecated: see my comment below.

You could create a new layer that implements the same lambda functionality. The external argument can be created as a Keras tensor and passed to the layer alongside the regular input; note that a bare K.placeholder cannot serve as a Model input, so create it with Input instead. Once you have defined your new layer, you add it to the model somewhat like this:

```python
external_argument = Input(shape=(1,))  # fed at run time, like any other model input
inputs = Input(shape=(784,))
x = Dense(64, activation='relu')(inputs)
x = Dense(64, activation='relu')(x)
# Keras layers expect a list of inputs when call() in the layer definition takes one
x = MyLayer()([x, external_argument])
predictions = Dense(10, activation='softmax')(x)

# declare the new external argument as an input when creating the model
model = Model(inputs=[inputs, external_argument], outputs=[predictions])
model.compile(optimizer='sgd', loss='categorical_crossentropy', metrics=['accuracy'])
```

Note that I haven't tested this, but it should work. Here is a list of sources:

  1. https://keras.io/layers/writing-your-own-keras-layers/ : read this for how to implement your layer. Note the call method in the layer definition: its x argument can be a list of tensors rather than a single tensor. In your case it will be the tensor coming from earlier in the graph AND the external_argument that you have defined as a Keras tensor.
  2. https://keras.io/getting-started/functional-api-guide/ : since you will use the functional API instead of the Sequential API.
  3. https://keras.io/backend/#using-the-abstract-keras-backend-to-write-new-code : a primer on using the Keras backend, especially the K.placeholder functionality.
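A rough sketch of the layer itself, following point 1 above (tf.keras assumed; MyLayer and the element-wise multiply are illustrative choices, not an existing Keras API):

```python
import tensorflow as tf
from tensorflow.keras import layers

class MyLayer(layers.Layer):
    def call(self, inputs):
        # call() receives a list: the regular tensor x and the external argument
        x, external_argument = inputs
        # any parameter-free computation using the externally supplied value
        return x * external_argument

layer = MyLayer()
x = tf.ones((2, 3))
ext = tf.constant([[2.0], [3.0]])  # one external value per sample
y = layer([x, ext])  # rows scaled by 2.0 and 3.0 respectively
```

The layer has no trainable weights; the external value flows in as data, so it can differ on every call.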

@axeper

axeper commented Nov 10, 2017

Has anyone succeeded in implementing the lambda layer as described by @akshaychawla?
I am currently facing the same issue as @nicksam112 and am quite lost about creating such a custom layer with placeholders.

Let's say I want a lambda layer that does out = a * in + b but a and b are modified at the end of each epoch using a callback. How would one go about that?

Any help would be really appreciated.

EDIT: I've received some help here.

@akshaychawla
Contributor

akshaychawla commented Nov 22, 2017

@axeper I wrote a small gist which implements the function out = a*x + b in a Lambda layer, where "a", "b", and "x" are all model inputs.
In the gist I generate the inputs (x, a, b) before training, but to get the behaviour you described you can have a class that contains both a Keras callback AND a Keras data generator. The data produced by the generator can be changed by the callback, which runs after every epoch.
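A minimal sketch of that callback-plus-generator pattern, assuming tf.keras (this is not the gist itself; the names coefs, batches, and UpdateCoefs are illustrative):

```python
import numpy as np
from tensorflow.keras import layers, Model, callbacks

coefs = {'a': 1.0, 'b': 0.0}  # shared state: the generator reads it, the callback writes it

x_in = layers.Input(shape=(1,))
a_in = layers.Input(shape=(1,))
b_in = layers.Input(shape=(1,))
axb = layers.Lambda(lambda t: t[1] * t[0] + t[2])([x_in, a_in, b_in])  # a*x + b
out = layers.Dense(1)(axb)  # something trainable downstream of the Lambda
model = Model([x_in, a_in, b_in], out)
model.compile(optimizer='sgd', loss='mse')

def batches(n=8):
    # yields (inputs, target); a and b are filled from the current shared state
    while True:
        x = np.random.rand(n, 1).astype('float32')
        a = np.full((n, 1), coefs['a'], dtype='float32')
        b = np.full((n, 1), coefs['b'], dtype='float32')
        yield (x, a, b), 2.0 * x  # dummy target for the sketch

class UpdateCoefs(callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        coefs['a'] += 0.5  # new values take effect from the next epoch's batches
        coefs['b'] += 0.1

model.fit(batches(), steps_per_epoch=2, epochs=2,
          callbacks=[UpdateCoefs()], verbose=0)
```

After two epochs the generator is emitting a = 2.0 and b = 0.2, without recompiling the model.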
