Updating arguments to Lambda Layer after compiling #8170
Comments
No, Lambda layers are for applying parameter-free backend functions to input tensors.
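For reference, a minimal sketch of that intended, stateless usage (Keras 2 functional API assumed; the backend function `K.square` is just an example):

```python
from keras import backend as K
from keras.layers import Input, Lambda
from keras.models import Model

inp = Input(shape=(4,))
# A parameter-free backend function wrapped in a Lambda: no weights,
# no external state, just a fixed tensor-in/tensor-out transformation.
out = Lambda(lambda x: K.square(x))(inp)
model = Model(inputs=inp, outputs=out)
```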
Deprecated: check my comment below. You could create a new layer that implements the same Lambda functionality. The external argument can be created as a Keras backend variable or placeholder (K.variable / K.placeholder) and passed to the layer. So when you have defined your new layer, you will add it to the model somewhat like this. Note that I haven't tested this, but it should work.
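A minimal sketch of that idea, assuming the Keras 2 functional API; the original comment's code isn't preserved, so the names (`a_var`, `inp`) and the scaling function are illustrative:

```python
import numpy as np
from keras import backend as K
from keras.layers import Input, Lambda
from keras.models import Model

# External argument held in a backend variable instead of a Python
# constant, so its value can be changed after the model is compiled.
a_var = K.variable(2.0)  # hypothetical name for the external argument

inp = Input(shape=(3,))
# The Lambda closes over a_var; updating the variable later changes
# the layer's behavior without rebuilding or recompiling the model.
out = Lambda(lambda x: a_var * x)(inp)
model = Model(inputs=inp, outputs=out)
model.compile(optimizer='sgd', loss='mse')

x = np.ones((1, 3))
print(model.predict(x))   # inputs scaled by 2.0

K.set_value(a_var, 5.0)   # update the external argument in place
print(model.predict(x))   # inputs now scaled by 5.0
```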
Has anyone succeeded in implementing the Lambda layer as described by @akshaychawla? Let's say I want a Lambda layer that computes out = a*x + b. Any help would be really appreciated. EDIT: I've received some help here.
@axeper I wrote a small gist which implements the function out = a*x + b in a Lambda layer, where "a", "b" and "x" are inputs.
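The gist itself isn't linked here, so below is a sketch of the same idea under that description: "a", "b" and "x" are all model inputs, so fresh values can be fed on every predict call (all names illustrative):

```python
import numpy as np
from keras.layers import Input, Lambda
from keras.models import Model

x = Input(shape=(3,))
a = Input(shape=(1,))
b = Input(shape=(1,))

# The Lambda receives a list of tensors; broadcasting applies the
# per-sample scale a and shift b across the features of x.
out = Lambda(lambda t: t[1] * t[0] + t[2])([x, a, b])
model = Model(inputs=[x, a, b], outputs=out)

xs = np.ones((2, 3))
a_vals = np.full((2, 1), 3.0)
b_vals = np.full((2, 1), 1.0)
print(model.predict([xs, a_vals, b_vals]))  # every entry: 3*1 + 1 = 4
```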
Is there any way to pass new arguments to a Lambda layer after a model is compiled?
I'm looking to pass in a different external argument every time I run the model; however, it looks like the arguments are locked in once the model is compiled.
Any and all help would be appreciated. Thank you!