I'm implementing a custom Lambda layer in Keras. According to my guide, it should take the previous layer's output (x) and scale it by a learning rate:

y2 = Lambda(lambda x: x / lr)(y1)

But this way, lr has to be specified ahead of time:

lr = 0.2
y2 = Lambda(lambda x: x / lr)(y1)
What I need instead is for lr to be a parameter that can vary on each iteration, something like the following:

y2 = Lambda(lambda x: x / lr)(y1)
...
lr = 1
for i in range(epochs):
    lr = lr / 1.0001
    loss = Model.train_on_batch(.....)

In other words, I need lr not to be fixed, so that it can change over time. Any help?
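One possible workaround (a sketch, not confirmed by this thread): keep the scalar in a `tf.Variable` instead of a plain Python float. The Lambda closure then reads the variable's current value on every forward pass, and the training loop can update it with `.assign()` between batches. The model architecture and data below are made up purely for illustration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

# Mutable scalar; trainable=False so the optimizer never touches it.
lr = tf.Variable(1.0, trainable=False, dtype=tf.float32)

inp = layers.Input(shape=(4,))
y1 = layers.Dense(4)(inp)
y2 = layers.Lambda(lambda x: x / lr)(y1)  # reads lr's value at call time
model = Model(inp, y2)
model.compile(optimizer="adam", loss="mse")

# Toy data, for illustration only.
x = np.random.rand(8, 4).astype("float32")
y = np.random.rand(8, 4).astype("float32")

for i in range(3):
    lr.assign(lr / 1.0001)            # change the scalar between batches
    loss = model.train_on_batch(x, y)  # forward pass sees the new value
```

An alternative with the same effect is to pass the scalar as a second model input (e.g. a shape-(1,) tensor fed alongside the data each batch), which avoids closing over external state entirely.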
Morz114114 changed the title from "Shifting a hyperparameter in a lambda layer" to "A hyperparameter in a lambda layer" on Oct 12, 2020.
@Morz114114 Moving this issue to closed status as there has been no recent activity. If you still face the error, please create a new issue. Thanks!