Sharing weights across layers in keras 3 [feature request] #18821
Comments
A simpler solution to your problem would be:

Nice!
This is now not possible by design in Keras 3 once the layer is built. See for instance keras-team/keras#18419 (comment), where the advice is to embed the layer whose weights we want to share. In our use case (reproducing a given model by splitting the activations into separate layers while keeping the weights synchronized with the original model), this is not a solution. We implemented this workaround, even though it relies on a private method. A feature request has been opened on the Keras 3 repo: keras-team/keras#18821
It does not work anymore from Keras 3.0.3 onwards.
We'll add a setter for the kernel.
Thanks!
The setter idea turned out to be problematic. What I would recommend is just direct setting. Ref: #19469
It seems that sharing weights after a layer is built is no longer possible in Keras 3. We should instead share layers, as explained here.
But I have a use case where I need to share a weight.
In my use case, I transform a model by splitting the activations out of each layer: a Dense(3, activation="relu") becomes a Dense(3) followed by an Activation layer. But I need the weights of the new layers to stay shared with the original model.
For now I have a solution, but it uses a private attribute, since by design this is currently not possible in Keras 3.
Here is an example that works for sharing the kernel (I will actually use something more generic to share any weight, but this is simpler to look at):
Notes:
`layer2.kernel = layer1.kernel` after build will raise an error because of the lock.