How to do implicit differentiation for neural network parameters? #23
Hi!

I'm trying to do implicit differentiation with respect to the parameters of a given neural network. For example, suppose `nn` is a neural network constructed with Flux.jl; I can get its parameters via `Flux.params(nn)`.

In this tutorial, I need to provide the parameters as the arguments of the forward solver function (here, it's `lasso(data::ComponentArray)`). But I don't know how to do this in my case, namely:

`optimization_prob(parameters) = ...?`

If I could overwrite the network parameters with `p = Flux.params(nn)`, then I would be able to do so. Is there any workaround for this?
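A minimal sketch of one possible bridge, assuming `Flux.destructure` is an option here: it flattens all of the network's parameters into a single vector and returns a closure that rebuilds the network from such a vector, so the solver can take a plain vector argument just like the tutorial's `lasso`. The body of `optimization_prob` below is a hypothetical stand-in for the real forward solver.

```julia
using Flux, Zygote

# A small network standing in for `nn` from the question.
nn = Chain(Dense(4 => 8, relu), Dense(8 => 1))

# Flatten all parameters into one vector `p0`; `re(p)` rebuilds the network
# from any vector `p` of the same length.
p0, re = Flux.destructure(nn)

# Hypothetical forward solver: any function of the flat parameter vector.
# Here it just evaluates the rebuilt network on fixed data.
x = randn(Float32, 4, 16)
optimization_prob(p) = sum(abs2, re(p)(x))

# The flat vector is an ordinary array, so it can be handed to an
# implicit-function forward solver or differentiated directly:
grad = Zygote.gradient(optimization_prob, p0)[1]
```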
Comments

Once you create the implicit function, you can nest it with other functions. I don't understand what you are trying to do if it's not a simple nesting of two functions. Just make sure the second function you are nesting with the implicit function is Zygote-compatible on its own first.
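For concreteness, a minimal sketch of such a nesting, assuming the two-argument `ImplicitFunction(forward, conditions)` constructor from the package tutorials (newer package versions may expect `forward` to also return a byproduct); the square-root problem is a hypothetical placeholder:

```julia
using ImplicitDifferentiation, Zygote

# Forward solver: computes y(x) by some black-box means (here just sqrt).
forward(x) = sqrt.(x)

# Optimality conditions that define y implicitly: conditions(x, y) == 0
# at the solution returned by the forward solver.
conditions(x, y) = y .^ 2 .- x

implicit = ImplicitFunction(forward, conditions)

# Nesting: compose the implicit function with an ordinary Zygote-compatible
# function and differentiate straight through the composition.
outer(x) = sum(abs2, implicit(x))
x = rand(3)
grad = Zygote.gradient(outer, x)[1]
```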
@JinraeKim can you please post a complete example here?

@mohamed82008 I may begin my side project this year, so if I figure it out or struggle with this, I will post a comment!
My workaround for this is to define the last layer of the neural network of interest (the one related to solving an optimization problem) with this package, and to treat the intermediate values of the network as the optimization parameters of that last layer; see #67.
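A rough sketch of that architecture, under the same assumed `ImplicitFunction` API; the trunk, the implicit problem, and the loss below are all hypothetical placeholders:

```julia
using Flux, ImplicitDifferentiation, Zygote

# Ordinary Flux trunk producing the intermediate values z.
trunk = Chain(Dense(4 => 8, relu), Dense(8 => 3))

# Implicit last layer: z plays the role of the optimization parameters.
# Placeholder problem with closed-form solution y = sqrt.(z .^ 2 .+ 1),
# characterized by the conditions below.
forward(z) = sqrt.(z .^ 2 .+ 1)
conditions(z, y) = y .^ 2 .- z .^ 2 .- 1
implicit_layer = ImplicitFunction(forward, conditions)

# Full model: the trunk's output feeds the implicit last layer.
model(x) = implicit_layer(trunk(x))

# Gradients w.r.t. the trunk's weights flow through the implicit layer.
x = randn(Float32, 4)
loss() = sum(model(x))
grads = Zygote.gradient(loss, Flux.params(trunk))
```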