How to do implicit differentiation for neural network parameters? #23

@JinraeKim

Hi!

I'm trying to do implicit differentiation with respect to the parameters of a given neural network.

For example, nn is a neural network constructed with Flux.jl, and I can get its parameters with Flux.params(nn).

In this tutorial, the parameters have to be provided as the argument of the forward solver function (there, it's lasso(data::ComponentArray)).

But I don't know how to do this in my case, i.e., how to define optimization_prob(parameters) = ...?

If I could write a parameter vector back into nn (the inverse of p = Flux.params(nn)), then I would be able to do something like:

function optimization_prob(parameters)  # will be provided by `Flux.params(nn)` outside this function
    load_parameters!(nn, parameters)  # hypothetical helper that writes `parameters` back into `nn`
    # etc...
end

Is there any workaround for this?
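
One direction I have considered (not yet verified against the tutorial) is Flux.destructure, which returns a flat vector of all trainable parameters together with a closure that rebuilds the network from such a vector. A minimal sketch of how the forward solver could then take the flat parameter vector as its argument; the architecture, dummy input, and objective below are just placeholders:

using Flux

nn = Chain(Dense(2 => 16, tanh), Dense(16 => 1))  # placeholder architecture

# `destructure` flattens the trainable parameters into `θ0` and returns `re`,
# a closure such that `re(θ)` rebuilds a network with parameters `θ` loaded.
θ0, re = Flux.destructure(nn)

function optimization_prob(parameters)
    model = re(parameters)       # network with `parameters` written back in
    x = randn(Float32, 2, 8)     # dummy input standing in for the real data
    return sum(abs2, model(x))   # placeholder objective; the real inner problem goes here
end

optimization_prob(θ0)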
