
Integration with Flux.jl #147

Closed
JinraeKim opened this issue Oct 27, 2021 · 4 comments

Comments

JinraeKim (Contributor) commented Oct 27, 2021

As a standard machine-learning tool, Flux.jl is widely used in Julia.

It would be really helpful if users could use DiffOpt.jl conveniently with Flux.jl.

A notable example is cvxpylayers.

If this is already possible, please share a simple and clear tutorial for this feature :)

joaquimg (Member) commented Dec 7, 2021

As you pointed out in #168, there is one in the ReLU example.

joaquimg closed this as completed Dec 7, 2021
JinraeKim (Contributor, Author) commented

Sorry, but I don't think the ReLU example shows that DiffOpt.jl is integrated with Flux.jl, since users still have to carefully write the `rrule` part by hand.
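For readers unfamiliar with what "writing the `rrule` part" involves: hooking a differentiable optimization layer into Flux means defining a `ChainRulesCore.rrule` for the solve, so that Zygote/Flux can backpropagate through it. The sketch below uses a deliberately trivial toy problem with a closed-form solution and derivative; `solve_layer` is a hypothetical name, and in a real integration the forward solve and the pullback's sensitivity would instead go through DiffOpt.jl's forward and reverse passes.

```julia
using ChainRulesCore

# Toy "optimization layer": x*(θ) = argmin_x (x - θ)^2, which has the
# closed-form solution x*(θ) = θ. A real layer would build and solve a
# parameterized program with DiffOpt.jl here instead.
solve_layer(θ) = θ

function ChainRulesCore.rrule(::typeof(solve_layer), θ)
    x_star = solve_layer(θ)
    # The pullback maps an output cotangent x̄ to the parameter
    # cotangent θ̄. For this toy problem dx*/dθ = 1, so θ̄ = x̄; in
    # general DiffOpt's reverse mode would compute this sensitivity.
    solve_layer_pullback(x̄) = (NoTangent(), x̄)
    return x_star, solve_layer_pullback
end
```

The point of the complaint above is that users must write this boilerplate themselves for every model; an integration package would generate it automatically.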

joaquimg (Member) commented Dec 8, 2021

An issue or PR with a concrete alternative design would be much appreciated.

JinraeKim (Contributor, Author) commented

> An issue or PR with a concrete alternative design would be much appreciated.

I see.
Thank you for your reply :)

Labels: none yet · no linked branches or pull requests · 2 participants