Hi, developers!
Thank you for this great package, as always. I'm really interested in DiffOpt.jl, but it's a bit hard to understand, so I'm glad to see more examples.
My observations are:
- The ReLU example realises a ReLU layer as a (convex) optimisation problem.
- The `rrule` in the example corresponds to the usual backpropagation rule, as typically seen in ML packages.
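To make the second observation concrete, here is a minimal NumPy sketch (not the DiffOpt.jl API; the function names `relu_via_projection` and `relu_vjp` are made up for illustration) showing why differentiating the ReLU optimisation problem recovers the standard backpropagation rule:

```python
import numpy as np

def relu_via_projection(y):
    # The QP  min_x ||x - y||^2  s.t.  x >= 0  has the closed-form
    # solution x* = max(y, 0): projection onto the nonnegative orthant.
    # This is what the convex-optimisation formulation of ReLU computes.
    return np.maximum(y, 0.0)

def relu_vjp(y, dx):
    # Differentiating the QP solution w.r.t. the input y gives
    # dx*/dy = 1 where y > 0 and 0 where y < 0, which is exactly
    # the usual ReLU backpropagation (vector-Jacobian product) rule.
    return dx * (y > 0)

y = np.array([-1.5, 0.5, 2.0])
x = relu_via_projection(y)           # forward pass: [0.0, 0.5, 2.0]
g = relu_vjp(y, np.ones_like(y))     # backward pass: [0.0, 1.0, 1.0]
```

If I understand correctly, DiffOpt.jl arrives at the same gradient by differentiating through the optimality conditions of the QP rather than through this closed-form shortcut.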
I wonder if I understood the ReLU example correctly.
Thanks!