Diffhask integration #61
I've resolved the current instance-resolution issues and am filling out the numeric hierarchy for diffhask. While one can define arbitrary operations, adding concrete numeric instances for … What's the sitch on the roadmap for numhask instances?
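For context, a minimal self-contained sketch of what a concrete instance against a NumHask-style hierarchy might look like. The `Additive`/`Multiplicative` classes below are defined inline to mimic numhask's split of `Num` into fine-grained classes (the real classes live in the numhask package and differ in detail), and the `Dual` type is a hypothetical example, not diffhask's actual representation:

```haskell
{-# LANGUAGE NoImplicitPrelude #-}
-- Sketch only: these classes imitate NumHask's fine-grained numeric
-- hierarchy (separate Additive/Multiplicative rather than a monolithic
-- Num). The real classes and method sets are in the numhask package.
module Main where

import Prelude (Double, IO, Show, print)
import qualified Prelude as P

class Additive a where
  zero :: a
  (+)  :: a -> a -> a

class Multiplicative a where
  one :: a
  (*) :: a -> a -> a

-- A dual number pairs a value with its derivative (tangent).
data Dual = Dual { primal :: Double, tangent :: Double } deriving Show

instance Additive Dual where
  zero = Dual 0 0
  Dual x dx + Dual y dy = Dual (x P.+ y) (dx P.+ dy)

instance Multiplicative Dual where
  one = Dual 1 0
  -- product rule: (fg)' = f'g + fg'
  Dual x dx * Dual y dy = Dual (x P.* y) (x P.* dy P.+ dx P.* y)

main :: IO ()
main = print (Dual 3 1 * Dual 3 1)  -- x*x at x=3 with dx=1: value 9, derivative 6
```

The point of the fine-grained hierarchy is that a type like `Dual` can lawfully inhabit only the classes whose operations it actually supports, instead of being forced through all of `Num`.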
Current status: totally forgotten. I've been getting baseline AD integration with backprop in place. Right now I'm trying to scope out what the NN library will look like with convolutions, which, I believe, needs a bit of reshaping (unintended pun) for a better AD experience. NumHask doesn't have any dependencies, as of …
This is slowly getting booted back up, and I'm adding @mitchellwrosen to this ticket. To start, you should check out Conal Elliott's "The simple essence of automatic differentiation". Added a followup and made this issue's title more specific.
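The paper's core move can be sketched in a few lines, assuming plain Haskell functions stand in for the paper's linear maps (the names `D`, `deriv`, `square`, and `scale` are illustrative, not from the paper or hasktorch): a differentiable function returns both a result and its derivative, and composition is exactly the chain rule.

```haskell
-- Sketch of the central idea in "The simple essence of automatic
-- differentiation": a differentiable map from a to b yields a value
-- together with a (linear) derivative map, composed via the chain rule.
-- The paper generalizes this over categories; here everything is
-- specialized to functions on Double for illustration.
newtype D a b = D (a -> (b, a -> b))

-- Composition is the chain rule: (g . f)' = g' . f'.
(.>) :: D b c -> D a b -> D a c
D g .> D f = D (\a -> let (b, f') = f a
                          (c, g') = g b
                      in (c, g' . f'))

-- Example primitives.
square :: D Double Double
square = D (\x -> (x * x, \dx -> 2 * x * dx))

scale :: Double -> D Double Double
scale k = D (\x -> (k * x, \dx -> k * dx))

-- Evaluate value and derivative at a point (tangent 1).
deriv :: D Double Double -> Double -> (Double, Double)
deriv (D f) x = let (y, f') = f x in (y, f' 1)

main :: IO ()
main = print (deriv (scale 3 .> square) 2)  -- 3*x^2 at x=2: (12.0, 12.0)
```

Because derivatives compose categorically here, the same structure can be instantiated for reverse mode by flipping the direction of the linear maps, which is what makes the approach attractive for a library like diffhask.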
Rename DataType to Dtype
AD is now handled by libtorch.
@o1lo01ol1o I know you are already working on this with diffhask so here's the official ticket to say "hasktorch needs this, please keep doing you." If you need anything, feel free to ping people here.
Followup ticket will be "Implement Learning to learn by gradient descent by gradient descent".