Add draft version of AutogradTensor #1942
Conversation
…r from a derivatives.yaml file.
In the last commit I wrote some functionality to generate grad_fn classes from a file called derivatives.yaml using build_gradients.py. It's similar to this file from PyTorch: https://github.com/pytorch/pytorch/blob/master/tools/autograd/derivatives.yaml, except that I'm generating a Python file. This structure should make it easier to construct gradient functions for Tensor methods. I think the next step is to generate methods that can be attached to Autograd.
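The generated output isn't shown in the PR, but the idea reads roughly like the sketch below. The YAML schema and the shape of the emitted class are assumptions for illustration (following PyTorch's derivatives.yaml style); only derivatives.yaml, build_gradients.py, and the notion of grad_fn classes come from the comment above.

```python
# Hypothetical derivatives.yaml entry, in the style of PyTorch's file
# (the exact schema read by build_gradients.py is an assumption):
#
#   - name: mul(self, other)
#     self: grad * other
#     other: grad * self
#
# A class that build_gradients.py might emit into the generated Python file:
class MulBackward:
    """Gradient function for element-wise multiplication."""

    def __init__(self, self_, other):
        # Save the inputs needed for the backward pass.
        self.self_ = self_
        self.other = other

    def gradient(self, grad):
        # One gradient per input, from the expressions in the YAML entry.
        return grad * self.other, grad * self.self_
```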
…able for an AutogradTensor method. Also add a few more gradient functions.
Making progress. This time I set it up so that AutogradTensor automatically checks whether a method has a gradient function defined in gradients.py. As we add gradients to derivatives.yaml, they'll automatically become available in the autograd system. Something I need help with is having …
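A minimal sketch of that lookup, assuming the generated module is importable as gradients and that grad_fn classes follow an AddBackward-style naming convention; both are assumptions, not confirmed by the PR:

```python
import gradients  # the generated gradients.py described above (path assumed)

def get_grad_fn(method_name):
    """Return the grad_fn class for a tensor method, or None if no
    gradient has been defined for it in derivatives.yaml yet."""
    grad_fn_name = method_name.capitalize() + "Backward"  # naming convention assumed
    return getattr(gradients, grad_fn_name, None)

# Usage: AutogradTensor could call this before dispatching a method,
# e.g. get_grad_fn("add") -> the AddBackward class, or None.
```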
…rs in the grad_fns themselves.
Thank you for leaving good comments here - makes it really easy to stay updated :)
Okay! Finally added tests for ~6 gradient functions. @iamtrask What's left to do to merge all this?
See #2095 for continuation of this PR
Adding AutogradTensor for performing backpropagation. The only function that works right now is addition. An example is shown in the notebook.
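The notebook itself isn't included here; the following is a minimal, self-contained sketch of addition-only backpropagation in the spirit of that description, not the PR's actual implementation (the real AutogradTensor wraps PySyft/PyTorch tensors, so all names and signatures below are illustrative):

```python
class AutogradTensor:
    def __init__(self, data, grad_fn=None):
        self.data = data
        self.grad = None
        self.grad_fn = grad_fn  # records how to route gradients backward

    def __add__(self, other):
        result = AutogradTensor(self.data + other.data)
        # d(a + b)/da = 1 and d(a + b)/db = 1, so the incoming gradient
        # flows unchanged to both operands.
        result.grad_fn = lambda grad: [(self, grad), (other, grad)]
        return result

    def backward(self, grad=1.0):
        self.grad = grad
        if self.grad_fn is not None:
            for tensor, g in self.grad_fn(grad):
                tensor.backward(g)

a = AutogradTensor(2.0)
b = AutogradTensor(3.0)
c = a + b
c.backward()
print(a.grad, b.grad)  # 1.0 1.0
```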