Diffhask integration #61

Closed · stites opened this issue Jan 19, 2018 · 5 comments
stites (Member) commented Jan 19, 2018

@o1lo01ol1o I know you are already working on this with diffhask so here's the official ticket to say "hasktorch needs this, please keep doing you." If you need anything, feel free to ping people here.

Followup ticket will be "Implement Learning to learn by gradient descent by gradient descent".

@stites stites created this issue from a note in hasktorch (In Progress) Jan 19, 2018
@stites stites added this to the hackage release - v0.1.0 milestone Jan 19, 2018
o1lo01ol1o (Collaborator) commented Jan 19, 2018

👍

o1lo01ol1o/diffhask#2

o1lo01ol1o (Collaborator) commented:

I've resolved the current instance-resolution issues and am filling out the numeric hierarchy for diffhask. While one can define arbitrary operations, adding concrete numeric instances for numhask, and container and numeric instances for numhask-array, will get you diffhask compatibility for the corresponding linear algebra operations.
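For concreteness, here is a minimal sketch of the kind of instances this refers to, using a hypothetical `Tensor` wrapper; the `Additive`/`Multiplicative` class names follow a recent numhask release and are an assumption here (the 0.2-era hierarchy was more granular):

```haskell
-- Sketch only: `Tensor` is a toy stand-in for a hasktorch tensor, and
-- the class names assume a recent numhask release.
import qualified Prelude as P
import NumHask (Additive (..), Multiplicative (..))

newtype Tensor = Tensor [P.Double]

-- Elementwise addition; the infinite-list `zero` acts as a broadcast
-- identity under zipWith (toy semantics only).
instance Additive Tensor where
  Tensor xs + Tensor ys = Tensor (P.zipWith (P.+) xs ys)
  zero = Tensor (P.repeat 0)

-- Elementwise (Hadamard) multiplication with a broadcast `one`.
instance Multiplicative Tensor where
  Tensor xs * Tensor ys = Tensor (P.zipWith (P.*) xs ys)
  one = Tensor (P.repeat 1)
```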

What's the sitch in the roadmap for numhask instances?

stites (Member Author) commented May 4, 2018

Current status: totally forgotten. I've been getting baseline AD integration with backprop in place. Right now I'm trying to scope out what the NN library will look like with convolutions which, I believe, needs a bit of reshaping (unintended pun) for a better AD experience.
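As a rough illustration of what the backprop library gives us (plain `Double`s here rather than hasktorch tensors, just to keep the sketch self-contained):

```haskell
import Numeric.Backprop (evalBP, gradBP)

-- BVar has a Num instance, so ordinary numeric code can be run
-- forwards with evalBP and differentiated (reverse mode) with gradBP.
main :: IO ()
main = do
  print (evalBP (\x -> x * x + 3 * x) (5 :: Double))  -- 40.0
  print (gradBP (\x -> x * x + 3 * x) (5 :: Double))  -- 13.0
```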

NumHask doesn't have any dependencies, as of >= 0.2.0.0, so hasktorch can have orphan instances directly added to the indefinites without too much overhead (I think). Tensors are already instances of Num, so I assume that about a quarter of the work is already done. Triaging which functions are pure and which are not, RE: #85, might pair well with the numhask instances -- opening an issue for that now.

stites (Member Author) commented Jul 11, 2018

This is slowly getting booted back up and I'm adding @mitchellwrosen to this ticket. To start, you should check out Conal Elliott's The simple essence of automatic differentiation.
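The core construction from that paper can be sketched in a few lines (simplified: the paper abstracts the derivative into an arbitrary linear-map category rather than a plain function):

```haskell
-- A differentiable function returns its result together with its
-- derivative at the input (an ordinary function standing in for a
-- linear map).
newtype D a b = D (a -> (b, a -> b))

-- Sequential composition is the chain rule: differentiate f at a,
-- g at (f a), and compose the derivatives.
compD :: D b c -> D a b -> D a c
compD (D g) (D f) = D $ \a ->
  let (b, f') = f a
      (c, g') = g b
  in  (c, g' . f')

-- Example primitive: squaring, whose derivative at x is dx -> 2*x*dx.
sqr :: Num a => D a a
sqr = D $ \x -> (x * x, \dx -> 2 * x * dx)
```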

Added a followup and made this issue's title more specific.

@stites stites changed the title Automatic differentiation prototype Diffhask integration Jul 11, 2018
@austinvhuang austinvhuang moved this from In Progress to Backlog in hasktorch Feb 1, 2019
stites pushed a commit to stites/hasktorch that referenced this issue Aug 4, 2019
stites (Member Author) commented Aug 4, 2019

AD is now handled by libtorch.

@stites stites closed this as completed Aug 4, 2019
hasktorch automation moved this from Backlog to Done Aug 4, 2019