What would it look like to add the backprop step as Python too? #36

Open
kwinkunks opened this issue May 11, 2023 · 0 comments
Labels: enhancement, help wanted, question

Comments

@kwinkunks (Member)

We have the forward pass, which already makes the nice point that the network is a function and the forward pass is cheap... but this does not reflect the hyperparameters directly, e.g. how regularization works.

What would it look like to add the training loop as a function as well? Too much?
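A minimal sketch of what that could look like, assuming a NumPy-style one-hidden-layer regression network trained with plain gradient descent. The function names (`forward`, `train`), the MSE loss, and the L2 penalty are illustrative choices, not the repo's existing code:

```python
# Hypothetical sketch: the training loop (forward + backprop + update) as a
# plain Python function, so hyperparameters like the learning rate and the
# L2 regularization strength appear explicitly in the signature.
import numpy as np


def forward(X, W1, b1, W2, b2):
    """Forward pass: one tanh hidden layer, linear output."""
    h = np.tanh(X @ W1 + b1)
    y_hat = h @ W2 + b2
    return y_hat, h


def train(X, y, n_hidden=8, lr=0.1, l2=1e-4, epochs=1000, seed=42):
    """Fit the network by gradient descent on MSE + L2 penalty."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    W1 = rng.normal(scale=0.1, size=(n_features, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.1, size=(n_hidden, 1))
    b2 = np.zeros(1)

    for _ in range(epochs):
        # Forward pass.
        y_hat, h = forward(X, W1, b1, W2, b2)

        # Backward pass: gradients of mean squared error, plus the
        # L2 term so the regularization hyperparameter is visible.
        err = y_hat - y.reshape(-1, 1)                 # (n_samples, 1)
        grad_W2 = h.T @ err / n_samples + l2 * W2
        grad_b2 = err.mean(axis=0)
        grad_h = (err @ W2.T) * (1.0 - h**2)           # tanh derivative
        grad_W1 = X.T @ grad_h / n_samples + l2 * W1
        grad_b1 = grad_h.mean(axis=0)

        # Gradient descent update.
        W1 -= lr * grad_W1
        b1 -= lr * grad_b1
        W2 -= lr * grad_W2
        b2 -= lr * grad_b2

    return W1, b1, W2, b2
```

One nice side effect of writing it this way is that `lr`, `l2`, and `epochs` sit right next to the weights in the code, so the role of regularization and the other hyperparameters is visible in the same place as the backprop arithmetic, rather than hidden inside a framework's fit call.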

kwinkunks added the enhancement, help wanted, and question labels on May 11, 2023

1 participant