
Add unit-aware gradients and corresponding documentation #66

Merged
chaoming0625 merged 4 commits into main from update
Nov 25, 2024

Conversation

@chaoming0625
Member

This pull request adds the autograd module, which provides unit-aware automatic differentiation, to the brainunit package. The key changes are new files implementing the autograd module, an update to the __init__.py file to expose the new module, and modifications to existing methods for better compatibility.

New autograd module:

  • brainunit/autograd/__init__.py: Added the autograd module with functions for automatic differentiation, such as value_and_grad, grad, vector_grad, jacobian, jacrev, jacfwd, and hessian.
  • brainunit/autograd/_hessian.py: Implemented the hessian function, which computes the Hessian of a function with physical unit awareness.
  • brainunit/autograd/_jacobian.py: Implemented jacrev, jacfwd, and jacobian functions for computing Jacobians with physical unit awareness.
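To illustrate what "physical unit awareness" means dimensionally, here is a minimal toy sketch (not brainunit's actual implementation; the `Quantity` class and `toy_grad` helper are hypothetical) of the rule these functions are described as enforcing: the gradient of `f` with respect to `x` carries the unit `unit(f(x)) / unit(x)`.

```python
from dataclasses import dataclass

# Toy sketch only -- NOT brainunit's code. It demonstrates the dimensional
# rule a unit-aware grad must obey: unit(df/dx) == unit(f(x)) / unit(x).

@dataclass(frozen=True)
class Quantity:
    value: float
    unit: str  # symbolic unit label in this sketch, e.g. "C", "s"

def toy_grad(f, x: Quantity, eps: float = 1e-6) -> Quantity:
    """Central finite difference; the result carries out_unit / in_unit."""
    lo = f(Quantity(x.value - eps, x.unit))
    hi = f(Quantity(x.value + eps, x.unit))
    dval = (hi.value - lo.value) / (2.0 * eps)
    return Quantity(dval, f"{hi.unit}/{x.unit}")

# Example: charge Q(t) = 3 t^2 coulomb as a function of time in seconds;
# its derivative is a current, so it should come out in "C/s".
def charge(t: Quantity) -> Quantity:
    return Quantity(3.0 * t.value ** 2, "C")

i = toy_grad(charge, Quantity(2.0, "s"))
# i.value is approximately 12.0 and i.unit == "C/s"
```

The same rule extends to the higher-order functions listed above: a Hessian entry carries `unit(f) / unit(x)**2`, and a Jacobian entry `unit(f_i) / unit(x_j)`.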

Updates to existing files:

  • brainunit/__init__.py: Imported the new autograd module and updated the __all__ list to include autograd. [1] [2]
  • brainunit/_base.py: Modified the reshape method signature to accept a single shape argument instead of variadic arguments.
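The reshape signature change can be sketched with a hypothetical stub (not brainunit's code): the variadic form takes the dimensions as separate positional arguments, while the new form takes one shape sequence, matching the call pattern `q.reshape((2, 3))` instead of `q.reshape(2, 3)`.

```python
# Hypothetical stubs contrasting the two call signatures.

class Before:
    def reshape(self, *shape):
        # variadic: called as obj.reshape(2, 3)
        return tuple(shape)

class After:
    def reshape(self, shape):
        # single shape argument: called as obj.reshape((2, 3))
        return tuple(shape)

old_result = Before().reshape(2, 3)    # (2, 3)
new_result = After().reshape((2, 3))   # (2, 3)
```

Both produce the same shape tuple; only the accepted calling convention differs.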

New tests:

@chaoming0625 chaoming0625 merged commit 7992d8e into main Nov 25, 2024
@chaoming0625 chaoming0625 deleted the update branch November 25, 2024 07:33
