simplegrad is a small automatic differentiation project built in Python on top of NumPy.
It provides a minimal Tensor type with gradient tracking, a reverse-mode backward pass, and a few core tensor operations intended for learning and experimentation.
- NumPy-backed tensor storage
- Optional gradient tracking with `requires_grad=True`
- Reverse-mode autodiff via `backward(output)`
- Basic elementwise operations:
  - addition
  - subtraction
  - multiplication
  - ReLU
- Matrix multiplication with `@`
- Reductions with `sum()` and `mean()`
- `softmax()` and `log()` for simple loss construction
- Small neural-network helpers in `simplegrad.nn`
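The core mechanism behind these features, reverse-mode autodiff, can be illustrated with a toy sketch in plain NumPy. The `Node` class below is hypothetical and for illustration only; simplegrad's actual `Tensor` in `simplegrad/tensor.py` differs in structure and scope.

```python
import numpy as np

class Node:
    """Toy tensor that records how it was produced, for reverse-mode autodiff."""
    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self.parents = parents          # nodes this one was computed from
        self.backward_fn = None         # propagates self.grad to parents

    def __add__(self, other):
        out = Node(self.data + other.data, (self, other))
        def bw():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out.backward_fn = bw
        return out

    def __mul__(self, other):
        out = Node(self.data * other.data, (self, other))
        def bw():
            self.grad += other.data * out.grad   # d(ab)/da = b
            other.grad += self.data * out.grad   # d(ab)/db = a
        out.backward_fn = bw
        return out

    def relu(self):
        out = Node(np.maximum(self.data, 0.0), (self,))
        def bw():
            self.grad += (self.data > 0) * out.grad  # gradient flows where input > 0
        out.backward_fn = bw
        return out

def backward(output):
    """Seed the output gradient with ones, then replay the graph in reverse."""
    topo, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for p in node.parents:
                visit(p)
            topo.append(node)
    visit(output)
    output.grad = np.ones_like(output.data)
    for node in reversed(topo):
        if node.backward_fn is not None:
            node.backward_fn()

x = Node([2.0, -1.0])
y = Node([3.0, 4.0])
w = ((x + y) * x).relu()
backward(w)
print(x.grad)   # gradient of relu((x + y) * x) w.r.t. x
print(y.grad)   # gradient w.r.t. y
```

Each operation records a closure that knows how to push gradients back to its inputs; `backward` then visits the graph in reverse topological order so every node's gradient is complete before it propagates further.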
- Python >= 3.13
- NumPy

Dependencies are declared in `pyproject.toml`.
With uv:

```sh
uv sync
```

Or with pip in a virtual environment:

```sh
python -m venv .venv
source .venv/bin/activate
pip install -e .
```

The repository includes a basic tensor/autodiff demo in `example.py`:
```python
from simplegrad.tensor import Tensor, backward

x = Tensor([2.0, -1.0], requires_grad=True)
y = Tensor([3.0, 4.0], requires_grad=True)
z = (x + y) * x
w = z.relu()
backward(w)
print(w)
print(x.grad)
print(y.grad)
```

Run it with:
```sh
uv run python example.py
```

It also includes `example-nn.py`, which trains a small two-layer neural network to learn XOR using `Linear`, `ReLU`, `SoftmaxLoss`, and `Adam`:
```sh
uv run python example-nn.py
```

The script prints the final loss, class predictions, accuracy, and output probabilities for the four XOR inputs.
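For the first demo, the expected values can be derived by hand: with `z = (x + y) * x`, the product rule gives `dz/dx = 2x + y` and `dz/dy = x`, and ReLU zeroes the gradient wherever `z <= 0`. A quick NumPy check, independent of simplegrad:

```python
import numpy as np

x = np.array([2.0, -1.0])
y = np.array([3.0, 4.0])

z = (x + y) * x               # forward: [10.0, -3.0]
w = np.maximum(z, 0.0)        # ReLU:    [10.0,  0.0]

mask = (z > 0).astype(float)  # ReLU passes gradient only where z > 0
x_grad = mask * (2 * x + y)   # d/dx[(x + y) * x] = 2x + y, masked: [7.0, 0.0]
y_grad = mask * x             # d/dy[(x + y) * x] = x,      masked: [2.0, 0.0]
```

These are the values `example.py` should print for `x.grad` and `y.grad`.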
Run the test suite with:
```sh
uv run pytest
```

If you are using an activated virtual environment instead of uv, `python -m pytest` also works.
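Tests for autodiff code commonly compare analytic gradients against finite differences. The sketch below shows that pattern in plain NumPy (it is not simplegrad's actual test code), using the same computation as `example.py` reduced to a scalar:

```python
import numpy as np

def f(x, y):
    """relu((x + y) * x), summed to a scalar so the gradient is well-defined."""
    return np.maximum((x + y) * x, 0.0).sum()

def numerical_grad(f, args, i, eps=1e-6):
    """Central-difference estimate of df/d(args[i]), one element at a time."""
    grad = np.zeros_like(args[i])
    for idx in np.ndindex(args[i].shape):
        plus = [a.copy() for a in args]
        minus = [a.copy() for a in args]
        plus[i][idx] += eps
        minus[i][idx] -= eps
        grad[idx] = (f(*plus) - f(*minus)) / (2 * eps)
    return grad

x = np.array([2.0, -1.0])
y = np.array([3.0, 4.0])
gx = numerical_grad(f, [x, y], 0)   # analytic: relu mask * (2x + y) = [7, 0]
gy = numerical_grad(f, [x, y], 1)   # analytic: relu mask * x = [2, 0]
```

A test would then assert that the autodiff gradients match these estimates to within a small tolerance (avoiding points where ReLU's input is exactly zero, since the derivative is undefined there).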
Main entry points:
- `simplegrad.tensor.Tensor`
- `simplegrad.tensor.backward`
- `simplegrad.tensor.no_grad`
Useful tensor methods and operators:
- `Tensor(..., requires_grad=True)`
- `a + b`
- `a - b`
- `a * b`
- `a @ b`
- `a.relu()`
- `tensor.sum(...)`
- `tensor.mean(...)`
- `tensor.softmax(axis=-1)`
- `tensor.log()`
- `tensor.zero_grad()`
- `tensor.detach()`
- `with no_grad(): ...`
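The `softmax()`/`log()` pair is enough to build a cross-entropy loss. In NumPy terms, a sketch of the underlying math (function names here are illustrative, not simplegrad's API):

```python
import numpy as np

def softmax(logits, axis=-1):
    # Subtract the row max before exponentiating for numerical stability;
    # this shift does not change the result.
    shifted = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=axis, keepdims=True)

def cross_entropy(logits, targets):
    """Mean negative log-probability of the target classes."""
    probs = softmax(logits, axis=-1)
    n = logits.shape[0]
    return -np.log(probs[np.arange(n), targets]).mean()

logits = np.array([[2.0, 0.5],
                   [0.1, 1.9]])
targets = np.array([0, 1])
loss = cross_entropy(logits, targets)
```

Fusing softmax and log into one loss (as a `SoftmaxLoss`-style module does) is the standard way to keep the backward pass simple and numerically stable.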
- `simplegrad/tensor.py`: `Tensor` implementation and backpropagation
- `simplegrad/ops.py`: operation-specific backward rules
- `simplegrad/function.py`: base `Function` abstraction
- `simplegrad/nn/`: minimal modules, losses, and optimizers
- `tests/`: pytest coverage for tensor, module, loss, and optimizer behavior
- `example.py`: small runnable demo
- `example-nn.py`: XOR training example using `simplegrad.nn`
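The split between `tensor.py`, `ops.py`, and `function.py` reflects a common autodiff design: each operation pairs a forward rule with a matching backward rule behind a shared base class. A hypothetical sketch of that pattern (names and signatures are illustrative, not simplegrad's actual classes):

```python
import numpy as np

class Function:
    """Base class: subclasses implement forward() and backward()."""
    def __init__(self):
        self.saved = ()            # values forward() stashes for backward()

    def forward(self, *inputs):
        raise NotImplementedError

    def backward(self, grad_out):
        raise NotImplementedError

class Mul(Function):
    def forward(self, a, b):
        self.saved = (a, b)        # backward needs both inputs
        return a * b

    def backward(self, grad_out):
        a, b = self.saved
        # One gradient per input: d(ab)/da = b, d(ab)/db = a
        return grad_out * b, grad_out * a

f = Mul()
out = f.forward(np.array([2.0, 3.0]), np.array([4.0, 5.0]))
ga, gb = f.backward(np.ones(2))    # gradients w.r.t. each input
```

Keeping the per-operation rules in one place makes it straightforward to add a new op: define its forward and backward, and the graph machinery in the tensor layer handles the rest.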
This is a compact educational implementation, not a full deep learning framework.