Micrograd++ is a minimalistic wrapper around NumPy that adds support for automatic differentiation. It also provides various composable classes ("layers") and other tools to simplify building neural networks.
Micrograd++ draws inspiration from Andrej Karpathy's awesome micrograd library, prioritizing simplicity and readability over speed. Unlike micrograd, which operates on scalars, Micrograd++ supports tensor inputs (specifically, NumPy arrays), making it possible to train larger networks.
Micrograd++ is not yet installable via pip. For now, clone the Micrograd++ repository into your home directory and make it importable in any script or notebook by first executing the snippet below:
```python
import os
import sys

# Make the locally cloned package importable.
sys.path.insert(0, os.path.expanduser("~/micrograd-pp/python"))
```
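Once the path is set up, you can import the package and differentiate through NumPy-backed tensors. The snippet below is only a hypothetical sketch: the module name `micrograd_pp` is inferred from the path above, and the `Parameter` wrapper, operator overloading, and `.grad` attribute are assumptions about the API (only `.backward` appears in the feature checklist below), so consult the repository's examples for the exact interface.

```python
import numpy as np

import micrograd_pp as mpp  # assumed module name, inferred from the path above

# Hypothetical usage sketch; `Parameter` and `.grad` are assumed names.
x = mpp.Parameter(np.random.randn(3, 3))
y = (x * x).sum()  # build a small computation graph over the tensor
y.backward()       # reverse-mode automatic differentiation
print(x.grad)      # expected to hold dy/dx = 2 * x
```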
You can then use Micrograd++ to, for example:

- Train a simple feedforward neural network on MNIST to classify handwritten digits (a rough sketch of such a network follows this list)
- Learn an n-gram model to generate text
- Train a decoder-only transformer to generate text
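To give a flavor of the first example, here is a hypothetical sketch of a tiny feedforward classifier assembled from the layer and optimizer classes listed in the feature checklist below (Linear, ReLU, Sequential, SGD). The constructor signatures, the `Constant` input wrapper, and the placeholder loss are assumptions rather than the library's documented API; the bundled MNIST example in the repository is the authoritative reference.

```python
import numpy as np

import micrograd_pp as mpp  # assumed module name

# Hypothetical sketch: class names come from the feature checklist below,
# but constructor signatures and helpers are assumptions.
model = mpp.Sequential(
    mpp.Linear(784, 128),
    mpp.ReLU(),
    mpp.Linear(128, 10),
)
opt = mpp.SGD(lr=0.1)  # assumed signature

x = mpp.Constant(np.random.randn(64, 784))  # fake mini-batch; `Constant` is assumed
logits = model(x)
loss = logits.sum()  # placeholder objective; a real example would use cross-entropy
loss.backward()      # reverse-mode autodiff, as listed in the checklist below
opt.step()           # apply the SGD update
```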
Feature checklist:

- Core
  - ☒ Reverse-mode automatic differentiation (`.backward`)
  - ☒ GPU support
- Layers
  - ☒ BatchNorm1d
  - ☒ Dropout
  - ☒ Embedding
  - ☒ LayerNorm
  - ☒ Linear
  - ☒ MultiheadAttention
  - ☒ ReLU
  - ☒ Sequential
- Optimizers
  - ☐ Adam
  - ☒ Stochastic Gradient Descent (SGD)