micrograd_pp

Micrograd++ is a minimalistic wrapper around NumPy which adds support for automatic differentiation. It also provides various composable classes ("layers") and other tools to simplify building neural networks.

Micrograd++ draws inspiration from Andrej Karpathy's awesome micrograd library, prioritizing simplicity and readability over speed. Unlike micrograd, which operates on scalar inputs, Micrograd++ supports tensor inputs (specifically, NumPy arrays), making it feasible to train larger networks.

Usage

Micrograd++ is not yet available on PyPI, so it cannot be installed with pip. For now, clone the Micrograd++ repository to your home directory and make it importable from any script or notebook by first executing the snippet below:

import os
import sys

sys.path.insert(0, os.path.expanduser("~/micrograd-pp/python"))
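With the path in place, the package can be imported and used directly. The sketch below shows reverse-mode differentiation on a tensor input; the wrapper class name (Parameter), the sum reduction, and the grad attribute are assumptions about the API rather than guaranteed names:

import numpy as np
import micrograd_pp as mpp  # module name inferred from the heading above

x = mpp.Parameter(np.random.randn(3))  # wrap a NumPy array (class name assumed)
y = (x * x).sum()                      # build a scalar-valued expression
y.backward()                           # reverse-mode autodiff
print(x.grad)                          # expected to equal 2 * x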

Examples

Features

  • Core
    • ☒ Reverse-mode automatic differentiation (.backward)
    • ☐ GPU support
  • Layers (composed in the sketch after this list)
    • ☒ BatchNorm1d
    • ☒ Dropout
    • ☒ Embedding
    • ☒ LayerNorm
    • ☒ Linear
    • ☒ MultiheadAttention
    • ☒ ReLU
    • ☒ Sequential
  • Optimizers
    • ☐ Adam
    • ☒ Stochastic Gradient Descent (SGD)
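To illustrate how the layer classes compose, here is a minimal forward/backward sketch. Constructor signatures are assumed to mirror the PyTorch classes of the same names, and the input wrapper (Constant) is likewise an assumption:

import numpy as np
import micrograd_pp as mpp

# A small MLP built from the composable layers listed above
model = mpp.Sequential(
    mpp.Linear(784, 64),
    mpp.ReLU(),
    mpp.Linear(64, 10),
)

x = mpp.Constant(np.random.randn(32, 784))  # batch of 32 inputs; wrapper name assumed
out = model(x)
loss = (out * out).sum()  # stand-in scalar loss, purely for illustration
loss.backward()           # gradients flow back through every layer
# An SGD step (see Optimizers above) would then update the parameters.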
