
Equinox is your one-stop JAX library, for everything you need that isn't already in core JAX:

  • neural networks (or more generally any model), with easy-to-use PyTorch-like syntax;
  • filtered APIs for transformations;
  • useful PyTree manipulation routines;
  • advanced features like runtime errors;

and best of all, Equinox isn't a framework: everything you write in Equinox is compatible with anything else in JAX or the ecosystem.

If you're completely new to JAX, then start with this CNN on MNIST example.

Coming from Flax or Haiku? The main difference is that Equinox (a) offers a lot of advanced features not found in these libraries, like PyTree manipulation or runtime errors; (b) has a simpler way of building models: they're just PyTrees, so they can pass across JIT/grad/etc. boundaries smoothly.
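For instance, PyTree manipulation and runtime errors look like this. (A minimal sketch: here `model` is any Equinox model with a `.bias` field and `x` is some array; both are stand-ins for illustration.)

import equinox as eqx
import jax.numpy as jnp

# "PyTree surgery": build a new model with one leaf replaced, leaving everything else untouched.
new_model = eqx.tree_at(lambda m: m.bias, model, jnp.zeros_like(model.bias))

# A runtime error check that works under jit: raises at runtime if the predicate is True.
x = eqx.error_if(x, jnp.isnan(x).any(), "x contained a NaN")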


Installation

pip install equinox

Requires Python 3.9+ and JAX 0.4.13+.


Documentation

Available at https://docs.kidger.site/equinox.

Quick example

Models are defined using PyTorch-like syntax:

import equinox as eqx
import jax

class Linear(eqx.Module):
    weight: jax.Array
    bias: jax.Array

    def __init__(self, in_size, out_size, key):
        wkey, bkey = jax.random.split(key)
        self.weight = jax.random.normal(wkey, (out_size, in_size))
        self.bias = jax.random.normal(bkey, (out_size,))

    def __call__(self, x):
        return self.weight @ x + self.bias

and fully compatible with normal JAX operations:

@jax.jit  # compile the whole computation for speed
@jax.grad  # differentiate with respect to the first argument (the model)
def loss_fn(model, x, y):
    pred_y = jax.vmap(model)(x)
    return jax.numpy.mean((y - pred_y) ** 2)

batch_size, in_size, out_size = 32, 2, 3
model = Linear(in_size, out_size, key=jax.random.PRNGKey(0))
x = jax.numpy.zeros((batch_size, in_size))
y = jax.numpy.zeros((batch_size, out_size))
grads = loss_fn(model, x, y)
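The grads returned above are themselves a Linear instance, with a gradient array for each parameter. As a minimal sketch (a hand-written SGD step with a made-up learning_rate, rather than a real optimiser such as Optax), applying them is a single tree_map:

learning_rate = 0.1  # illustrative value only
# grads has the same PyTree structure as model, so one tree_map performs the update.
model = jax.tree_util.tree_map(lambda p, g: p - learning_rate * g, model, grads)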

Finally, there's no magic behind the scenes. All eqx.Module does is register your class as a PyTree. From that point onwards, JAX already knows how to work with PyTrees.
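Concretely (a minimal sketch, reusing the eqx and jax imports from the quick example): ordinary PyTree utilities work on the model directly, and the filtered transformations mentioned above, such as eqx.filter_jit, handle models whose fields are not all JAX arrays.

# The Linear model above is an ordinary PyTree, so generic tree utilities apply:
leaves = jax.tree_util.tree_leaves(model)  # [weight array, bias array]

# Filtered transformations become useful once a model also holds non-array fields
# (activation functions, bools, ...): arrays are traced, everything else is treated as static.
@eqx.filter_jit
def evaluate(model, x):
    return jax.vmap(model)(x)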


If you found this library to be useful in academic work, then please cite: (arXiv link)

@article{kidger2021equinox,
    author={Patrick Kidger and Cristian Garcia},
    title={{E}quinox: neural networks in {JAX} via callable {P}y{T}rees and filtered transformations},
    year={2021},
    journal={Differentiable Programming workshop at Neural Information Processing Systems 2021}
}

(Also consider starring the project on GitHub.)

See also: other libraries in the JAX ecosystem

  • Optax: first-order gradient (SGD, Adam, ...) optimisers.
  • Diffrax: numerical differential equation solvers.
  • Lineax: linear solvers and linear least squares.
  • jaxtyping: type annotations for shape/dtype of arrays.
  • Eqxvision: computer vision models.
  • sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.
  • Levanter: scalable+reliable training of foundation models (e.g. LLMs).


Equinox is maintained by Patrick Kidger at Google X, but this is not an official Google product.