Releases: leonard-gleyzer/connex
Connex v0.3.3
Small changes to type annotations for Python 3.9 compatibility.
Connex v0.3.2
This release fixes a shape mismatch bug in applying neuron-level self-attention.
Connex v0.3.1
This release fixes a bug in applying adaptive activations, where previously the adaptive activation parameters were incorrectly `vmap`-ed.
Connex v0.3.0
The main changes are:
- Making `key` a keyword-only argument in a `NeuralNetwork`'s `__call__` method
- Removing `set_dropout_p` from the `NeuralNetwork` class: instead of `network.set_dropout_p(...)`, write `connex.set_dropout_p(network, ...)`
- Bumping the minimum Python version to 3.9
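The two API changes above follow a common functional pattern. Here is a minimal pure-Python sketch of that pattern (the `TinyNetwork` class and its fields are illustrative stand-ins, not connex's actual `NeuralNetwork` implementation):

```python
import dataclasses

# Hypothetical stand-in for a network object; names are illustrative only.
@dataclasses.dataclass(frozen=True)
class TinyNetwork:
    dropout_p: float = 0.0

    # Mirrors the v0.3.0 convention that `key` is keyword-only in __call__.
    def __call__(self, x, *, key=None):
        return [xi * (1.0 - self.dropout_p) for xi in x]

# Module-level functional update, replacing the removed method:
# network.set_dropout_p(...)  ->  connex.set_dropout_p(network, ...)
def set_dropout_p(network, p):
    # Return a new network rather than mutating the old one in place.
    return dataclasses.replace(network, dropout_p=p)

net = TinyNetwork()
net2 = set_dropout_p(net, 0.5)
out = net2([1.0, 2.0], key=None)  # key must be passed by keyword
```

Keeping `set_dropout_p` as a module-level function that returns a new network fits the immutable, functional style used throughout the JAX ecosystem.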
Connex v0.2.1
`set_dropout_p` has been moved to the plasticity module.
v0.2.0
This release is a major rewrite of the library, greatly improving memory efficiency and performance.
v0.1.4
Use `networkx` instead of `numpy` for topological batching, making it significantly faster.
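Topological batching groups the nodes of a DAG into "generations": every node in a generation depends only on nodes from earlier generations, so each batch can be processed together. A minimal sketch with `networkx` (the graph below is illustrative, not taken from connex):

```python
import networkx as nx

# Toy DAG: 0 -> 2, 1 -> 2, 2 -> 3, 1 -> 3
G = nx.DiGraph([(0, 2), (1, 2), (2, 3), (1, 3)])

# Each generation contains nodes whose predecessors all lie in
# earlier generations, i.e. one topological batch per generation.
batches = [sorted(gen) for gen in nx.topological_generations(G)]
print(batches)  # [[0, 1], [2], [3]]
```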
v0.1.3
- Added an option for group-wise output activations, e.g. softmax.
- Added the ability to export a network to a NetworkX graph for analysis/debugging.
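A group-wise output activation applies the activation independently within each group of output neurons rather than across the whole output vector. A minimal numpy sketch of group-wise softmax (the function name and group sizes are illustrative, not connex's API):

```python
import numpy as np

def groupwise_softmax(x, group_sizes):
    """Apply a numerically stabilized softmax within each output group."""
    out = []
    start = 0
    for size in group_sizes:
        g = x[start:start + size]
        e = np.exp(g - g.max())   # subtract max for numerical stability
        out.append(e / e.sum())
        start += size
    return np.concatenate(out)

# Two groups of sizes 2 and 3; each group sums to 1 independently.
y = groupwise_softmax(np.array([1.0, 1.0, 0.0, 0.0, 0.0]), [2, 3])
```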
v0.1.2
Internally changed topological batching to use `numpy` instead of `jax.numpy`, making it significantly faster.
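The speedup makes sense because topological batching is computed once, outside any jitted computation, so eager `numpy` avoids `jax.numpy`'s dispatch and tracing overhead. A Kahn-style sketch of topological batching in plain numpy (the adjacency matrix is illustrative, not connex's internal representation):

```python
import numpy as np

# adj[i, j] == True means an edge i -> j.
adj = np.array([
    [0, 0, 1, 0],   # 0 -> 2
    [0, 0, 1, 1],   # 1 -> 2, 1 -> 3
    [0, 0, 0, 1],   # 2 -> 3
    [0, 0, 0, 0],
], dtype=bool)

remaining = np.ones(4, dtype=bool)
batches = []
while remaining.any():
    # Nodes with no unprocessed predecessors form the next batch.
    indeg = (adj & remaining[:, None]).sum(axis=0)
    batch = np.flatnonzero(remaining & (indeg == 0))
    batches.append(batch.tolist())
    remaining[batch] = False

print(batches)  # [[0, 1], [2], [3]]
```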
v0.1.1
Internal changes, mainly restructuring the forward pass for performance.