Releases: leonard-gleyzer/connex

Connex v0.3.3

23 Jul 05:24

Small changes to types to comply with Python 3.9
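For context, the usual source of such incompatibilities is PEP 604's `X | Y` annotation syntax, which only works at runtime on Python 3.10+. A minimal illustration of the 3.9-compatible style (this is a generic example, not connex's actual diff):

```python
from typing import Optional, Union

# On Python 3.10+ this could be written `factor: float | None`.
# typing.Optional/Union work on Python 3.9 without a __future__ import.
def scale(x: float, factor: Optional[float] = None) -> Union[int, float]:
    """Multiply x by factor, defaulting to the identity."""
    return x if factor is None else x * factor
```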

Connex v0.3.2

06 Jun 22:57

This release fixes a shape mismatch bug in applying neuron-level self-attention.

Connex v0.3.1

06 Jun 21:33

This release fixes a bug in applying adaptive activations, where previously the adaptive activation parameters were incorrectly vmap-ed.

Connex v0.3.0

03 Jun 05:16

The main changes are:

  • Making key a keyword-only argument in a NeuralNetwork's __call__ method
  • Removing set_dropout_p from the NeuralNetwork class: Instead of network.set_dropout_p(...), write connex.set_dropout_p(network, ...).
  • Bumping the minimum Python version to 3.9
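The keyword-only change means a call like `network(x, key)` must become `network(x, key=key)`. A minimal sketch of the underlying Python mechanism (the class here is illustrative, not connex's `NeuralNetwork`):

```python
class Model:
    # Parameters after the bare `*` must be passed by keyword, so
    # `model(x, some_key)` raises TypeError; use `model(x, key=some_key)`.
    def __call__(self, x, *, key=None):
        return x if key is None else (x, key)
```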

Connex v0.2.1

27 May 20:13

set_dropout_p has been moved to the plasticity module.

v0.2.0

26 May 02:14

This release is a major rewrite of the library, greatly improving memory efficiency and performance.

v0.1.4

09 Jul 16:59

Topological batching now uses networkx instead of numpy, making it significantly faster.
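Topological batching groups the neurons of the network DAG into layers with no dependencies among their members, so each layer can be evaluated in parallel. A stdlib sketch of the idea, using level-by-level Kahn's algorithm (networkx provides the same grouping via `nx.topological_generations`; the edge format here is illustrative):

```python
from collections import defaultdict

def topological_batches(edges):
    """Group DAG nodes into layers whose members have no edges among
    themselves, so each layer can be processed in parallel."""
    indegree = defaultdict(int)
    successors = defaultdict(list)
    nodes = set()
    for u, v in edges:
        nodes.update((u, v))
        successors[u].append(v)
        indegree[v] += 1
    # The first batch is every node with no incoming edges.
    layer = sorted(n for n in nodes if indegree[n] == 0)
    batches = []
    while layer:
        batches.append(layer)
        nxt = []
        for u in layer:
            for v in successors[u]:
                indegree[v] -= 1
                if indegree[v] == 0:
                    nxt.append(v)
        layer = sorted(nxt)
    return batches
```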

v0.1.3

04 Jul 00:07
  • Added option for group-wise output activations, e.g. softmax.
  • Export network to NetworkX graph for network analysis/debugging.
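A group-wise output activation applies a function such as softmax over disjoint groups of output neurons rather than over the whole output vector. A minimal stdlib sketch (function name and grouping format are illustrative, not connex's API):

```python
import math

def groupwise_softmax(outputs, groups):
    """Apply softmax independently within each group of output indices;
    indices outside every group are left unchanged."""
    result = list(outputs)
    for group in groups:
        exps = [math.exp(outputs[i]) for i in group]
        total = sum(exps)
        for i, e in zip(group, exps):
            result[i] = e / total
    return result
```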

v0.1.2

30 Jun 18:37

Internally changed topological batching to use numpy instead of jax.numpy, making it significantly faster.

v0.1.1

30 Jun 17:19

Internal changes, mainly restructuring the forward pass for performance.