
v6.1.3: More neural network functions and training continuation

@honnibal released this 09 Jan 23:46

✨ Major features and improvements

  • NEW: Add several useful higher-order functions, including the @layerize and @metalayerize decorators, which turn plain functions into weightless layers (see the sketch after this list).
  • NEW: Add batch normalization layer.
  • NEW: Add residual layer using the pre-activation approach.
  • Simplify model setup and initialization.
  • Add ELU layer.
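
As an illustration, here is a minimal sketch of how a weightless layer might be written with the new @layerize decorator. The import path (thinc.api) and the (X, drop=...) / (output, backprop) calling convention are assumptions based on how later Thinc versions expose this decorator; check the package for the exact details in this release.

```python
# Sketch only: assumes @layerize is importable from thinc.api and wraps a
# function with the signature forward(X, drop=...) -> (output, backprop).
from thinc.api import layerize


@layerize
def relu(X, drop=0.0):
    """Illustrative weightless layer: elementwise ReLU with its backprop callback."""
    mask = X > 0
    Y = X * mask

    def backprop_relu(dY, sgd=None):
        # Gradient flows only through positions that were positive in the forward pass.
        return dY * mask

    return Y, backprop_relu
```

The resulting object should behave like any other layer, so it can be composed with weighted layers when building a model.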

🔴 Bug fixes

  • The AveragedPerceptron class can now continue training after a model has been loaded. Previously, the weights for a feature were zeroed as soon as that feature was updated after loading. This affected spaCy users, especially those adding new classes to the named entity recognizer.

📖 Documentation and examples