This repository has been archived by the owner on Apr 4, 2024. It is now read-only.

Releases: Benny-Nottonson/voodoo

v0.4.1-alpha

14 Jan 07:46

Overhauled the activation functions for efficiency and fixed several bugs. Removed the tests, since they needed to be reworked.

v0.4.0-alpha

13 Jan 15:44


The following activation functions are now supported; a sketch of the parameterized ones follows the list:

  1. ReLU (Rectified Linear Unit)
     • Parameters: negative_slope, max_value, threshold
  2. Sigmoid
     • No additional parameters
  3. Softmax
     • No additional parameters
  4. Softplus
     • No additional parameters
  5. Softsign
     • No additional parameters
  6. Tanh (Hyperbolic Tangent)
     • No additional parameters
  7. SELU (Scaled Exponential Linear Unit)
     • No additional parameters
  8. ELU (Exponential Linear Unit)
     • Parameters: alpha
  9. Exponential
     • No additional parameters
  10. Leaky ReLU
      • Parameters: alpha
  11. ReLU6
      • No additional parameters
  12. SiLU (Sigmoid Linear Unit)
      • No additional parameters
  13. Swish
      • No additional parameters
  14. GELU (Gaussian Error Linear Unit)
      • Parameters: approximate
  15. Hard Sigmoid
      • No additional parameters
  16. Linear
      • No additional parameters
  17. Mish
      • No additional parameters
  18. Log Softmax
      • No additional parameters
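
For reference, here is a minimal NumPy sketch of the math behind the parameterized activations above. Keras-style semantics for negative_slope, max_value, and threshold are assumed; voodoo's own Mojo signatures may differ.

```python
import math

import numpy as np

def relu(x, negative_slope=0.0, max_value=None, threshold=0.0):
    # Identity above the threshold, scaled slope below it,
    # optionally clipped at max_value (Keras-style semantics assumed).
    y = np.where(x >= threshold, x, negative_slope * (x - threshold))
    return y if max_value is None else np.minimum(y, max_value)

def elu(x, alpha=1.0):
    # ELU: x for x > 0, alpha * (exp(x) - 1) otherwise.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def gelu(x, approximate=False):
    # GELU: exact form via erf, or the common tanh approximation.
    if approximate:
        return 0.5 * x * (1.0 + np.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))
    return 0.5 * x * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))

print(relu(np.array([-2.0, 0.5, 3.0]), negative_slope=0.1, max_value=2.0))
# [-0.2  0.5  2. ]
```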

Loss Functions

The release also includes loss functions for a range of optimization objectives; a sketch of their formulas follows the list:

  1. Mean Squared Error (MSE)
  2. Mean Absolute Error (MAE)
  3. Mean Absolute Percentage Error (MAPE)
  4. Mean Squared Logarithmic Error (MSLE)
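
These objectives reduce to simple formulas; a minimal NumPy sketch is below (the eps guard in MAPE is an assumption here, not necessarily how voodoo handles zero targets):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean of squared differences.
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    # Mean of absolute differences.
    return np.mean(np.abs(y_true - y_pred))

def mape(y_true, y_pred, eps=1e-7):
    # Absolute error as a percentage of the target;
    # eps guards against division by zero (an assumption in this sketch).
    return 100.0 * np.mean(np.abs((y_true - y_pred) / np.maximum(np.abs(y_true), eps)))

def msle(y_true, y_pred):
    # Squared error in log space; log1p assumes non-negative values.
    return np.mean((np.log1p(y_true) - np.log1p(y_pred)) ** 2)
```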

v0.3.1-alpha

10 Jan 17:19

Fully completed the activation layers; more information can be found here.

v0.3.0-alpha

09 Jan 19:39

Rewrote the activation and arithmetic functions to use generics, sped up uses of alias, and made changes to Graph.

v0.2.6-alpha

08 Jan 04:07

Rewrote the Tensor struct for memory and speed efficiency, along with other small tweaks and bug fixes.

v0.2.5-alpha

04 Jan 17:03

Finalized the API for the Dense and Conv layers, and optimized training and the sample data.

v0.2.1-alpha

03 Jan 16:02

Added a complete CNN example designed for the MNIST dataset.

v0.2.0-alpha

02 Jan 02:42

Early, work-in-progress versions of the following layers have been added; a shape sketch of how they conventionally compose follows the list:

  • Dense (Linear)
  • Max Pool
  • Dropout
  • LeakyRelu
  • Conv2D
  • Flatten
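
To show how these layers conventionally compose, here is a shape walk-through in Python for an MNIST-sized input. The layer ordering and the filter count are illustrative assumptions, not voodoo's actual example code.

```python
def conv2d_out(h, w, kernel, stride=1, padding=0):
    # Standard convolution output-size formula.
    return ((h + 2 * padding - kernel) // stride + 1,
            (w + 2 * padding - kernel) // stride + 1)

h, w = 28, 28                      # MNIST-sized input
h, w = conv2d_out(h, w, kernel=3)  # Conv2D 3x3 -> 26 x 26
#                                    LeakyRelu is elementwise: shape unchanged
h, w = h // 2, w // 2              # Max Pool 2x2 -> 13 x 13
flat = h * w * 8                   # Flatten, assuming 8 conv filters
print(flat)                        # 1352 inputs for the Dense (Linear) layer
#                                    Dropout zeroes activations only in training
```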

v0.1.0-alpha

30 Dec 02:40
Added conv2.