Releases: Benny-Nottonson/voodoo
v0.4.1-alpha
Overhauled activation functions for efficiency and fixed several bugs; removed the tests, since they need to be reworked
v0.4.0-alpha
Release Notes - v0.4.0-alpha
The following activation functions are now supported:
- ReLU (Rectified Linear Unit)
  - Parameters: `negative_slope`, `max_value`, `threshold`
- Sigmoid
  - No additional parameters
- Softmax
  - No additional parameters
- Softplus
  - No additional parameters
- Softsign
  - No additional parameters
- Tanh (Hyperbolic Tangent)
  - No additional parameters
- SELU (Scaled Exponential Linear Unit)
  - No additional parameters
- ELU (Exponential Linear Unit)
  - Parameters: `alpha`
- Exponential
  - No additional parameters
- Leaky ReLU
  - Parameters: `alpha`
- ReLU6
  - No additional parameters
- SiLU (Sigmoid Linear Unit)
  - No additional parameters
- Swish
  - No additional parameters
- GELU (Gaussian Error Linear Unit)
  - Parameters: `approximate`
- Hard Sigmoid
  - No additional parameters
- Linear
  - No additional parameters
- Mish
  - No additional parameters
- Log Softmax
  - No additional parameters
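For reference, the parameterized activations above follow the conventional (Keras-style) definitions. The sketch below is a plain NumPy illustration of that math only; the function names, defaults, and signatures are assumptions for illustration and are not voodoo's API.

```python
import math
import numpy as np

def relu(x, negative_slope=0.0, max_value=None, threshold=0.0):
    # Keras-style ReLU: identity above `threshold`, `negative_slope` * (x - threshold)
    # below it, optionally clipped at `max_value`.
    out = np.where(x >= threshold, x, negative_slope * (x - threshold))
    if max_value is not None:
        out = np.minimum(out, max_value)
    return out

def elu(x, alpha=1.0):
    # ELU: x for x >= 0, alpha * (exp(x) - 1) for x < 0.
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def leaky_relu(x, alpha=0.2):
    # Leaky ReLU: x for x >= 0, alpha * x for x < 0 (default alpha is illustrative).
    return np.where(x >= 0, x, alpha * x)

def gelu(x, approximate=True):
    # GELU: x * Phi(x). The tanh form is the common fast approximation;
    # the exact form uses the Gaussian CDF via erf.
    if approximate:
        return 0.5 * x * (1.0 + np.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
    return 0.5 * x * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))
```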
Loss Functions
The release also includes various loss functions to cater to different optimization objectives:
- Mean Squared Error (MSE)
- Mean Absolute Error (MAE)
- Mean Absolute Percentage Error (MAPE)
- Mean Squared Logarithmic Error (MSLE)
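These names correspond to the standard regression losses. A minimal NumPy sketch of the conventional definitions follows; it illustrates the math only and does not reflect voodoo's function signatures (the `eps` guard in MAPE is an assumption for this sketch).

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean Squared Error: average of squared differences.
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    # Mean Absolute Error: average of absolute differences.
    return np.mean(np.abs(y_true - y_pred))

def mape(y_true, y_pred, eps=1e-7):
    # Mean Absolute Percentage Error, in percent; `eps` guards against division by zero.
    return 100.0 * np.mean(np.abs((y_true - y_pred) / np.maximum(np.abs(y_true), eps)))

def msle(y_true, y_pred):
    # Mean Squared Logarithmic Error: MSE computed in log1p space.
    return np.mean((np.log1p(y_true) - np.log1p(y_pred)) ** 2)
```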
v0.3.1-alpha
Fully completed the activation layers; more information can be found here
v0.3.0-alpha
Rewrote activations and arithmetic functions to use generics; speedups from use of `alias`; changes to `Graph`
v0.2.6-alpha
Rewrote the Tensor struct for memory and speed efficiency; other small tweaks and bug fixes
v0.2.5-alpha
Finalized the API for Dense and Conv layers; optimized training and sample data
v0.2.1-alpha
Added a complete CNN example designed for the MNIST dataset
v0.2.0-alpha
Very early, work-in-progress versions of the following layers have been added:
- Dense (Linear)
- Max Pool
- Dropout
- LeakyRelu
- Conv2D
- Flatten
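For context, these names refer to the standard building blocks. The NumPy sketch below shows what a few of them compute in the usual NHWC convention (Conv2D and LeakyRelu are omitted for brevity); it is not voodoo's API, and the names and defaults are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    # Dense (Linear): affine transform of a batch of flat feature vectors.
    return x @ w + b

def flatten(x):
    # Flatten: collapse every non-batch dimension into one.
    return x.reshape(x.shape[0], -1)

def dropout(x, rate=0.5, training=True):
    # Dropout: randomly zero activations during training, scaled so the
    # expected value is unchanged (inverted dropout).
    if not training or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

def max_pool2d(x, size=2):
    # Max Pool: maximum over non-overlapping `size` x `size` spatial windows.
    n, h, w, c = x.shape
    x = x[:, : h // size * size, : w // size * size, :]
    x = x.reshape(n, h // size, size, w // size, size, c)
    return x.max(axis=(2, 4))
```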
v0.1.0-alpha
conv2 added