Contains different kinds of helper functions and classes. Implements abstract classes as support for the other algorithms - for example, implementing the score() function for all classifiers and regressors. Both possible use cases for Machine Learning algorithms are covered: the abstract implementation of a standard Classifier and of a standard Regressor, both used for Supervised Learning. Similar to sklearn, the following functionalities are implemented:
- _classifier - the standard Classifier with train(), predict() and score()
- _regressor - the standard Regressor with train(), predict() and score()
Furthermore, added functionality:
- convertSeconds() - converts seconds into hours, minutes and seconds
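A minimal sketch of what convertSeconds() might look like (the function name follows the list above; the exact return format is an assumption):

```python
def convertSeconds(seconds):
    # split a raw second count into hours, minutes and remaining seconds
    hours, rest = divmod(seconds, 3600)
    minutes, secs = divmod(rest, 60)
    return hours, minutes, secs
```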
Implements several loss/activation functions (for deep_learning) and the metrics for the score() evaluation. The following functionality is implemented here:
- get_activation_function() - provides different kinds of activation functions for the Layers of a Neural Network. A short description of activation functions can be found here. The following ones are implemented:
  - Sigmoid
  - Tangens Hyperbolicus
  - Rectified Linear Unit
  - Leaky Rectified Linear Unit
  - Soft-Max
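The listed activation functions could be implemented along these lines (a NumPy sketch; the real signature of get_activation_function() may differ):

```python
import numpy as np

def sigmoid(x):
    # squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Tangens Hyperbolicus, output in (-1, 1)
    return np.tanh(x)

def relu(x):
    # Rectified Linear Unit: zero for negative inputs, identity otherwise
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # like ReLU, but with a small slope for negative inputs
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # turns a vector of scores into a probability distribution
    shifted = x - np.max(x)  # subtract the max for numerical stability
    exps = np.exp(shifted)
    return exps / np.sum(exps)
```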
- loss_function() - provides different kinds of loss functions used for the training of a Neural Network. A short introduction to loss functions can be found here. The following of the presented functions are implemented:
  - Mean squared Error - for Regression
  - Mean absolute Error - for Regression
  - Mean squared logarithmic Error - for Regression
  - Hinge - for binary classification
  - Squared Hinge - for binary classification
  - Cross Entropy - for binary classification, here called Binary Cross Entropy
  - Categorical Cross Entropy - for multi-class classification, here called Multi-Class Cross Entropy
  - Kullback-Leibler Divergence - for multi-class classification

  Furthermore, added are:
  - L1-norm Error - for Regression; same as Mean absolute Error without normalizing
  - L2-norm Error - for Regression; same as Mean squared Error without normalizing
  - Root mean squared Error - for Regression
  - Huber - for Regression
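A few of the listed losses, sketched with NumPy (illustrative implementations; the library's own code may clip or normalize differently):

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def mean_absolute_error(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def l2_norm_error(y_true, y_pred):
    # same as MSE, but without dividing by the number of samples
    return np.sum((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    p = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def huber(y_true, y_pred, delta=1.0):
    # quadratic for small errors, linear for large ones
    err = y_true - y_pred
    small = np.abs(err) <= delta
    return np.mean(np.where(small, 0.5 * err ** 2,
                            delta * (np.abs(err) - 0.5 * delta)))
```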
- calc_rates() - calculates the true positives, false positives, true negatives and false negatives for each class → What are these rates?
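These per-class counts are presumably computed in a one-vs-rest fashion; a plain-Python sketch (the signature is an assumption):

```python
def calc_rates(y_true, y_pred, classes):
    # for each class c, treat c as "positive" and every other label as "negative"
    rates = {}
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        tn = sum(1 for t, p in zip(y_true, y_pred) if t != c and p != c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        rates[c] = {"tp": tp, "fp": fp, "tn": tn, "fn": fn}
    return rates
```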
- classifier_score() - calculates possible evaluation metrics - described here - for a Classifier, including:
  - Recall
  - Precision
  - Accuracy
  - F1 score
  - Balanced Accuracy - similar to Accuracy, but accounting for an unbalanced number of samples across the different classes
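From the true/false positive/negative counts, the listed metrics follow directly. A sketch with the standard formulas (the helper names are assumptions, not the library's API):

```python
def precision(tp, fp):
    # of everything predicted positive, how much really was positive
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp, fn):
    # of everything really positive, how much was found
    return tp / (tp + fn) if tp + fn else 0.0

def f1_score(tp, fp, fn):
    # harmonic mean of precision and recall
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r) if p + r else 0.0

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

def balanced_accuracy(per_class_rates):
    # mean of the per-class recalls, so small classes weigh
    # as much as large ones
    recalls = [recall(r["tp"], r["fn"]) for r in per_class_rates.values()]
    return sum(recalls) / len(recalls)
```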
- regressor_score() - calculates possible evaluation metrics - described above - for a Regressor, including:
  - L1-norm
  - L2-norm
  - Mean squared Error
  - Mean absolute Error
  - Root mean squared Error
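regressor_score() could bundle these five metrics like so (a NumPy sketch; returning a dict is an assumption about the real interface):

```python
import numpy as np

def regressor_score(y_true, y_pred):
    err = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    mse = np.mean(err ** 2)
    return {
        "l1_norm": np.sum(np.abs(err)),            # MAE without normalizing
        "l2_norm": np.sum(err ** 2),               # MSE without normalizing
        "mean_absolute_error": np.mean(np.abs(err)),
        "mean_squared_error": mse,
        "root_mean_squared_error": np.sqrt(mse),
    }
```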
Implements classes and functions for data preprocessing:
- MinMaxScaler - implements a MinMaxScaler that scales all given data into the range from 0 to 1, similar to MinMaxScaler by sklearn. Here you will find a brief introduction into the meaning of Scaling, Standardizing and Normalizing.
- train_test_split - splits arrays or matrices into random train and test subsets, similar to train_test_split by sklearn. Why splitting your dataset into train and test samples is important can be found here.
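Both preprocessing helpers can be sketched in plain Python; the fit/transform interface and the parameters are assumptions modeled on the sklearn counterparts:

```python
import random

class MinMaxScaler:
    """Scale each feature column into [0, 1] based on the fitted min/max."""

    def fit(self, X):
        self.mins = [min(col) for col in zip(*X)]
        self.maxs = [max(col) for col in zip(*X)]
        return self

    def transform(self, X):
        return [[(v - lo) / (hi - lo) if hi > lo else 0.0
                 for v, lo, hi in zip(row, self.mins, self.maxs)]
                for row in X]

def train_test_split(X, y, test_size=0.25, seed=None):
    # shuffle indices, then cut into a train part and a test part
    indices = list(range(len(X)))
    random.Random(seed).shuffle(indices)
    cut = int(len(X) * (1 - test_size))
    train_idx, test_idx = indices[:cut], indices[cut:]
    return ([X[i] for i in train_idx], [X[i] for i in test_idx],
            [y[i] for i in train_idx], [y[i] for i in test_idx])
```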