This library provides a set of basic functions for different types of deep learning algorithms in C. It is constantly updated.

Learning-Lab-C-Library:

Creating the library for Linux users:

sh create_library.sh

Current Roadmap:

  • fully-connected-layers feed forward (20/11/2018)
  • fully-connected-layers backpropagation (20/11/2018)
  • nesterov momentum (20/11/2018)
  • adam optimization algorithm (20/11/2018)
  • fully-connected layers dropout (20/11/2018)
  • convolutional layers feed forward (20/11/2018)
  • convolutional layers backpropagation (20/11/2018)
  • convolutional 2d max-pooling (20/11/2018)
  • convolutional 2d average pooling (20/11/2018)
  • convolutional 2d local response normalization (20/11/2018)
  • convolutional padding (20/11/2018)
  • fully-connected sigmoid activation (20/11/2018)
  • fully-connected relu activation (20/11/2018)
  • fully-connected softmax activation (20/11/2018)
  • fully-connected tanh activation (20/11/2018)
  • mse loss (20/11/2018)
  • cross-entropy loss (20/11/2018)
  • reduced cross-entropy form with softmax (20/11/2018)
  • convolutional sigmoid activation (20/11/2018)
  • convolutional relu activation (20/11/2018)
  • convolutional tanh activation (20/11/2018)
  • residual layers filled with convolutional layers (20/11/2018)
  • residual layers feed-forward (20/11/2018)
  • residual layers backpropagation (20/11/2018)
  • model structure with fully-connected, convolutional, residual layers (20/11/2018)
  • fixed residual issues (22/11/2018)
  • adam algorithm for model_update (22/11/2018)
  • size_of_model(model* m) function (23/11/2018)
  • L2 Regularization (27/11/2018)
  • Manual Batch normalization feed forward and backpropagation (27/11/2018)
  • fixed further residual issues (28/11/2018)
  • Manual Xavier Initialization (28/11/2018)
  • Clipping gradient (29/1/2019)
  • Convolutional Layers with only pooling function (30/1/2019)
  • Leaky ReLU activation function (30/1/2019)
  • Batch Normalization final mean and variance for feed forward output (1/2/2019)
  • Decision Tree structure (3/2/2019)
  • LSTM feed forward (13/5/2019)
  • LSTM back propagation (13/5/2019)
  • Recurrent Network (rmodel) with LSTM (13/5/2019)
  • Recurrent Network update with nesterov and adam algorithm (13/5/2019)
  • Group Normalization for convolutional layers (4/7/2019)
  • Residual LSTM cell (18/7/2019)
  • Group Normalization for lstm layer (21/7/2019)
  • Huber Loss (23/7/2019)
  • Variational Auto Encoder for model structures (23/7/2019)
  • VAE model feedforward and back propagation (23/7/2019)
  • Modified Huber Loss (26/7/2019)
  • Focal Loss (18/8/2019)
  • Rectified Adam (22/8/2019)
  • GAN model (28/8/2019)
  • Confusion matrix and accuracy array (30/9/2019)

Future implementations

  • Variational auto encoder for rmodel* structure coming soon...
  • ImageNet architecture coming soon...
  • GAN for rmodel structure coming soon...