Autoencoder implementation

Example output from the autoencoder with Adam optimiser written in C:

Original input (28x28):

[Image: Original]

Decoded output (16x1) ⟶ (28x28):

[Image: Compressed]

Description

This project implements an autoencoder (AE) trained on the MNIST dataset. The AE uses no convolutional layers; the encoder and the decoder each consist of two dense layers. A variant of the Adam optimiser is used in each implementation.
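From the figures above, the shapes work out to a 784-unit (28x28) input compressed to a 16-dimensional code. A minimal C sketch of this layer layout follows; the hidden width of 128 is an assumption, as the actual value is not stated in this README:

```c
#include <stddef.h>

/* Sketch of the AE layer shapes. The 28x28 input and the
 * 16-dimensional code come from this README; the hidden
 * width of 128 is an assumption. */
typedef struct {
    size_t in, out;   /* layer dimensions */
    double *W, *b;    /* weights (out x in) and biases (out) */
} Dense;

typedef struct {
    Dense enc[2];     /* encoder: 784 -> 128 -> 16 */
    Dense dec[2];     /* decoder: 16 -> 128 -> 784 */
} Autoencoder;
```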

The source code is written in Python (PyTorch/NumPy), Cython, and C; training execution times across the three implementations will be compared soon.

Current progress:

  • AE implemented with the Adam optimiser in C (a sketch of the update step follows this list)
  • Batch training and composite matrix operations implemented in C
  • AE implemented in PyTorch and Cython with the DEMON Adam optimiser
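The Adam optimiser mentioned above boils down to a per-parameter update built from running first- and second-moment estimates of the gradient. A minimal sketch of one step in C; variable names and hyperparameters are illustrative, not taken from this repository's source:

```c
#include <stddef.h>
#include <math.h>

/* One Adam step over a parameter array w of length n at timestep t
 * (t starts at 1). m and v hold the running first and second moments. */
void adam_step(double *w, const double *grad, double *m, double *v,
               size_t n, int t, double lr,
               double beta1, double beta2, double eps)
{
    double bc1 = 1.0 - pow(beta1, t);   /* bias corrections */
    double bc2 = 1.0 - pow(beta2, t);
    for (size_t i = 0; i < n; i++) {
        m[i] = beta1 * m[i] + (1.0 - beta1) * grad[i];
        v[i] = beta2 * v[i] + (1.0 - beta2) * grad[i] * grad[i];
        double mhat = m[i] / bc1;
        double vhat = v[i] / bc2;
        w[i] -= lr * mhat / (sqrt(vhat) + eps);
    }
}
```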

Future aims:

  • Add DEMON (decaying momentum) to the Adam optimiser in the C source code (see the schedule sketch after this list).
  • Implement a disentangled VAE in all languages.
  • Test and compare execution times.
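For the first aim, DEMON (Chen et al., "Decaying Momentum") replaces Adam's fixed beta1 with a schedule that decays the momentum contribution over the course of training. A sketch of that schedule as it might look in the C port; the function name and parameters are illustrative:

```c
/* DEMON schedule for Adam's beta1, following Chen et al. (2019):
 * beta_t = beta_init * f / ((1 - beta_init) + beta_init * f),
 * where f = 1 - t/T is the remaining fraction of training.
 * Assumes 0 <= t <= t_total and t_total > 0. */
double demon_beta1(double beta1_init, int t, int t_total)
{
    double f = 1.0 - (double)t / (double)t_total;
    return beta1_init * f / ((1.0 - beta1_init) + beta1_init * f);
}
```

The schedule starts at beta1_init and decays to zero at the final step, so the C Adam step above could simply be called with demon_beta1(...) in place of a constant beta1.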
