Autoencoder dimensionality reduction, EMD/Manhattan metric comparison, and classifier-based clustering on the MNIST dataset.
Updated Mar 5, 2021 - C++
Use autoencoders to facilitate indexing of high-dimensional data (C++, LibTorch)
Comparison of multiple methods for calculating the similarity of MNIST hand-written digits.
Three-part project: A. bottleneck autoencoder, B. Manhattan distance, C. Earth Mover's Distance
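For reference, the Manhattan (L1) metric named in the project above is simply the sum of absolute per-pixel differences between two flattened images. A minimal generic sketch (not code from the listed repository, which is not shown here):

```cpp
#include <cstdlib>
#include <vector>

// Manhattan (L1) distance between two flattened images of equal length:
// the sum of absolute differences of corresponding pixel intensities.
long manhattan(const std::vector<int>& a, const std::vector<int>& b) {
    long dist = 0;
    for (std::size_t i = 0; i < a.size(); ++i)
        dist += std::abs(a[i] - b[i]);
    return dist;
}
```

For 28x28 MNIST images the vectors would have 784 entries; identical images yield a distance of 0.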
C++ Deep Learning Library (DLFS-TUM)
Reducing MNIST image data dimensionality by extracting the latent-space representations of an autoencoder model, and comparing these latent-space representations to the default MNIST representation.
Bilingual word-embedding mapping using fastText
Naive implementation of generative adversarial network (GAN) training, written in C++ using the MXNet C++ API
RLE, Huffman, JPEG
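Of the compression schemes named above, run-length encoding (RLE) is the simplest: consecutive repeats of a symbol collapse into a (symbol, count) pair. A minimal generic sketch, independent of the listed repository:

```cpp
#include <string>
#include <utility>
#include <vector>

// Run-length encode a string: "aaab" becomes {('a', 3), ('b', 1)}.
std::vector<std::pair<char, int>> rle_encode(const std::string& s) {
    std::vector<std::pair<char, int>> runs;
    for (char c : s) {
        if (!runs.empty() && runs.back().first == c)
            ++runs.back().second;  // extend the current run
        else
            runs.push_back({c, 1});  // start a new run
    }
    return runs;
}
```

Huffman coding and JPEG build on more involved machinery (prefix codes, DCT and quantization) and are not sketched here.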
A Variational Autoencoder, written entirely in C++
C++ implementation of a neural network library with a Keras-like API. Contains the majority of commonly used layers, losses, and optimizers. Supports sequential and multi-input/multi-output (flow) models. Supports single-CPU, multi-CPU, and GPU tensor operations (using cuDNN and cuBLAS).
Real-time face landmarking using decision trees and NN autoencoders
Deep Learning sample programs using PyTorch in C++