Noise Contrastive Estimation for softmax outputs, written in PyTorch
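The idea behind NCE is to avoid the expensive softmax normalization by turning language-model training into binary classification: distinguish the observed word from k samples drawn from a noise distribution q. A minimal, framework-free sketch of that objective (the function name and argument layout are illustrative, not the repository's API):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def nce_loss(data_score, noise_scores, log_kq_data, log_kq_noise):
    """Binary NCE objective: the observed word is the positive class,
    k noise samples are negatives. Scores are unnormalized logits s(w);
    log_kq_* holds log(k * q(w)) under the noise distribution q."""
    # P(data | w) = sigmoid(s(w) - log(k q(w)))
    loss = -math.log(sigmoid(data_score - log_kq_data))
    for s, lkq in zip(noise_scores, log_kq_noise):
        # noise samples should be classified as "not data"
        loss -= math.log(1.0 - sigmoid(s - lkq))
    return loss
```

With a confidently scored positive and well-separated noise samples the loss approaches zero, which is the sanity check one would expect from this formulation.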
Faster CSV for Python
This repository is the official implementation of the paper Pruning via Iterative Ranking of Sensitivity Statistics and implements novel pruning/compression algorithms for deep neural networks. Among other methods, it implements structured pruning before training (with actual parameter shrinking) and unstructured pruning before and during training.
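Sensitivity-based pruning before training typically scores each connection by how much the loss reacts to removing it, e.g. the SNIP-style saliency |w · ∂L/∂w|, then keeps only the top fraction. A simplified, framework-free sketch of that scoring step (function name and keep-fraction knob are hypothetical, not the repository's interface):

```python
def sensitivity_prune_mask(weights, grads, keep_fraction):
    """Score each weight by |w * dL/dw| (a common sensitivity statistic)
    and return a binary mask keeping the top `keep_fraction` of them."""
    saliency = [abs(w * g) for w, g in zip(weights, grads)]
    k = max(1, int(len(weights) * keep_fraction))
    # threshold = saliency of the k-th most sensitive connection
    threshold = sorted(saliency, reverse=True)[k - 1]
    return [1 if s >= threshold else 0 for s in saliency]
```

The paper's iterative ranking repeats this scoring over several rounds rather than pruning in one shot; the single-shot version above only illustrates the statistic itself.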
A library that lets you easily increase efficiency of your deep learning models with no loss of accuracy.
Implementation of Selective Backpropagation from the paper "Accelerating Deep Learning by Focusing on the Biggest Losers"
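Selective backpropagation saves compute by forward-passing the whole batch but backpropagating only the examples with the largest losses ("the biggest losers"). A minimal sketch of the selection step, assuming per-example losses are already available (the keep fraction and function name are illustrative):

```python
import heapq

def select_biggest_losers(example_losses, keep_fraction=0.25):
    """Return the indices of the examples with the largest losses;
    only these would be included in the backward pass."""
    k = max(1, int(len(example_losses) * keep_fraction))
    return heapq.nlargest(k, range(len(example_losses)),
                          key=lambda i: example_losses[i])
```

In a training loop, the loss reduced over just these indices drives `backward()`, so gradient computation scales with the kept subset instead of the full batch.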
Code for paper "Locally Distributed Deep Learning Inference on Edge Device Clusters"
A plug-and-play JIT implementation for Marshmallow to speed up data serialization and deserialization
Continuation (homotopy) methods for deep neural network optimization, with bifurcation analysis
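Continuation methods optimize a hard objective by blending it with an easy one, H_t(x) = (1 − t)·easy(x) + t·hard(x), and tracking the minimizer as t moves from 0 to 1. A toy sketch with plain gradient descent (all names and hyperparameters here are illustrative assumptions, not the repository's code):

```python
def homotopy_minimize(easy_grad, hard_grad, x0, steps=10, lr=0.1, inner=50):
    """Follow the minimizer of H_t = (1-t)*easy + t*hard as t: 0 -> 1.
    Each stage warm-starts from the previous stage's solution."""
    x = x0
    for i in range(steps + 1):
        t = i / steps
        for _ in range(inner):
            # gradient of the blended objective at the current t
            g = (1 - t) * easy_grad(x) + t * hard_grad(x)
            x -= lr * g
    return x
```

For two quadratics the tracked solution slides smoothly from the easy minimizer to the hard one; the repository's bifurcation analysis concerns what happens when that solution path splits or disappears, which this convex toy cannot exhibit.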
Re-ordering algorithm for structural sparsity in neural networks
Python implementation of the wavelet transform with a Gabor kernel (ported from MATLAB)
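A Gabor kernel is a Gaussian envelope modulated by a sinusoidal carrier; convolving an image with it responds to oriented structure at a chosen wavelength. A small stdlib-only sketch of building the (real-valued) kernel, with parameter names chosen for illustration:

```python
import math

def gabor_kernel(size, sigma, theta, lam):
    """Real part of a 2D Gabor kernel as a list of lists.
    size: odd kernel width; sigma: Gaussian spread;
    theta: orientation in radians; lam: carrier wavelength."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # rotate coordinates into the filter's orientation
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            env = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            row.append(env * math.cos(2 * math.pi * xr / lam))
        kernel.append(row)
    return kernel
```

The center weight is exactly 1 (unit envelope, zero phase), and for theta = 0 the kernel is symmetric about its center column.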
A toolkit for HPC performance evaluation.
A plug-and-play PyTorch implementation of the papers "Deep Networks with Stochastic Depth" and "Drop an Octave"
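Stochastic depth speeds up training of residual networks by randomly skipping entire residual blocks: block l survives with probability p_l during training, and its output is scaled by p_l at inference to match the expectation. A framework-free sketch of the forward rule (callables and names are illustrative, not the repository's modules):

```python
import random

def stochastic_depth_forward(x, blocks, survival_probs, training=True, rng=random):
    """Residual forward pass with stochastic depth.
    blocks: callables f_l with identity shortcuts assumed;
    survival_probs: per-block survival probability p_l."""
    for f, p in zip(blocks, survival_probs):
        if training:
            if rng.random() < p:
                x = x + f(x)      # block survives this step
            # else: identity shortcut only, block is skipped
        else:
            x = x + p * f(x)      # scale by p_l to match expectation
    return x
```

Since `random.random()` is always < 1.0, setting every p_l to 1 recovers the plain residual network, which makes the sketch easy to sanity-check.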
Parallel and Distributed Infrastructures - Midterm I - Practical Part