
rnbwdsh/lstm_rnn


Overview

This material may or may not be related to 365.107 UE: LSTM and Recurrent Neural Nets from JKU Linz. Who knows.

I decided to upload this, as no good, easy implementation was available.

Also, a lot of the notation was refactored, as transposing every weight matrix every time you use it is questionable.
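
For context, the two conventions differ only in how the weight matrix is stored; a tiny NumPy illustration (shapes chosen arbitrarily):

```python
import numpy as np

x = np.random.randn(3)       # input, dim 3
W = np.random.randn(4, 3)    # stored as (out, in)

# Convention A: store W as (out, in) and apply it directly.
h_a = W @ x

# Convention B: store W transposed as (in, out) and transpose on every use.
W_t = W.T                    # (in, out)
h_b = W_t.T @ x

assert np.allclose(h_a, h_b)
```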

Sub-Projects

Assignment 1: NumPy RNN with BPTT (backpropagation through time) and bias units - great for large hidden weight matrices, slow for sequences longer than hidden-size^2
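
As an illustration of the technique (not the repo's actual code), here is a minimal BPTT pass for a tanh RNN with bias, where the loss is simply the summed MSE between each hidden state and a per-step target — the assignment's version may route through a separate output layer:

```python
import numpy as np

def bptt_step(W, U, b, xs, ys):
    """One full BPTT pass for h_t = tanh(W @ h_{t-1} + U @ x_t + b),
    loss L = sum_t 0.5 * ||h_t - y_t||^2 (a simplification)."""
    H = W.shape[0]
    hs = [np.zeros(H)]                      # h_0 = 0
    for x in xs:                            # forward pass, storing all states
        hs.append(np.tanh(W @ hs[-1] + U @ x + b))

    dW, dU, db = np.zeros_like(W), np.zeros_like(U), np.zeros_like(b)
    dh_next = np.zeros(H)
    for t in reversed(range(len(xs))):      # backward pass through time
        dh = (hs[t + 1] - ys[t]) + dh_next  # local loss grad + grad from the future
        da = dh * (1.0 - hs[t + 1] ** 2)    # tanh'(a) = 1 - tanh(a)^2
        dW += np.outer(da, hs[t])
        dU += np.outer(da, xs[t])
        db += da
        dh_next = W.T @ da                  # propagate to the previous step
    return dW, dU, db
```

Storing every hidden state makes the memory cost grow with sequence length, which is exactly the trade-off against RTRL below.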

Assignment 2: Data generator and MSE loss (forward, backward, visualization)
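
For reference, the MSE forward and backward passes are one-liners; a generic sketch (the assignment's version may differ in reduction or scaling):

```python
import numpy as np

def mse_forward(pred, target):
    # Mean squared error over all elements.
    return np.mean((pred - target) ** 2)

def mse_backward(pred, target):
    # Gradient of the mean w.r.t. pred.
    return 2.0 * (pred - target) / pred.size
```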

Assignment 3: NumPy RNN with RTRL (real-time recurrent learning) and bias units - great for very long sequences, bad for large hidden sizes
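
A sketch of the RTRL recursion, simplified so that only the recurrent matrix W is learned and the input is injected directly (input dim equals hidden dim); the input weights and bias of the actual assignment are omitted to keep the sensitivity-tensor bookkeeping visible:

```python
import numpy as np

def rtrl_gradients(W, xs, ys):
    """RTRL for h_t = tanh(W @ h_{t-1} + x_t), accumulating the gradient of
    L = sum_t 0.5 * ||h_t - y_t||^2 online, one step at a time."""
    H = W.shape[0]
    h = np.zeros(H)
    P = np.zeros((H, H, H))            # P[k, i, j] = dh[k] / dW[i, j]
    dW = np.zeros_like(W)
    for x, y in zip(xs, ys):
        h_prev = h
        h = np.tanh(W @ h_prev + x)
        d = 1.0 - h ** 2
        # sensitivity recursion: P'[k,i,j] = d[k] * (delta_ki * h_prev[j] + sum_l W[k,l] * P[l,i,j])
        P = np.einsum('kl,lij->kij', W, P)
        P[np.arange(H), np.arange(H), :] += h_prev   # the delta_ki * h_prev[j] term
        P *= d[:, None, None]
        # accumulate this step's loss gradient immediately (no stored history)
        dW += np.einsum('k,kij->ij', h - y, P)
    return dW
```

Nothing about past states is stored, so sequence length is free — but the sensitivity tensor is O(H^3) in memory and O(H^4) per step to update, which is why the list entry above warns against big hidden sizes.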

Assignment 4: Basic NumPy LSTM
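
The core of such an LSTM is a single step function; a minimal forward-pass sketch (gate weights act on the concatenated [h, x], in the (out, in) convention mentioned above — the assignment's variable names may differ):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, Wf, Wi, Wo, Wg, bf, bi, bo, bg):
    """One LSTM forward step; each W* has shape (hidden, hidden + input)."""
    z = np.concatenate([h, x])
    f = sigmoid(Wf @ z + bf)        # forget gate
    i = sigmoid(Wi @ z + bi)        # input gate
    o = sigmoid(Wo @ z + bo)        # output gate
    g = np.tanh(Wg @ z + bg)        # candidate cell update
    c_new = f * c + i * g           # gated cell state
    h_new = o * np.tanh(c_new)      # gated hidden state
    return h_new, c_new
```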

Assignment 5: Character-prediction PyTorch LSTM without any of PyTorch's built-in LSTM classes
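
One way to build an LSTM in PyTorch without nn.LSTM / nn.LSTMCell — a hypothetical sketch, not the repo's exact class — is a cell made of a single fused nn.Linear:

```python
import torch
import torch.nn as nn

class ManualLSTMCell(nn.Module):
    """LSTM cell from nn.Linear only: one fused map yields all four gates."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        self.hidden_size = hidden_size

    def forward(self, x, state):
        h, c = state
        z = self.gates(torch.cat([x, h], dim=-1))
        i, f, g, o = z.chunk(4, dim=-1)                  # gate pre-activations
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c
```

A character model would wrap this cell in a loop over the sequence, embedding each character and projecting h through a final linear layer to vocabulary logits.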

Copyright statement

This project is CC0 - go crazy.

Original author: github.com/rnbwdsh

Original copyright statement

This material, no matter whether in printed or electronic form, may be used for personal and non-commercial educational use only. Any reproduction of this manuscript, no matter whether as a whole or in parts, no matter whether in printed or in electronic form, requires explicit prior acceptance of the authors.

Therefore I deleted everything that legally qualifies as copyrightable "creative work", like the original assignment texts.

Boilerplate code does not reach the threshold of "creative height" and therefore does not qualify as "creative work".

Also, copyright doesn't apply in educational scenarios.
