Homemade PyTorch leaky cell for leaky integrator recurrent layers, including reservoirs.

p-enel/pytorch-leaky-cell

Implementation of a leaky cell for PyTorch

$$h_t = (1 - \lambda)\,h_{t-1} + \lambda\,\sigma\!\left(W_{in}\,x_t + W_h\,h_{t-1}\right)$$

where:

  • $h$ is the state of the leaky cell
  • $x$ is the input
  • $\lambda$ is the leak rate
  • $W_{in}$ is the input weight matrix
  • $W_h$ is the hidden (recurrent) weight matrix
  • $\sigma$ is the nonlinearity, or transfer function, typically the hyperbolic tangent
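A minimal sketch of this update step in PyTorch (the function and variable names below are illustrative, not the module's actual API):

```python
import torch

def leaky_step(h, x, W_in, W_h, leak, sigma=torch.tanh):
    """One leaky-integrator update: blend the previous state with the
    driven activation, weighted by the leak rate."""
    return (1 - leak) * h + leak * sigma(x @ W_in.T + h @ W_h.T)

# Toy dimensions, small random weights
n_in, n_hidden = 3, 5
W_in = torch.randn(n_hidden, n_in) * 0.1
W_h = torch.randn(n_hidden, n_hidden) * 0.1

h = torch.zeros(1, n_hidden)   # initial state
x = torch.randn(1, n_in)       # one input sample
h_next = leaky_step(h, x, W_in, W_h, leak=0.3)
```

With a leak rate of 1 this reduces to a vanilla RNN cell; smaller values make the state change more slowly, which is what makes the cell a leaky integrator.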

Includes:

  • a leaky cell module that can be used as a recurrent layer in any architecture
  • a function to generate initial weights for the leaky cell
  • a simple one-layer leaky recurrent network trainable with backprop
  • a reservoir implementation with a dedicated 'train' method that fits the readout weights with an L2-regularized linear regression (ridge regression)
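The ridge-regression readout fit can be sketched as follows; this is the standard closed-form solution, and the function name here is hypothetical, not necessarily the repository's 'train' method:

```python
import torch

def train_readout(states, targets, ridge=1e-3):
    """L2-regularized least squares for the readout weights:
    W_out = (S^T S + ridge * I)^-1 S^T Y."""
    n_features = states.shape[1]
    gram = states.T @ states + ridge * torch.eye(n_features)
    return torch.linalg.solve(gram, states.T @ targets)

# Toy data: 100 collected reservoir states, 2 target dimensions
states = torch.randn(100, 5)
true_W = torch.randn(5, 2)
targets = states @ true_W
W_out = train_readout(states, targets, ridge=1e-6)
```

Only the readout is trained this way; the input and recurrent weights of the reservoir stay fixed after initialization.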
