variational Latent Gaussian Process


Introduction

This repo contains the implementation of the variational Latent Gaussian Process (vLGP) (arXiv) (video) by Yuan Zhao (yuan.zhao@stonybrook.edu) and Il Memming Park (memming.park@stonybrook.edu). It was developed with the goal of recovering low-dimensional dynamics from neural population recordings.

Installation

pip

pip install git+https://github.com/catniplab/vlgp.git

Usage

The main entry point is vlgp.fit. The fit function requires two arguments, trials and n_factors. The former is expected to be a list of dictionaries, each of which stores one trial and contains at least an identifier ID and the observation y in the shape of (bin, channel). The latter specifies the number of factors (latent processes).
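As a sketch, the trials list described above can be assembled from NumPy arrays like this (the bin counts, channel count, and Poisson rate below are arbitrary illustrations, not values required by the library):

```python
import numpy as np

# Build a list of trial dictionaries in the format vlgp.fit expects:
# each trial has an identifier "ID" and an observation "y" of shape (bin, channel).
rng = np.random.default_rng(0)
n_channels = 20  # hypothetical number of recorded channels

trials = [
    {"ID": i, "y": rng.poisson(0.2, size=(n_bins, n_channels))}
    for i, n_bins in enumerate([50, 60, 70])  # trials of unequal duration are supported
]
```

This list can then be passed directly as the first argument of vlgp.fit.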

result = vlgp.fit(
    trials,       # list of dictionaries
    n_factors=3,  # dimensionality
)

The fit function returns a dictionary of trials, params and config as the fitted model.
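A minimal sketch of unpacking the returned dictionary, using the three keys named above (the placeholder value here only stands in for what vlgp.fit would return; any per-trial fields added during fitting, such as inferred latents, depend on the library version):

```python
# Placeholder for illustration; in practice this comes from
# result = vlgp.fit(trials, n_factors=3)
result = {"trials": [{"ID": 0, "y": None}], "params": {}, "config": {}}

fitted_trials = result["trials"]  # the input trials, augmented by fitting
params = result["params"]         # estimated model parameters
config = result["config"]         # configuration used for the fit
```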

Please see the tutorial for details.

Citation

@Article{Zhao2017,
  author    = {Yuan Zhao and Il Memming Park},
  title     = {Variational Latent Gaussian Process for Recovering Single-Trial Dynamics from Population Spike Trains},
  journal   = {Neural Computation},
  year      = {2017},
  volume    = {29},
  number    = {5},
  pages     = {1293--1316},
  month     = {may},
  doi       = {10.1162/neco_a_00953},
  publisher = {{MIT} Press - Journals},
}

Changes

2018

  • New uniform data structure
  • Support trials of unequal duration
  • Faster
  • Use NumPy data format

2017

  • The new fit function now only requires the observations and the number of latents.
  • Save snapshots if path is passed to fit.
  • You can access the iterations via callback.
