MonetDBSolutions/probrnn

Probabilistic RNNs for sequential data with missing values

EU project 732328: "Fashion Brain".

D1.4: "Software Requirements: SSM library for time-series modeling and trend prediction".

Tasks

Prediction and forecasting

We assume we are given sample paths from a time series:

x^(1), x^(2), ..., x^(n), where each x^(i) = (x^(i)_1, ..., x^(i)_T)

where the sample paths are all drawn from an underlying model:

x^(i) ~ P(x_1, ..., x_T)

The aim in forecasting is to predict future time points from observed past time points, i.e. to estimate:

P(x_{t+1} | x_1, ..., x_t)

or

P(x_{t+1}, ..., x_{t+k} | x_1, ..., x_t)

In this package we follow the ansatz that this dependency is completely determined by the hidden state of a recurrent neural network:

P(x_{t+1} | x_1, ..., x_t) = P(x_{t+1} | h_t)

where the RNN recurrence relation is given by:

h_t = f(h_{t-1}, x_t)
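To make the recurrence concrete, here is a minimal NumPy sketch of a single recurrent step. The weight matrices W and U, the bias b, and the tanh nonlinearity are illustrative choices only, not probrnn's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration.
n_hidden, n_input = 4, 1

W = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # hidden-to-hidden weights
U = rng.standard_normal((n_hidden, n_input)) * 0.1   # input-to-hidden weights
b = np.zeros(n_hidden)                               # bias

def step(h_prev, x_t):
    """One application of the recurrence h_t = f(h_{t-1}, x_t)."""
    return np.tanh(W @ h_prev + U @ x_t + b)

# Run the recurrence over a short toy sequence.
h = np.zeros(n_hidden)
for x_t in [np.array([0.5]), np.array([-0.2]), np.array([1.0])]:
    h = step(h, x_t)
```

The forecast distribution P(x_{t+1} | h_t) is then read off from the final hidden state h.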

Distribution modeling with NADE

Here we are interested in estimating the distribution of a multivariate vector without necessarily assuming a temporal or sequential ordering. Nevertheless, we still have a decomposition of the joint distribution:

P(x_1, ..., x_d) = P(x_1) P(x_2 | x_1) ... P(x_d | x_1, ..., x_{d-1})

where each conditional may again be modeled using an RNN:

P(x_i | x_1, ..., x_{i-1}) = P(x_i | h_{i-1})

This approach is known as neural autoregressive distribution estimation (NADE).
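A rough sketch of this chain-rule factorization follows. The discretization into bins mirrors the N_BINS parameter used later; the linear readout V, the softmax conditional, and all weights here are hypothetical stand-ins for the package's trained model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_hidden, n_bins, d = 8, 5, 6

W = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # hidden-to-hidden
U = rng.standard_normal((n_hidden, n_bins)) * 0.1    # input-to-hidden
V = rng.standard_normal((n_bins, n_hidden)) * 0.1    # readout to bin logits

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def log_prob(x_bins):
    """log P(x) = sum_i log P(x_i | x_1..x_{i-1}), conditionals via the RNN state."""
    h = np.zeros(n_hidden)
    total = 0.0
    for x_i in x_bins:
        p = softmax(V @ h)                 # conditional over bins given history
        total += np.log(p[x_i])
        onehot = np.zeros(n_bins)
        onehot[x_i] = 1.0
        h = np.tanh(W @ h + U @ onehot)    # absorb x_i into the hidden state
    return total

x = rng.integers(0, n_bins, size=d)        # a toy discretized observation
lp = log_prob(x)
```

Training maximizes this log-likelihood over the weights; here the weights are random, so lp is just a finite negative number.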

Missing value estimation

In many applications missing data hampers both training and inference: we are given a sequence of data of which only a subsequence is observed. For example:

(x_1, x_2, ?, x_4, ?, x_6, ...), where ? marks a missing value.

Assume we are given an RNN model:

h_t = f(h_{t-1}, x_t)

and a parameterized estimate of the conditional distribution:

P_θ(x_{t+1} | h_t)

In this package we take a sequential importance sampling (SIS) approach to inferring missing data given this model of the time series. If, additionally, data is missing at training time, we employ an expectation-maximization (EM) training algorithm rather than standard backpropagation through time (BPTT).
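The SIS idea can be sketched with a toy stand-in model. Here a simple AR(1) process plays the role of the RNN's conditional distribution; probrnn's NaiveSIS works against its trained network instead, so this is only an illustration of the sampling scheme:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "model": x_t = a * x_{t-1} + Gaussian noise, standing in for the
# RNN conditional P(x_t | h_{t-1}).
a, sigma = 0.9, 0.5

def sis_impute(x, n_particles=200):
    """Fill np.nan entries of x by sequential importance sampling."""
    T = len(x)
    particles = np.zeros((n_particles, T))
    logw = np.zeros(n_particles)
    for t in range(T):
        mean = a * particles[:, t - 1] if t > 0 else np.zeros(n_particles)
        if np.isnan(x[t]):
            # Missing: sample from the model transition; weights unchanged.
            particles[:, t] = mean + sigma * rng.standard_normal(n_particles)
        else:
            # Observed: condition on the value and reweight by its likelihood.
            particles[:, t] = x[t]
            logw += -0.5 * ((x[t] - mean) / sigma) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return w @ particles  # weighted posterior-mean estimate per time step

x = np.array([0.0, np.nan, 1.0, np.nan, 0.5])
est = sis_impute(x)
```

Observed entries pass through unchanged, while missing entries are replaced by the particle estimate.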

Multivariate temporal dependencies

In the simplest case, suppose we have two time series, x and y. Temporal dependencies may be modeled by treating the interleaved sequence as a single time series:

(x_1, y_1, x_2, y_2, ...)

At test time, predicting an unseen time series given an observed one may be treated as a missing-value problem by applying SIS to the sequence:

(x_1, ?, x_2, ?, ...)
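Assembling such an interleaved sequence and marking the unseen series as missing is straightforward with NumPy; np.nan plays the role of the missing marker, matching the imputation example in the usage section below:

```python
import numpy as np

x = np.arange(5, dtype=float)   # observed series
y = np.full(5, np.nan)          # unseen series to be predicted

# Interleave into a single sequence (x_1, y_1, x_2, y_2, ...).
interleaved = np.empty(2 * len(x))
interleaved[0::2] = x           # even slots: observed values
interleaved[1::2] = y           # odd slots: missing, to be filled by SIS
```

SIS then fills the nan slots exactly as in the single-series missing-value setting.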

Python package

Structure

(package structure diagram not shown)

Getting started

Installation:

git clone https://github.com/zalandoresearch/probrnn.git
cd probrnn/
make install

Installation in development mode:

git clone https://github.com/zalandoresearch/probrnn.git
cd probrnn/
make develop

Running the tests:

make clean
make test

Usage

Setting up NADE data

from probrnn import data
import numpy as np

x = np.random.randn(1000, 10)
datastruct = data.NadeWrapper(x)

Setting up time-series data

x = np.random.randn(10000)
datastruct = data.TimeSeries(x)

Setting up parameters for learning

params = \
    {
        "N_ITERATIONS": 10 ** 5, # no of batches to pass in total
        "VALIDATE_EACH": 100, # how often to check error on validation data
        "SAVE_EACH": 1000, # how often to save model
        "LOG_EVERY": 50, # how often to log
        "LEARNING_RATE": 0.0001, # learning rate
        "N_HIDDEN": 256, # number of hidden units in RNN
        "N_BINS": 50, # number of bins to discretize data
        "BATCH_SIZE": 50, # number of samples per batch
    }

Get a NADE model

from probrnn import models

model = models.NADE(datastruct, params=params)

Do the training

training = models.Training(model, "test_model", "test_log.json")
def callback(err, i, _):
    print("loss: {err}; iteration {i}".format(err=err, i=i))
training.train(callback)

Same thing but with missing values

from probrnn import inference

imputer = lambda a, b: inference.NaiveSIS(a, b)
training = models.Training(model, "test_model", "test_log.json", imputer=imputer)
training.train(callback)

Do imputation at test time

x[np.random.choice(len(x), replace=False, size=50)] = np.nan
estimate = imputer(model, x).estimate()

Examples

Example notebooks are in ./examples/

License

The MIT License (MIT) Copyright (c) 2016 Zalando SE

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

About

State space modeling with recurrent neural networks
