Merge pull request #19 from alan-turing-institute/feature/18-update-intros

Feature/18 update intros
LouiseABowler committed Nov 2, 2018
2 parents 7092de6 + 7b2eb69 commit f2dbdb5
Showing 2 changed files with 44 additions and 8 deletions.
42 changes: 39 additions & 3 deletions README.md
@@ -1,16 +1,15 @@
# BOCPDMS: Bayesian On-line Changepoint Detection with Model Selection

[![Binder](https://mybinder.org/badge.svg)](https://mybinder.org/v2/gh/alan-turing-institute/bocpdms/master)
[![Binder](https://mybinder.org/badge.svg)](https://mybinder.org/v2/gh/alan-turing-institute/bocpdms/master?filepath=examples%2FNile.ipynb)

This repository contains code from the _Bayesian On-line Changepoint Detection with Model Selection_ project.

## Table of contents

* [About BOCPDMS](#about-bocpdms)
* [Citing this project](#citing-this-project)
* [Reproducible Research Champions](#reproducible-research-champions)
* [Installation instructions](#installation-instructions)
* [Running the examples](#running-the-examples)
* [Reproducible research champions program](#reproducible-research-champions-program)
* [Contributors](#contributors)


@@ -30,6 +29,14 @@ Bayesian On-line Changepoint Detection (BOCPD) is a discrete-time inference fram

²Jeremias Knoblauch, Jack Jewson and Theodoros Damoulas. [Doubly Robust Bayesian Inference for Non-Stationary Streaming Data with β-Divergences](https://arxiv.org/abs/1806.02261), arXiv:1806.02261 (2018).

### Code

The code in this repository was used in both papers, and we are currently working on splitting it into two separate projects so that it is easier to reproduce the work in both the older¹ and newer² papers. You can track our progress on this in [issue \#14](https://github.com/alan-turing-institute/bocpdms/issues/14).

Until we close \#14, you may notice that the results from some of the examples are _robust_, but do not exactly _reproduce_ those from the earlier ICML paper. This is due to changes in the core classes, and in particular the hyperparameter optimisation process, between the publication of the two papers.

Want a preview of the ICML results? Take a look at the updated demo in the branch associated with issue \#14 on Binder: [![Binder](https://mybinder.org/badge.svg)](https://mybinder.org/v2/gh/alan-turing-institute/bocpdms/feature/14-remove-nips?filepath=examples%2FNile.ipynb)

## Reproducible Research Champions

In May 2018, Theo Damoulas was selected as one of the Alan Turing Institute's Reproducible Research Champions - academics who encourage and promote reproducible research through their own work, and who want to take their latest project to the "next level" of reproducibility.
@@ -129,3 +136,32 @@ tests\test_nile_example.py . [100%]
========================== 6 passed in 17.83 seconds ==========================
```
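
If you just want to confirm that the installation worked, the test suite can be run from the repository root; a minimal invocation (assuming pytest is installed in the active virtual environment, and matching the output shown above) is:
```
python -m pytest
```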


## Running the examples

You can jump directly to an interactive demo of the Nile example by clicking on this Binder button:
[![Binder](https://mybinder.org/badge.svg)](https://mybinder.org/v2/gh/alan-turing-institute/bocpdms/master?filepath=examples%2FNile.ipynb)

To run from the command line, first activate your virtual environment as described above. You can then run, for example,
```
python nile_ICML18.py
```
and
```
python paper_pictures_nileData.py
```
to generate the figure(s). Recently, we have started to add further options that let you change various parameters from the command line. These are currently available for the Nile river height and bee waggle dance examples (although you can find this functionality for some of the other scripts in their respective [branches](https://github.com/alan-turing-institute/bocpdms/branches)). You can see the various options with the following commands:
```
python nile_ICML18.py --help
python bee_waggle_ICML18.py --help
```
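
As a sketch, a run with explicitly chosen prior hyperparameters might look like the following (the flag names and default values are taken from the argparse setup in nile_ICML18.py; the full list of options is shown by `--help`):
```
python nile_ICML18.py --prior_a 1.0 --prior_b 1.0 --prior_mean_scale 0.0
```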

## Contributors

Thank you to the following for their contributions to this project:
- Jeremias Knoblauch
- Theo Damoulas
- Kirstie Whitaker
- Martin O'Reilly
- Louise Bowler
10 changes: 5 additions & 5 deletions nile_ICML18.py
@@ -61,12 +61,12 @@ def load_nile_data(path_to_data):
nile_file = os.path.join(baseline_working_directory, "Data", "nile.txt")
T, S1, S2, river_height, __, __ = load_nile_data(nile_file) # Use standardised river height

"""STEP 2: Set up initial hyperparameters (will be optimized throughout
"""STEP 2: Set up initial hyperparameters (will be optimized throughout
the algorithm) and lag lengths"""

# Set up the parser
parser = argparse.ArgumentParser(
description="Options for applying the BOCPDMS algorithm to the bee waggle dance dataset.")
description="Options for applying the BOCPDMS algorithm to the Nile river height dataset.")
parser.add_argument("-a", "--prior_a", type=float, default=1.0, help="Initial value of a")
parser.add_argument("-b", "--prior_b", type=float, default=1.0, help="Initial value of b")
parser.add_argument("-ms", "--prior_mean_scale", type=float, default=0.0,
@@ -111,8 +111,8 @@ def load_nile_data(path_to_data):
model_universe = np.array(AR_models)
model_prior = np.array([1 / len(model_universe)] * len(model_universe))

"""STEP 5: Build and run detector, i.e. the object responsible for executing
BOCPDMS with multiple (previously specified) models for the segments and a
"""STEP 5: Build and run detector, i.e. the object responsible for executing
BOCPDMS with multiple (previously specified) models for the segments and a
CP model specified by cp_model"""
detector = Detector(
data=river_height,
@@ -152,7 +152,7 @@ def load_nile_data(path_to_data):
format="pdf", dpi=800)
plt.cla()

"""STEP 8: Also plot some performance indicators (will usually be printed
"""STEP 8: Also plot some performance indicators (will usually be printed
to the console before the plots)"""
print("\nCPs are ", detector.CPs[-2])
print("\n***** Predictive MSE + NLL from Table 1 in ICML 2018 paper *****")
