A suite of boosting algorithms and weak learners for the online learning setting.


Algorithms

Implementations of the following online boosting algorithms are provided:

  1. Online AdaBoost (OzaBoost), from Oza & Russell.
  2. Online GradientBoost (OGBoost), from Leistner et al.
  3. Online SmoothBoost (OSBoost), from Chen et al.
  4. OSBoost with Expert Advice (EXPBoost), again from Chen et al.
  5. OSBoost with Online Convex Programming (OCPBoost), again from Chen et al.

The corresponding Python modules can be found in the 'ensemblers' folder, named as above.
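
As a rough illustration, an online booster alternates prediction and updating on each incoming example. The sketch below shows that loop; note that the module paths, class names, constructor arguments, and method names (OzaBoost, predict, update, M) are assumptions for illustration only, not the package's confirmed API. Consult the modules in the 'ensemblers' folder for the actual interface.

```python
# Hedged sketch of the online predict-then-update loop. All imports,
# class names, and method names here are assumptions, not the package's
# confirmed API; see the modules in the 'ensemblers' folder.
from ensemblers.ozaboost import OzaBoost      # hypothetical module path
from learners.perceptron import Perceptron    # hypothetical module path

# Toy stream of (feature vector, label) pairs, labels in {-1, +1}.
stream = [([0.5, 1.2], 1), ([-0.3, 0.8], -1), ([1.1, -0.4], 1)]

booster = OzaBoost(weak_learner=Perceptron, M=10)  # M: ensemble size (assumed)

mistakes = 0
for x, y in stream:
    y_hat = booster.predict(x)   # predict before the true label is revealed
    mistakes += int(y_hat != y)
    booster.update(x, y)         # then update the ensemble on the new example

print("online mistakes:", mistakes)
```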

Weak Learners

The package also includes implementations of a number of online weak learners, all of which can be plugged into the above boosting algorithms; a minimal interface sketch follows the list below. Key weak learners include:

  1. Perceptrons.
  2. Naive Bayes (Gaussian & Binary).
  3. Random Decision Stumps.
  4. Incremental Decision Trees, based on the DTree module.
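
To give a concrete sense of what an online weak learner looks like, here is a minimal, self-contained perceptron sketch. The predict/partial_fit method names and the sample_weight argument are assumptions about the interface the boosters expect, not the package's documented contract:

```python
import numpy as np

class OnlinePerceptron:
    """Minimal online perceptron sketch. The predict/partial_fit names and
    the sample_weight argument are assumptions about the interface the
    boosters expect, not the package's documented contract."""

    def __init__(self, n_features):
        self.w = np.zeros(n_features)
        self.b = 0.0

    def predict(self, x):
        # Labels are assumed to be in {-1, +1}.
        return 1 if np.dot(self.w, x) + self.b >= 0 else -1

    def partial_fit(self, x, y, sample_weight=1.0):
        # Mistake-driven perceptron update, scaled by the weight the
        # booster assigns to this example.
        if self.predict(x) != y:
            self.w = self.w + sample_weight * y * np.asarray(x, dtype=float)
            self.b += sample_weight * y
```

A booster can then call partial_fit with the boosting weight it assigns to each incoming example, which is what lets any learner exposing this kind of interface plug into the ensemblers above.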


Dependencies

The ensemblers and weak learners generally depend on NumPy and SciPy. Some of the weak learners (in particular, those prefixed with "sk") also depend on scikit-learn. File I/O is done through YAML using the PyYAML package.

A full list of dependencies is available in requirements.txt and can be installed with "pip install -r requirements.txt".
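
As a minimal sketch of the YAML-based file I/O mentioned above, loading a dataset with PyYAML looks like the following; the file path and the structure of the loaded object are assumptions for illustration, as the package does not specify them here:

```python
# Minimal sketch of loading a dataset with PyYAML. The path and the
# layout of the loaded object are assumptions for illustration.
import yaml

with open("data/dataset.yaml") as f:
    data = yaml.safe_load(f)  # e.g., a list or dict of examples and labels
```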


License

MIT © Charles Marsh