TreeInterpreter

Package for interpreting scikit-learn's decision tree and random forest predictions. It decomposes each prediction into bias and feature-contribution components, as described in http://blog.datadive.net/interpreting-random-forests/. For a dataset with n features, each prediction is decomposed as prediction = bias + feature_1_contribution + ... + feature_n_contribution.

It works on scikit-learn's

  • DecisionTreeRegressor
  • DecisionTreeClassifier
  • ExtraTreeRegressor
  • ExtraTreeClassifier
  • RandomForestRegressor
  • RandomForestClassifier
  • ExtraTreesRegressor
  • ExtraTreesClassifier

Free software: BSD license

Dependencies

  • scikit-learn 0.17+

Installation

The easiest way to install the package is via pip:

$ pip install treeinterpreter

Usage

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from treeinterpreter import treeinterpreter as ti

# fit a scikit-learn regressor model
rf = RandomForestRegressor()
rf.fit(trainX, trainY)

prediction, bias, contributions = ti.predict(rf, testX)

Prediction is the sum of bias and feature contributions:

assert np.allclose(prediction, bias + np.sum(contributions, axis=1))
assert np.allclose(rf.predict(testX), bias + np.sum(contributions, axis=1))

More usage examples at http://blog.datadive.net/random-forest-interpretation-with-scikit-learn/.
