TreeInterpreter
===============

Package for interpreting scikit-learn's decision tree and random forest predictions. Allows decomposing each prediction into bias and feature contribution components. For a dataset with n features, each prediction is decomposed as prediction = bias + feature_1_contribution + ... + feature_n_contribution.
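
The decomposition is easiest to see on a toy example. Below is a minimal pure-Python sketch (the tree values and the ``decompose`` helper are hypothetical illustrations, not treeinterpreter's implementation) that walks a hand-built regression tree and credits each split's change in node mean to the feature that was split on:

```python
# Toy decomposition sketch (hypothetical tree, not treeinterpreter's code).
# Internal node: (feature_index, threshold, node_mean, left, right)
# Leaf:          (None, None, leaf_mean, None, None)
leaf = lambda mean: (None, None, mean, None, None)
tree = (0, 5.0, 10.0,                 # root: split on feature 0, node mean = 10
        (1, 2.0, 6.0,                 # left child: split on feature 1, mean = 6
         leaf(4.0), leaf(8.0)),
        leaf(14.0))                   # right child is a leaf, mean = 14

def decompose(node, x):
    """Return (bias, per-feature contributions, prediction) for sample x."""
    bias = node[2]                    # mean of the training targets at the root
    contributions = {}
    while node[0] is not None:        # descend until a leaf is reached
        feature, threshold, mean, left, right = node
        child = left if x[feature] <= threshold else right
        # the change in node mean is credited to the split feature
        contributions[feature] = contributions.get(feature, 0.0) + child[2] - mean
        node = child
    return bias, contributions, node[2]

bias, contribs, pred = decompose(tree, [3.0, 1.0])
# additivity holds: prediction == bias + sum of feature contributions
assert abs(pred - (bias + sum(contribs.values()))) < 1e-12
```

Each step down the tree moves the running estimate from the parent's mean to the child's mean, and that delta is attributed to the splitting feature; summing the deltas on top of the root mean (the bias) reproduces the leaf prediction exactly.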

It works on scikit-learn's

  • DecisionTreeRegressor
  • DecisionTreeClassifier
  • ExtraTreeRegressor
  • ExtraTreeClassifier
  • RandomForestRegressor
  • RandomForestClassifier
  • ExtraTreesRegressor
  • ExtraTreesClassifier

Free software: BSD license


Dependencies
------------

  • scikit-learn 0.17+


Installation
------------

The easiest way to install the package is via pip:

$ pip install treeinterpreter


Usage
-----

from treeinterpreter import treeinterpreter as ti
from sklearn.ensemble import RandomForestRegressor
import numpy as np

# fit a scikit-learn regressor model on training data
rf = RandomForestRegressor()
rf.fit(trainX, trainY)

prediction, bias, contributions = ti.predict(rf, testX)

Prediction is the sum of bias and feature contributions:

assert np.allclose(prediction, bias + np.sum(contributions, axis=1))
assert np.allclose(rf.predict(testX), bias + np.sum(contributions, axis=1))
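
A common follow-up once you have the contributions array is to rank features by how much they contribute on average across the test set. A minimal sketch with plain Python lists (the contribution values and feature names below are made up for illustration):

```python
# contributions: one row per sample, one column per feature
# (illustrative numbers, not real model output)
contributions = [
    [ 0.8, -0.1,  0.3],
    [-0.5,  0.2,  0.1],
    [ 0.6, -0.3,  0.2],
]
feature_names = ["f0", "f1", "f2"]   # hypothetical feature names

# mean absolute contribution of each feature over all samples
n_samples = len(contributions)
mean_abs = [
    sum(abs(row[j]) for row in contributions) / n_samples
    for j in range(len(feature_names))
]

# sort features from most to least influential
ranked = sorted(zip(feature_names, mean_abs), key=lambda t: -t[1])
print(ranked)  # f0 contributes most on average
```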

More usage examples at
