XGBoost: eXtreme Gradient Boosting


An optimized general-purpose gradient boosting library. The library is parallelized, and also provides an optimized distributed version. It implements machine learning algorithms under the gradient boosting framework, including the generalized linear model and gradient boosted decision trees (GBDT). XGBoost can also be distributed and scales to terascale data.
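To make the "gradient boosting framework" concrete, here is a toy sketch of the core idea behind GBDT: each round fits a weak learner (here, a one-split "stump") to the residuals of the current ensemble, which for squared loss equal the negative gradient. This is a didactic illustration only, not XGBoost's actual implementation.

```python
import numpy as np

def fit_stump(x, residual):
    """Find the threshold split on 1-d feature x that best fits the residual."""
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        lmean, rmean = left.mean(), right.mean()
        sse = ((left - lmean) ** 2).sum() + ((right - rmean) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda z: np.where(z <= t, lmean, rmean)

def boost(x, y, rounds=50, lr=0.1):
    """Additively combine stumps, each fit to the current residuals."""
    pred = np.zeros_like(y)
    for _ in range(rounds):
        # residual = negative gradient of squared loss w.r.t. predictions
        stump = fit_stump(x, y - pred)
        pred = pred + lr * stump(x)
    return pred

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x)
pred = boost(x, y)
mse = ((y - pred) ** 2).mean()
```

The shrinkage factor `lr` plays the same role as XGBoost's `eta` parameter: smaller steps per round, more rounds, better generalization.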


Documentation: Documentation of xgboost

Issue Tracker:

Please join the XGBoost User Group to ask questions and share your experience with xgboost.

  • Use the issue tracker for bug reports, feature requests, etc.
  • Use the user group to share your experience and ask questions about general usage.

Gitter: chat room for developers

Distributed Version: Distributed XGBoost

Highlights of Use Cases: Highlight Links

XGBoost is part of the Distributed Machine Learning Common (DMLC) projects

What's New


  • Easily accessible from Python, R, Julia, and the command line
  • Fast and memory efficient
    • Can be more than 10 times faster than GBM in sklearn and R
    • Handles sparse matrices and supports external-memory training
  • Accurate predictions, used extensively by data scientists and Kagglers
  • Distributed and portable
    • The distributed version runs on Hadoop (YARN), MPI, SGE, etc.
    • Scales to billions of examples and beyond
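The sparse-matrix support above can be sketched as follows: XGBoost's Python binding accepts scipy CSR matrices directly, storing only the nonzero entries. The `xgboost` calls are left as comments so the sketch stays self-contained; they assume the standard Python package is installed.

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
dense = rng.uniform(size=(100, 50))
dense[dense < 0.9] = 0.0          # make ~90% of entries zero/missing
X = sparse.csr_matrix(dense)      # only nonzeros are stored
y = rng.integers(0, 2, size=100)

# With xgboost installed, the sparse matrix can be passed in directly:
#   import xgboost as xgb
#   dtrain = xgb.DMatrix(X, label=y)
#   bst = xgb.train({"objective": "binary:logistic"}, dtrain, num_boost_round=10)

density = X.nnz / (X.shape[0] * X.shape[1])
```

Storing only nonzeros is what keeps memory proportional to the number of observed entries rather than the full matrix size, which matters for the one-hot-encoded, mostly-empty feature matrices common in Kaggle-style problems.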


  • To build, run the build script with bash (you can also simply type make)


  • Current version: xgboost-0.4; many improvements have been made since 0.3
    • Change log:
    • This version is compatible with the 0.3x versions

XGBoost in Graphlab Create
