Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on a single machine, as well as Hadoop, Spark, Flink and DataFlow.

eXtreme Gradient Boosting


Documentation | Resources | Installation | Release Notes | RoadMap

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides a parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems with billions of examples and beyond.

What's New

Ask a Question

Help to Make XGBoost Better

XGBoost has been developed and is used by a group of active community members. Your help is very valuable in making the package better for everyone.


© Contributors, 2016. Licensed under the Apache-2.0 license.