Fork of dmlc/xgboost for RAPIDS + XGBoost integration.
Branch: cudf-interop (51 commits ahead of, and 39 commits behind, dmlc:master).
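As a rough illustration of what the cudf-interop branch targets, the sketch below assumes the fork lets `xgboost.DMatrix` consume a cuDF DataFrame directly, so GPU-resident data is passed to the booster without a round-trip through host memory. The file name, column names, and parameter values are placeholders, not taken from this branch's docs.

```python
import cudf
import xgboost as xgb

# Load data straight into GPU memory as a cuDF DataFrame
# ("train.csv" and the "label" column are hypothetical).
df = cudf.read_csv("train.csv")
X = df.drop(columns=["label"])
y = df["label"]

# Assumption: on this branch, DMatrix accepts cuDF DataFrames/Series directly.
dtrain = xgb.DMatrix(X, label=y)

params = {
    "tree_method": "gpu_hist",       # GPU histogram-based tree construction
    "objective": "binary:logistic",
}
booster = xgb.train(params, dtrain, num_boost_round=100)
```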
| Name | Latest commit message | Commit time |
| --- | --- | --- |
| .github | Enable auto-locking of issues closed long ago (dmlc#3821) | Oct 24, 2018 |
| R-package | | |
| amalgamation | | |
| cmake | Merge branch 'master' into cudf-interop | May 31, 2019 |
| cub @ b20808b | | |
| demo | Add native support for Dask (dmlc#4473) | May 27, 2019 |
| dev | | |
| dmlc-core @ 3943914 | | |
| doc | | |
| include/xgboost | | |
| jvm-packages | | |
| make | Not use -msse2 on power or arm arch. close dmlc#2446 (dmlc#2475) | Jul 7, 2017 |
| plugin | | |
| python-package | Merge branch 'master' into cudf-interop | Jun 25, 2019 |
| rabit @ a429748 | | |
| src | | |
| tests | | |
| .clang-tidy | | |
| .editorconfig | Added configuration for python into .editorconfig (dmlc#3494) | Jul 23, 2018 |
| .gitignore | | |
| .gitmodules | Upgrading to NCCL2 (dmlc#3404) | Jul 10, 2018 |
| .travis.yml | [CI] Refactor Jenkins CI pipeline + migrate all Linux tests to Jenkins ( | Apr 27, 2019 |
| CITATION | simplify software citation (dmlc#2912) | Dec 1, 2017 |
| CMakeLists.txt | Merge branch 'master' into cudf-interop | Jun 25, 2019 |
| CONTRIBUTORS.md | Simplify INI-style config reader using C++11 STL (dmlc#4478) | May 30, 2019 |
| Jenkinsfile | | |
| Jenkinsfile-win64 | [CI] Add Python and C++ tests for Windows GPU target (dmlc#4469) | May 16, 2019 |
| LICENSE | Include full text of Apache 2.0 license (dmlc#3698) | Sep 13, 2018 |
| Makefile | [CI] Refactor Jenkins CI pipeline + migrate all Linux tests to Jenkins ( | Apr 27, 2019 |
| NEWS.md | | |
| README.md | | |
| appveyor.yml | [CI] Fix Windows tests (dmlc#4403) | Apr 26, 2019 |

README.md

eXtreme Gradient Boosting


Community | Documentation | Resources | Contributors | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs in major distributed environments (Hadoop, SGE, MPI) and can scale to problems with billions of examples.
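For a concrete sense of the core train/predict loop described above, here is a minimal sketch using the Python package; the synthetic data and parameter values are illustrative only, not recommended settings.

```python
import numpy as np
import xgboost as xgb

# Synthetic binary-classification data; any NumPy/SciPy matrix works.
rng = np.random.RandomState(0)
X_train, y_train = rng.randn(500, 10), rng.randint(2, size=500)
X_test = rng.randn(100, 10)

# DMatrix is XGBoost's internal, optimized data container.
dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test)

params = {
    "objective": "binary:logistic",  # boosted trees on the logistic loss
    "max_depth": 4,
    "eta": 0.1,                      # learning rate (shrinkage)
}
booster = xgb.train(params, dtrain, num_boost_round=50)
pred = booster.predict(dtest)        # predicted probabilities
```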

License

© Contributors, 2016. Licensed under the Apache-2.0 License.

Contribute to XGBoost

XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone. Check out the Community Page.

Reference

  • Tianqi Chen and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016.
  • XGBoost originated as a research project at the University of Washington.

Sponsors

Become a sponsor and get a logo here. See details at Sponsoring the XGBoost Project. The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).

Open Source Collective sponsors


Sponsors

[Become a sponsor]

NVIDIA

Backers

[Become a backer]

Other sponsors

The sponsors in this list donate cloud hours in lieu of cash donations.

Amazon Web Services
