eXtreme Gradient Boosting

[Build Status | Documentation Status | GitHub License | CRAN | PyPI | Gitter chat for developers: https://gitter.im/dmlc/xgboost]

Documentation | Resources | Contributors | Community | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems with billions of examples.
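
For a sense of the API, here is a minimal sketch using the Python package's scikit-learn-style wrapper; the dataset and parameter values are illustrative assumptions, not tuned recommendations, and the same booster parameters are exposed by the core `xgb.train`/`DMatrix` API and the R and JVM packages.

```python
# A toy end-to-end example: fit a small boosted-tree classifier and predict.
import numpy as np
import xgboost as xgb

# Synthetic binary classification data (illustrative only).
rng = np.random.RandomState(0)
X = rng.rand(200, 5)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

# Gradient-boosted tree ensemble via the scikit-learn-style wrapper.
model = xgb.XGBClassifier(n_estimators=50, max_depth=3, learning_rate=0.1)
model.fit(X, y)

preds = model.predict(X)
print("training accuracy:", (preds == y).mean())
```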

License

© Contributors, 2016. Licensed under an Apache-2.0 license.

Contribute to XGBoost

XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone. Check out the Community Page.

Reference

  • Tianqi Chen and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2016.
  • XGBoost originated as a research project at the University of Washington.