eXtreme Gradient Boosting

Build Status · Documentation Status · GitHub license · CRAN Status Badge · PyPI version · Gitter chat for developers at https://gitter.im/dmlc/xgboost

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can scale beyond billions of examples. XGBoost is part of the DMLC projects.
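As a quick illustration (not part of the original README), the sketch below trains a small booster through the Python package's native interface; the toy data and parameter values are made up and only meant to show the shape of the API:

```python
# Minimal sketch of the XGBoost Python interface (illustrative only).
# Assumes the xgboost and numpy packages are installed.
import numpy as np
import xgboost as xgb

# Toy binary-classification data (hypothetical).
X = np.random.rand(100, 5)
y = np.random.randint(2, size=100)

dtrain = xgb.DMatrix(X, label=y)            # XGBoost's internal data structure
params = {
    "objective": "binary:logistic",         # boosted trees for binary classification
    "max_depth": 3,
    "eta": 0.1,                             # learning rate
}
bst = xgb.train(params, dtrain, num_boost_round=10)
preds = bst.predict(dtrain)                 # predicted probabilities
```

Equivalent models can also be trained through the CLI and the R and Julia packages listed under Features.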

Contents

  • What's New
  • Features
  • Bug Reporting
  • Contributing to XGBoost
  • License

Features

  • Easily accessible through the CLI, Python, R, and Julia
  • It's fast! See the benchm-ml numbers comparing XGBoost, H2O, Spark, and R
  • Memory efficient: handles sparse matrices and supports external memory (see the sparse-input sketch after this list)
  • Accurate predictions, used extensively by data scientists and Kagglers (see the highlight links)
  • The distributed version runs on Hadoop (YARN), MPI, SGE, etc., and scales to billions of examples
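To illustrate the sparse-matrix support mentioned above, here is a minimal sketch (not part of the original README) that builds a DMatrix directly from a SciPy CSR matrix; the random data and parameter values are hypothetical and only show the shape of the API:

```python
# Sketch: feeding sparse features to XGBoost without densifying them first.
import numpy as np
import scipy.sparse as sp
import xgboost as xgb

# Hypothetical sparse feature matrix: 1000 rows, 50 columns, ~5% non-zero.
X_sparse = sp.random(1000, 50, density=0.05, format="csr")
y = np.random.randint(2, size=1000)

# DMatrix accepts SciPy CSR/CSC input directly, so no dense copy is needed.
dtrain = xgb.DMatrix(X_sparse, label=y)
bst = xgb.train({"objective": "binary:logistic"}, dtrain, num_boost_round=5)
```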

Bug Reporting

Contributing to XGBoost

XGBoost has been developed and is used by a group of active community members. Everyone is more than welcome to contribute; it is a way to make the project better and more accessible to more users.

  • Check out Feature Wish List to see what can be improved, or open an issue if you want something.
  • Contribute to the documents and examples to share your experience with other users.
  • Please add your name to CONTRIBUTORS.md after your patch has been merged.
    • Please also update NEWS.md to record changes and improvements to the API and docs.

License

© Contributors, 2015. Licensed under an Apache-2.0 license.
