
XGBoost: eXtreme Gradient Boosting

An optimized, general-purpose gradient boosting library. The library is parallelized, and also provides an optimized distributed version. It implements machine learning algorithms under the gradient boosting framework, including the generalized linear model and gradient boosted regression trees (GBDT). XGBoost can also run distributed and scale to terascale data.
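For a quick sense of the basic workflow, here is a minimal sketch using the Python wrapper; the file paths and parameter values are placeholders, and the exact wrapper import path may differ between releases:

    import xgboost as xgb

    # Load sparse training data in LibSVM format (paths are placeholders).
    dtrain = xgb.DMatrix('train.libsvm')
    dtest = xgb.DMatrix('test.libsvm')

    # Typical boosted-tree parameters for binary classification.
    param = {'max_depth': 3, 'eta': 0.3, 'objective': 'binary:logistic'}

    # Train for 10 boosting rounds, then predict on the test set.
    bst = xgb.train(param, dtrain, 10)
    preds = bst.predict(dtest)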

Contributors: https://github.com/dmlc/xgboost/graphs/contributors

Documentation: Documentation of xgboost

Issues Tracker: https://github.com/dmlc/xgboost/issues

Please join the XGBoost User Group to ask questions and share your experience with xgboost.

  • Use the issue tracker for bug reports, feature requests, etc.
  • Use the user group to share your experience and ask questions about general usage.

Gitter chat for developers: https://gitter.im/dmlc/xgboost

Distributed Version: Distributed XGBoost

Highlights of Use Cases: Highlight Links

What's New

Features

  • Sparse feature format:
    • The sparse feature format allows easy handling of missing values and improves computational efficiency; see the DMatrix sketch after this list.
  • Push the limit on single machine:
    • Efficient implementation that optimizes memory and computation.
  • Speed: XGBoost is very fast.
    • In demo/higgs/speedtest.py, on the Kaggle Higgs dataset, it is faster (on our machine, 20 times faster using 4 threads) than sklearn.ensemble.GradientBoostingClassifier.
  • Layout of the gradient boosting algorithm supports user-defined objectives; a sketch follows this list.
  • Distributed and portable:
    • The distributed version of xgboost is highly portable and can be used on different platforms.
    • It inherits all the optimizations made in single-machine mode, and maximally utilizes resources using both multi-threading and distributed computing.
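As a rough illustration of the sparse feature handling, the sketch below marks entries of a dense matrix as missing when building a DMatrix; the missing-value sentinel (-999.0 here) is an arbitrary choice for the example:

    import numpy as np
    import xgboost as xgb

    # -999.0 is used as a sentinel for missing entries in this example;
    # DMatrix stores the data sparsely, and the tree booster learns
    # default directions for missing values.
    X = np.array([[1.0, -999.0,    3.0],
                  [2.0,    5.0, -999.0]])
    y = np.array([0, 1])
    dtrain = xgb.DMatrix(X, label=y, missing=-999.0)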
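And here is a minimal sketch of a user-defined objective: xgb.train accepts a function returning the gradient and second-order gradient (hessian) of the loss with respect to the current predictions. This logistic-loss example mirrors the style of the demos shipped with the library, reusing dtrain from the sketch above:

    def logregobj(preds, dtrain):
        # Gradient and hessian of the logistic loss, evaluated at the
        # current margin predictions.
        labels = dtrain.get_label()
        preds = 1.0 / (1.0 + np.exp(-preds))
        grad = preds - labels
        hess = preds * (1.0 - preds)
        return grad, hess

    # Pass the custom objective to train; with a custom objective, the
    # booster outputs raw margins rather than probabilities.
    param = {'max_depth': 3, 'eta': 0.3}
    bst = xgb.train(param, dtrain, 10, obj=logregobj)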

Build

  • Run bash build.sh (you can also type make)

Version

  • This version is xgboost-0.3; the code has been refactored from 0.2x to be cleaner and more flexible.
  • This version of xgboost is not compatible with 0.2x, due to the large number of changes in code structure.
    • This means that model and buffer files from previous versions cannot be loaded in xgboost-0.3.
  • For legacy 0.2x code, refer to Here.
  • Change log in CHANGES.md

XGBoost in Graphlab Create
