Update README.md to add link to doc and update project status
ogrisel committed Dec 14, 2018
1 parent b10213f commit 6382c90
Showing 1 changed file with 19 additions and 5 deletions.
24 changes: 19 additions & 5 deletions README.md
@@ -1,7 +1,6 @@
# pygbm [![Build Status](https://travis-ci.org/ogrisel/pygbm.svg?branch=master)](https://travis-ci.org/ogrisel/pygbm) [![codecov](https://codecov.io/gh/ogrisel/pygbm/branch/master/graph/badge.svg)](https://codecov.io/gh/ogrisel/pygbm)



Experimental Gradient Boosting Machines in Python.

The goal of this project is to evaluate whether it's possible to
@@ -10,13 +9,28 @@ Gradient Boosting Trees (possibly with all the LightGBM optimizations)
while staying in pure Python using the [numba](http://numba.pydata.org/)
jit compiler.

We plan a scikit-learn compatible set of estimator classes and possibly
integration with dask and dask-ml for out-of-core and distributed
fitting on a cluster.
pygbm provides a set of scikit-learn compatible estimator classes that
should play well with the scikit-learn `Pipeline`s and model selection
tools (grid search and randomized hyperparameter search).
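
For illustration, here is a minimal sketch of the intended usage. The
estimator name and hyperparameter names below (`GradientBoostingClassifier`,
`learning_rate`, `max_iter`) are assumptions; refer to the API documentation
for the exact spelling.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

from pygbm import GradientBoostingClassifier  # assumed estimator name

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A pygbm estimator used as the final step of a regular scikit-learn Pipeline
# (the scaler is only there to demonstrate a multi-step pipeline).
pipe = Pipeline([
    ('scaler', StandardScaler()),
    ('gbm', GradientBoostingClassifier()),
])

# Hyperparameter tuning with the usual scikit-learn model selection tools;
# the parameter names in the grid are hypothetical placeholders.
param_grid = {
    'gbm__learning_rate': [0.05, 0.1],
    'gbm__max_iter': [50, 100],
}
search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X_train, y_train)
print(search.best_params_, search.score(X_test, y_test))
```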

Longer term plans include integration with dask and dask-ml for
out-of-core and distributed fitting on a cluster.

## Documentation

The API documentation is available at:

https://pygbm.readthedocs.io/

You might also want to have a look at the `examples/` folder of this repo.

## Status

This is unusable / under development.
The project is experimental. The API is subject to change without deprecation notice. Use at your own risk.

We welcome any feedback in the github issue tracker:

https://github.com/ogrisel/pygbm/issues

## Running the development version
