
Commit 84ae514 (parent: 149589c)

[DOC] refactor doc

File tree

14 files changed: +128 additions, -57 deletions

14 files changed

+128
-57
lines changed

doc/R-package/index.md

Lines changed: 1 addition & 1 deletion

@@ -9,7 +9,7 @@ You have find XGBoost R Package!
 Get Started
 -----------
 * Checkout the [Installation Guide](../build.md) contains instructions to install xgboost, and [Tutorials](#tutorials) for examples on how to use xgboost for various tasks.
-* Please visit [walk through example](demo).
+* Please visit [walk through example](../../R-package/demo).
 
 Tutorials
 ---------

doc/cli/index.md

Lines changed: 3 additions & 0 deletions

@@ -0,0 +1,3 @@
+# XGBoost Command Line version
+
+See [XGBoost Command Line walkthrough](https://github.com/dmlc/xgboost/blob/master/demo/binary_classification/README.md)

doc/conf.py

Lines changed: 2 additions & 1 deletion

@@ -117,10 +117,11 @@
 
 # -- Options for HTML output ----------------------------------------------
 
+html_theme_path = ['_static']
 # The theme to use for HTML and HTML Help pages. See the documentation for
 # a list of builtin themes.
 # html_theme = 'alabaster'
-html_theme = 'sphinx_rtd_theme'
+html_theme = 'xgboost-theme'
 
 # Add any paths that contain custom static files (such as style sheets) here,
 # relative to this directory. They are copied after the builtin static files,
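The change to doc/conf.py swaps the stock Read the Docs theme for a local one. As a hedged sketch (the `_static/xgboost-theme/` layout is an assumption, not shown in this commit), the relevant conf.py fragment relies on Sphinx's lookup rule that each `html_theme_path` entry must contain a directory named after `html_theme` with a `theme.conf` inside it:

```python
# Sketch of a Sphinx conf.py fragment registering a locally bundled HTML theme.
# Sphinx searches each html_theme_path directory for a subdirectory whose name
# matches html_theme and which contains a theme.conf file.
html_theme_path = ['_static']   # directories searched for custom themes
html_theme = 'xgboost-theme'    # resolved as _static/xgboost-theme/theme.conf
```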

doc/get_started/index.md

Lines changed: 80 additions & 0 deletions

@@ -0,0 +1,80 @@
+# Get Started with XGBoost
+
+This is a quick start tutorial showing snippets for you to quickly try out xgboost
+on the demo dataset for a binary classification task.
+
+## Links to Other Helpful Resources
+- See [Installation Guide](../build.md) on how to install xgboost.
+- See [How to pages](../how_to/index.md) for various tips on using xgboost.
+- See [Tutorials](../tutorials/index.md) for tutorials on specific tasks.
+- See [Learning to use XGBoost by Examples](../../demo) for more code examples.
+
+## Python
+```python
+import xgboost as xgb
+# read in data
+dtrain = xgb.DMatrix('demo/data/agaricus.txt.train')
+dtest = xgb.DMatrix('demo/data/agaricus.txt.test')
+# specify parameters via map
+param = {'max_depth': 2, 'eta': 1, 'silent': 1, 'objective': 'binary:logistic'}
+num_round = 2
+bst = xgb.train(param, dtrain, num_round)
+# make prediction
+preds = bst.predict(dtest)
+```
+
+## R
+
+```r
+# load data
+data(agaricus.train, package='xgboost')
+data(agaricus.test, package='xgboost')
+train <- agaricus.train
+test <- agaricus.test
+# fit model
+bst <- xgboost(data = train$data, label = train$label, max.depth = 2, eta = 1, nround = 2,
+               nthread = 2, objective = "binary:logistic")
+# predict
+pred <- predict(bst, test$data)
+
+```
+
+## Julia
+```julia
+using XGBoost
+# read data
+train_X, train_Y = readlibsvm("demo/data/agaricus.txt.train", (6513, 126))
+test_X, test_Y = readlibsvm("demo/data/agaricus.txt.test", (1611, 126))
+# fit model
+num_round = 2
+bst = xgboost(train_X, num_round, label=train_Y, eta=1, max_depth=2)
+# predict
+pred = predict(bst, test_X)
+```
+
+## Scala
+```scala
+import ml.dmlc.xgboost4j.scala.DMatrix
+import ml.dmlc.xgboost4j.scala.XGBoost
+
+object XGBoostScalaExample {
+  def main(args: Array[String]) {
+    // read training data, available at xgboost/demo/data
+    val trainData =
+      new DMatrix("/path/to/agaricus.txt.train")
+    // define parameters
+    val paramMap = List(
+      "eta" -> 0.1,
+      "max_depth" -> 2,
+      "objective" -> "binary:logistic").toMap
+    // number of iterations
+    val round = 2
+    // train the model
+    val model = XGBoost.train(trainData, paramMap, round)
+    // run prediction
+    val predTrain = model.predict(trainData)
+    // save the model to a file
+    model.saveModel("/local/path/to/model")
+  }
+}
+```
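All four quick-start snippets in the file above read the agaricus demo data in LibSVM text format (`label index:value index:value ...`). As an aside not part of this commit, a minimal pure-Python parser for one such line could look like this (the sample line's values are invented for illustration):

```python
def parse_libsvm_line(line):
    """Parse one LibSVM-format line into (label, {feature_index: value}).

    Format: "<label> <idx>:<val> <idx>:<val> ...", the sparse text format
    used by files such as agaricus.txt.train / agaricus.txt.test.
    """
    parts = line.strip().split()
    label = float(parts[0])
    features = {}
    for item in parts[1:]:
        idx, val = item.split(':')
        features[int(idx)] = float(val)
    return label, features

# Sample line in the same style as the demo files (values invented):
label, feats = parse_libsvm_line("1 3:1 10:1 45:0.5")
```

Features absent from a line are implicitly zero, which is why the format suits the sparse one-hot data these examples train on.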
File renamed without changes.
File renamed without changes.

doc/how_to/index.md

Lines changed: 16 additions & 0 deletions

@@ -0,0 +1,16 @@
+# XGBoost How To
+
+This page contains guidelines on using and developing xgboost.
+
+## Installation
+- [How to Install XGBoost](../build.md)
+
+## Use XGBoost in Specific Ways
+- [Parameter tuning guide](param_tuning.md)
+- [Use out-of-core computation for large datasets](external_memory.md)
+
+## Develop and Hack XGBoost
+- [Contribute to XGBoost](contribute.md)
+
+## Frequently Asked Questions
+- [FAQ](../faq.md)
File renamed without changes.

doc/index.md

Lines changed: 7 additions & 51 deletions

@@ -1,59 +1,15 @@
 XGBoost Documentation
 =====================
-This is document of xgboost library.
-XGBoost is short for eXtreme gradient boosting. This is a library that is designed, and optimized for boosted (tree) algorithms.
-The goal of this library is to push the extreme of the computation limits of machines to provide a ***scalable***, ***portable*** and ***accurate***
-for large scale tree boosting.
-
 This document is hosted at http://xgboost.readthedocs.org/. You can also browse most of the documents in github directly.
 
 
-Package Documents
------------------
-This section contains language specific package guide.
-* [XGBoost Command Line Usage Walkthrough](../demo/binary_classification/README.md)
+These are used to generate the index used in search.
+
 * [Python Package Document](python/index.md)
 * [R Package Document](R-package/index.md)
 * [Java/Scala Package Document](jvm/index.md)
-* [XGBoost.jl Julia Package](https://github.com/dmlc/XGBoost.jl)
-
-User Guides
------------
-This section contains users guides that are general across languages.
-* [Installation Guide](build.md)
-* [Introduction to Boosted Trees](model.md)
-* [Distributed Training Tutorial](tutorial/aws_yarn.md)
-* [Frequently Asked Questions](faq.md)
-* [External Memory Version](external_memory.md)
-* [Learning to use XGBoost by Example](../demo)
-* [Parameters](parameter.md)
-* [Text input format](input_format.md)
-* [Notes on Parameter Tunning](param_tuning.md)
-
-
-Tutorials
----------
-This section contains official tutorials of XGBoost package.
-See [Awesome XGBoost](https://github.com/dmlc/xgboost/tree/master/demo) for links to mores resources.
-* [Introduction to XGBoost in R](R-package/xgboostPresentation.md) (R package)
-  - This is a general presentation about xgboost in R.
-* [Discover your data with XGBoost in R](R-package/discoverYourData.md) (R package)
-  - This tutorial explaining feature analysis in xgboost.
-* [Introduction of XGBoost in Python](python/python_intro.md) (python)
-  - This tutorial introduces the python package of xgboost
-* [Understanding XGBoost Model on Otto Dataset](../demo/kaggle-otto/understandingXGBoostModel.Rmd) (R package)
-  - This tutorial teaches you how to use xgboost to compete kaggle otto challenge.
-
-Developer Guide
----------------
-* [Contributor Guide](dev-guide/contribute.md)
-
-
-Indices and tables
-------------------
-
-```eval_rst
-* :ref:`genindex`
-* :ref:`modindex`
-* :ref:`search`
-```
+* [Julia Package Document](julia/index.md)
+* [CLI Package Document](cli/index.md)
+- [Howto Documents](how_to/index.md)
+- [Get Started Documents](get_started/index.md)
+- [Tutorials](tutorials/index.md)

doc/julia/index.md

Lines changed: 3 additions & 0 deletions

@@ -0,0 +1,3 @@
+# XGBoost.jl
+
+See [XGBoost.jl Project page](https://github.com/dmlc/XGBoost.jl)
