Commit 896aede

trivialfish and hcho3 authored

Reorganize the installation documents. (dmlc#6877)

* Split up installation and building from source.
* Use consistent section titles.

Co-authored-by: Philip Hyunsu Cho <chohyu01@cs.washington.edu>
1 parent 74b4163 commit 896aede

File tree: 9 files changed, +424 -395 lines changed

doc/R-package/index.rst (+2 -2)

@@ -12,7 +12,7 @@ You have found the XGBoost R Package!
 ***********
 Get Started
 ***********
-* Checkout the :doc:`Installation Guide </build>` contains instructions to install xgboost, and :doc:`Tutorials </tutorials/index>` for examples on how to use XGBoost for various tasks.
+* Check out the :doc:`Installation Guide </install>` for instructions to install xgboost, and :doc:`Tutorials </tutorials/index>` for examples on how to use XGBoost for various tasks.
 * Read the `API documentation <https://cran.r-project.org/web/packages/xgboost/xgboost.pdf>`_.
 * Please visit `Walk-through Examples <https://github.com/dmlc/xgboost/tree/master/R-package/demo>`_.

@@ -23,6 +23,6 @@ Tutorials
 .. toctree::
   :maxdepth: 2
   :titlesonly:
-
+
   Introduction to XGBoost in R <xgboostPresentation>
   Understanding your dataset with XGBoost <discoverYourData>

doc/build.rst (+200 -196)

Large diffs are not rendered by default.

doc/get_started.rst (+1 -1)

@@ -8,7 +8,7 @@ on the demo dataset on a binary classification task.
 ********************************
 Links to Other Helpful Resources
 ********************************
-- See :doc:`Installation Guide </build>` on how to install XGBoost.
+- See :doc:`Installation Guide </install>` on how to install XGBoost.
 - See :doc:`Text Input Format </tutorials/input_format>` on using text format for specifying training/testing data.
 - See :doc:`Tutorials </tutorials/index>` for tips and tutorials.
 - See `Learning to use XGBoost by Examples <https://github.com/dmlc/xgboost/tree/master/demo>`_ for more code examples.

doc/gpu/index.rst (+1 -2)

@@ -3,7 +3,6 @@ XGBoost GPU Support
 ###################

 This page contains information about GPU algorithms supported in XGBoost.
-To install GPU support, checkout the :doc:`/build`.

 .. note:: CUDA 10.0, Compute Capability 3.5 required

@@ -71,7 +70,7 @@ The device ordinal (which GPU to use if you have many of them) can be selected using the
 ``gpu_id`` parameter, which defaults to 0 (the first device reported by CUDA runtime).


-The GPU algorithms currently work with CLI, Python and R packages. See :doc:`/build` for details.
+The GPU algorithms currently work with the CLI, Python, R, and JVM packages. See :doc:`/install` for details.

 .. code-block:: python
   :caption: Python example

doc/index.rst (+1)

@@ -15,6 +15,7 @@ Contents
   :maxdepth: 2
   :titlesonly:

+  install
   build
   get_started
   tutorials/index

doc/install.rst (+215)

##################
Installation Guide
##################

XGBoost provides binary packages for some language bindings. The binary packages support
the GPU algorithm (``gpu_hist``) on machines with NVIDIA GPUs. Please note that **training
with multiple GPUs is only supported for the Linux platform**. See :doc:`gpu/index`. We
provide both stable releases and nightly builds; see below for how to install them. For
building from source, visit :doc:`this page </build>`.
.. contents:: Contents

Stable Release
==============

Python
------

Pre-built binaries are uploaded to PyPI (Python Package Index) for each release. Supported platforms are Linux (x86_64, aarch64), Windows (x86_64) and MacOS (x86_64).

.. code-block:: bash

  pip install xgboost

You might need to run the command with the ``--user`` flag or use ``virtualenv`` if you run
into permission errors. Capabilities of the Python pre-built binaries on each platform:

.. |tick| unicode:: U+2714
.. |cross| unicode:: U+2718

+-------------------+---------+----------------------+
| Platform          | GPU     | Multi-Node-Multi-GPU |
+===================+=========+======================+
| Linux x86_64      | |tick|  | |tick|               |
+-------------------+---------+----------------------+
| Linux aarch64     | |cross| | |cross|              |
+-------------------+---------+----------------------+
| MacOS             | |cross| | |cross|              |
+-------------------+---------+----------------------+
| Windows           | |tick|  | |cross|              |
+-------------------+---------+----------------------+
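After installing, you can confirm which capabilities your wheel was built with. A minimal sketch, assuming a recent release that ships ``xgboost.build_info()`` (older wheels may lack it; the helper name ``describe_install`` is ours, not part of XGBoost):

```python
def describe_install():
    """Summarize the locally installed XGBoost wheel, if any."""
    try:
        import xgboost
    except ImportError:
        return "xgboost not installed"
    # build_info() reports compile-time flags such as USE_CUDA on recent
    # releases; fall back to an empty dict if the function is absent.
    info = getattr(xgboost, "build_info", lambda: {})()
    return f"xgboost {xgboost.__version__} (CUDA: {info.get('USE_CUDA', 'unknown')})"

print(describe_install())
```

On a GPU-enabled wheel this should report ``CUDA: True``; on the MacOS and Linux aarch64 wheels listed above it should not.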
R
-

* From CRAN:

  .. code-block:: R

    install.packages("xgboost")

  .. note:: Using all CPU cores (threads) on Mac OSX

    If you are using Mac OSX, you should first install the OpenMP library (``libomp``) by running

    .. code-block:: bash

      brew install libomp

    and then run ``install.packages("xgboost")``. Without OpenMP, XGBoost will only use a
    single CPU core, leading to suboptimal training speed.

* We also provide an **experimental** pre-built binary on Linux x86_64 with GPU support.
  Download the binary package from the Releases page. The file name will be of the form
  ``xgboost_r_gpu_linux_[version].tar.gz``. Then install XGBoost by running:

  .. code-block:: bash

    # Install dependencies
    R -q -e "install.packages(c('data.table', 'magrittr', 'jsonlite'))"
    # Install XGBoost
    R CMD INSTALL ./xgboost_r_gpu_linux.tar.gz
JVM
---

You can use XGBoost4J in your Java/Scala application by adding XGBoost4J as a dependency:

.. code-block:: xml
  :caption: Maven

  <properties>
    ...
    <!-- Specify Scala version in package name -->
    <scala.binary.version>2.12</scala.binary.version>
  </properties>

  <dependencies>
    ...
    <dependency>
      <groupId>ml.dmlc</groupId>
      <artifactId>xgboost4j_${scala.binary.version}</artifactId>
      <version>latest_version_num</version>
    </dependency>
    <dependency>
      <groupId>ml.dmlc</groupId>
      <artifactId>xgboost4j-spark_${scala.binary.version}</artifactId>
      <version>latest_version_num</version>
    </dependency>
  </dependencies>

.. code-block:: scala
  :caption: sbt

  libraryDependencies ++= Seq(
    "ml.dmlc" %% "xgboost4j" % "latest_version_num",
    "ml.dmlc" %% "xgboost4j-spark" % "latest_version_num"
  )

This will fetch the latest stable version from Maven Central.

For the latest release version number, please check the `release page <https://github.com/dmlc/xgboost/releases>`_.

To enable the GPU algorithm (``tree_method='gpu_hist'``), use the artifacts ``xgboost4j-gpu_2.12`` and ``xgboost4j-spark-gpu_2.12`` instead (note the ``gpu`` suffix).

.. note:: Windows not supported in the JVM package

  Currently, XGBoost4J-Spark does not support the Windows platform, as the distributed training algorithm does not work on Windows. Please use Linux or MacOS.
Nightly Build
=============

Python
------

Nightly builds are available. You can go to `this page <https://s3-us-west-2.amazonaws.com/xgboost-nightly-builds/list.html>`_,
find the wheel with the commit ID you want, and install it with pip:

.. code-block:: bash

  pip install <url to the wheel>

The capabilities of the nightly Python wheels are the same as those of the stable release.
R
-

Besides the standard CRAN installation, we also provide an *experimental* pre-built binary on
Linux x86_64 with GPU support. You can go to `this page
<https://s3-us-west-2.amazonaws.com/xgboost-nightly-builds/list.html>`_, find the commit
ID you want to install (``xgboost_r_gpu_linux_[commit].tar.gz``), download it, and then run:

.. code-block:: bash

  # Install dependencies
  R -q -e "install.packages(c('data.table', 'magrittr', 'jsonlite', 'remotes'))"
  # Install XGBoost
  R CMD INSTALL ./xgboost_r_gpu_linux.tar.gz
JVM
---

First add the following Maven repository hosted by the XGBoost project:

.. code-block:: xml
  :caption: Maven

  <repository>
    <id>XGBoost4J Snapshot Repo</id>
    <name>XGBoost4J Snapshot Repo</name>
    <url>https://s3-us-west-2.amazonaws.com/xgboost-maven-repo/snapshot/</url>
  </repository>

.. code-block:: scala
  :caption: sbt

  resolvers += "XGBoost4J Snapshot Repo" at "https://s3-us-west-2.amazonaws.com/xgboost-maven-repo/snapshot/"

Then add XGBoost4J as a dependency:

.. code-block:: xml
  :caption: Maven

  <properties>
    ...
    <!-- Specify Scala version in package name -->
    <scala.binary.version>2.12</scala.binary.version>
  </properties>

  <dependencies>
    ...
    <dependency>
      <groupId>ml.dmlc</groupId>
      <artifactId>xgboost4j_${scala.binary.version}</artifactId>
      <version>latest_version_num-SNAPSHOT</version>
    </dependency>
    <dependency>
      <groupId>ml.dmlc</groupId>
      <artifactId>xgboost4j-spark_${scala.binary.version}</artifactId>
      <version>latest_version_num-SNAPSHOT</version>
    </dependency>
  </dependencies>

.. code-block:: scala
  :caption: sbt

  libraryDependencies ++= Seq(
    "ml.dmlc" %% "xgboost4j" % "latest_version_num-SNAPSHOT",
    "ml.dmlc" %% "xgboost4j-spark" % "latest_version_num-SNAPSHOT"
  )

Look up the ``version`` field in `pom.xml <https://github.com/dmlc/xgboost/blob/master/jvm-packages/pom.xml>`_ to get the correct version number.

The SNAPSHOT JARs are hosted by the XGBoost project. Every commit in the ``master`` branch automatically triggers generation of a new SNAPSHOT JAR. You can control how often Maven should upgrade your SNAPSHOT installation by specifying ``updatePolicy``. See `here <http://maven.apache.org/pom.html#Repositories>`_ for details.
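For instance, ``updatePolicy`` goes in the ``<snapshots>`` element of the repository entry. A sketch using standard Maven POM syntax (``daily`` is just one accepted value; ``always``, ``never``, and ``interval:X`` are the others):

```xml
<repository>
  <id>XGBoost4J Snapshot Repo</id>
  <name>XGBoost4J Snapshot Repo</name>
  <url>https://s3-us-west-2.amazonaws.com/xgboost-maven-repo/snapshot/</url>
  <snapshots>
    <enabled>true</enabled>
    <!-- How often Maven re-checks this repository for newer SNAPSHOT JARs -->
    <updatePolicy>daily</updatePolicy>
  </snapshots>
</repository>
```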
You can browse the file listing of the Maven repository at https://s3-us-west-2.amazonaws.com/xgboost-maven-repo/list.html.

To enable the GPU algorithm (``tree_method='gpu_hist'``), use the artifacts ``xgboost4j-gpu_2.12`` and ``xgboost4j-spark-gpu_2.12`` instead (note the ``gpu`` suffix).

0 commit comments