##################
Installation Guide
##################

XGBoost provides binary packages for some language bindings. The binary packages support
the GPU algorithm (``gpu_hist``) on machines with NVIDIA GPUs. Please note that **training
with multiple GPUs is only supported on the Linux platform**. See :doc:`gpu/index`. We
provide both stable releases and nightly builds; see below for how to install them. For
building from source, visit :doc:`this page </build>`.

.. contents:: Contents

Stable Release
==============

Python
------

Pre-built binary wheels are uploaded to PyPI (Python Package Index) for each release. Supported platforms are Linux (x86_64, aarch64), Windows (x86_64) and MacOS (x86_64).

.. code-block:: bash

   pip install xgboost

You might need to run the command with the ``--user`` flag or use ``virtualenv`` if you run
into permission errors. Capabilities of the Python pre-built binaries on each platform:

.. |tick| unicode:: U+2714
.. |cross| unicode:: U+2718

+-------------------+---------+----------------------+
| Platform          | GPU     | Multi-Node-Multi-GPU |
+===================+=========+======================+
| Linux x86_64      | |tick|  | |tick|               |
+-------------------+---------+----------------------+
| Linux aarch64     | |cross| | |cross|              |
+-------------------+---------+----------------------+
| MacOS             | |cross| | |cross|              |
+-------------------+---------+----------------------+
| Windows           | |tick|  | |cross|              |
+-------------------+---------+----------------------+
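
If ``pip install xgboost`` fails with a permission error, a virtual environment is a clean alternative to the ``--user`` flag; a minimal sketch, assuming ``python3`` with the standard ``venv`` module (the environment name ``xgb-env`` is arbitrary):

```shell
# Create and activate an isolated environment (the name is arbitrary)
python3 -m venv xgb-env
source xgb-env/bin/activate

# Install the pre-built wheel inside the environment; no --user flag needed
pip install xgboost

# Confirm the package is importable and print its version
python -c "import xgboost; print(xgboost.__version__)"
```

Deactivate the environment with ``deactivate`` when you are done.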

R
-

* From CRAN:

  .. code-block:: R

     install.packages("xgboost")

  .. note:: Using all CPU cores (threads) on Mac OSX

     If you are using Mac OSX, you should first install the OpenMP library (``libomp``) by running

     .. code-block:: bash

        brew install libomp

     and then run ``install.packages("xgboost")``. Without OpenMP, XGBoost will only use a
     single CPU core, leading to suboptimal training speed.

* We also provide an **experimental** pre-built binary on Linux x86_64 with GPU support.
  Download the binary package from the Releases page. The file name will be of the form
  ``xgboost_r_gpu_linux_[version].tar.gz``. Then install XGBoost by running:

  .. code-block:: bash

     # Install dependencies
     R -q -e "install.packages(c('data.table', 'magrittr', 'jsonlite'))"
     # Install XGBoost
     R CMD INSTALL ./xgboost_r_gpu_linux.tar.gz

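
After the tarball installs, a quick smoke test from the shell confirms the package loads; this is a sketch that only checks loading, not GPU functionality, and assumes ``R`` is on your ``PATH``:

```shell
# Start a fresh R session, load xgboost, and print the installed version;
# a clean exit (status 0) means the installation is usable
R -q -e "library(xgboost); cat(as.character(packageVersion('xgboost')), '\n')"
```
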
JVM
---

You can use XGBoost4J in your Java/Scala application by adding XGBoost4J as a dependency:

.. code-block:: xml
   :caption: Maven

   <properties>
     ...
     <!-- Specify Scala version in package name -->
     <scala.binary.version>2.12</scala.binary.version>
   </properties>

   <dependencies>
     ...
     <dependency>
       <groupId>ml.dmlc</groupId>
       <artifactId>xgboost4j_${scala.binary.version}</artifactId>
       <version>latest_version_num</version>
     </dependency>
     <dependency>
       <groupId>ml.dmlc</groupId>
       <artifactId>xgboost4j-spark_${scala.binary.version}</artifactId>
       <version>latest_version_num</version>
     </dependency>
   </dependencies>

.. code-block:: scala
   :caption: sbt

   libraryDependencies ++= Seq(
     "ml.dmlc" %% "xgboost4j" % "latest_version_num",
     "ml.dmlc" %% "xgboost4j-spark" % "latest_version_num"
   )

This will pull the latest stable release from Maven Central.

For the latest release version number, please check the `release page <https://github.com/dmlc/xgboost/releases>`_.

To enable the GPU algorithm (``tree_method='gpu_hist'``), use the artifacts ``xgboost4j-gpu_2.12`` and ``xgboost4j-spark-gpu_2.12`` instead (note the ``gpu`` suffix).

.. note:: Windows not supported in the JVM package

   Currently, XGBoost4J-Spark does not support the Windows platform, as the distributed training algorithm does not work on Windows. Please use Linux or MacOS.
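
As a sanity check that the coordinates resolve before wiring them into a build, you can fetch the artifact directly from the command line with Maven; a sketch, assuming ``mvn`` is installed, where ``latest_version_num`` is the same placeholder as above and must be replaced with a real release number:

```shell
# Download the XGBoost4J JAR and its dependencies into the local Maven
# repository; substitute a real version number for the placeholder
mvn dependency:get \
    -Dartifact=ml.dmlc:xgboost4j_2.12:latest_version_num
```
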

Nightly Build
=============

Python
------

Nightly builds are available. You can go to `this page <https://s3-us-west-2.amazonaws.com/xgboost-nightly-builds/list.html>`_,
find the wheel with the commit ID you want, and install it with pip:

.. code-block:: bash

   pip install <url to the wheel>

The capabilities of the nightly wheels are the same as those of the stable release.

R
-

Besides the standard CRAN installation, we also provide an *experimental* pre-built binary on
Linux x86_64 with GPU support. You can go to `this page
<https://s3-us-west-2.amazonaws.com/xgboost-nightly-builds/list.html>`_, find the commit
ID you want to install (``xgboost_r_gpu_linux_[commit].tar.gz``), download it, then run:

.. code-block:: bash

   # Install dependencies
   R -q -e "install.packages(c('data.table', 'magrittr', 'jsonlite', 'remotes'))"
   # Install XGBoost
   R CMD INSTALL ./xgboost_r_gpu_linux.tar.gz

JVM
---

First add the following Maven repository hosted by the XGBoost project:

.. code-block:: xml
   :caption: Maven

   <repository>
     <id>XGBoost4J Snapshot Repo</id>
     <name>XGBoost4J Snapshot Repo</name>
     <url>https://s3-us-west-2.amazonaws.com/xgboost-maven-repo/snapshot/</url>
   </repository>

.. code-block:: scala
   :caption: sbt

   resolvers += "XGBoost4J Snapshot Repo" at "https://s3-us-west-2.amazonaws.com/xgboost-maven-repo/snapshot/"

Then add XGBoost4J as a dependency:

.. code-block:: xml
   :caption: Maven

   <properties>
     ...
     <!-- Specify Scala version in package name -->
     <scala.binary.version>2.12</scala.binary.version>
   </properties>

   <dependencies>
     ...
     <dependency>
       <groupId>ml.dmlc</groupId>
       <artifactId>xgboost4j_${scala.binary.version}</artifactId>
       <version>latest_version_num-SNAPSHOT</version>
     </dependency>
     <dependency>
       <groupId>ml.dmlc</groupId>
       <artifactId>xgboost4j-spark_${scala.binary.version}</artifactId>
       <version>latest_version_num-SNAPSHOT</version>
     </dependency>
   </dependencies>

.. code-block:: scala
   :caption: sbt

   libraryDependencies ++= Seq(
     "ml.dmlc" %% "xgboost4j" % "latest_version_num-SNAPSHOT",
     "ml.dmlc" %% "xgboost4j-spark" % "latest_version_num-SNAPSHOT"
   )

Look up the ``version`` field in `pom.xml <https://github.com/dmlc/xgboost/blob/master/jvm-packages/pom.xml>`_ to get the correct version number.

The SNAPSHOT JARs are hosted by the XGBoost project. Every commit on the ``master`` branch automatically triggers generation of a new SNAPSHOT JAR. You can control how often Maven should upgrade your SNAPSHOT installation by specifying ``updatePolicy``. See `here <http://maven.apache.org/pom.html#Repositories>`_ for details.
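
Alternatively, for a one-off refresh you can force Maven to re-check the snapshot repository during a single build rather than tuning ``updatePolicy``; ``-U`` (``--update-snapshots``) is a standard Maven flag:

```shell
# Force Maven to check remote repositories for updated SNAPSHOT artifacts
# during this build, regardless of the repository's updatePolicy
mvn -U clean package
```
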

You can browse the file listing of the Maven repository at https://s3-us-west-2.amazonaws.com/xgboost-maven-repo/list.html.

To enable the GPU algorithm (``tree_method='gpu_hist'``), use artifacts ``xgboost4j-gpu_2.12`` and ``xgboost4j-spark-gpu_2.12`` instead (note the ``gpu`` suffix).