[docs] reduce redirects in docs links (#6181)
jameslamb committed Nov 14, 2023
1 parent 694e41e commit e63e54a
Showing 8 changed files with 37 additions and 37 deletions.
26 changes: 13 additions & 13 deletions docs/Experiments.rst
@@ -18,19 +18,19 @@ Data

We used 5 datasets to conduct our comparison experiments. Details of data are listed in the following table:

- +-----------+-----------------------+------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
- | Data | Task | Link | #Train\_Set | #Feature | Comments |
- +===========+=======================+========================================================================+=============+==========+==============================================+
- | Higgs | Binary classification | `link <https://archive.ics.uci.edu/ml/datasets/HIGGS>`__ | 10,500,000 | 28 | last 500,000 samples were used as test set |
- +-----------+-----------------------+------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
- | Yahoo LTR | Learning to rank | `link <https://webscope.sandbox.yahoo.com/catalog.php?datatype=c>`__ | 473,134 | 700 | set1.train as train, set1.test as test |
- +-----------+-----------------------+------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
- | MS LTR | Learning to rank | `link <https://www.microsoft.com/en-us/research/project/mslr/>`__ | 2,270,296 | 137 | {S1,S2,S3} as train set, {S5} as test set |
- +-----------+-----------------------+------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
- | Expo | Binary classification | `link <http://stat-computing.org/dataexpo/2009/>`__ | 11,000,000 | 700 | last 1,000,000 samples were used as test set |
- +-----------+-----------------------+------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
- | Allstate | Binary classification | `link <https://www.kaggle.com/c/ClaimPredictionChallenge>`__ | 13,184,290 | 4228 | last 1,000,000 samples were used as test set |
- +-----------+-----------------------+------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
+ +-----------+-----------------------+---------------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
+ | Data | Task | Link | #Train\_Set | #Feature | Comments |
+ +===========+=======================+=================================================================================+=============+==========+==============================================+
+ | Higgs | Binary classification | `link <https://archive.ics.uci.edu/dataset/280/higgs>`__ | 10,500,000 | 28 | last 500,000 samples were used as test set |
+ +-----------+-----------------------+---------------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
+ | Yahoo LTR | Learning to rank | `link <https://webscope.sandbox.yahoo.com/catalog.php?datatype=c>`__ | 473,134 | 700 | set1.train as train, set1.test as test |
+ +-----------+-----------------------+---------------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
+ | MS LTR | Learning to rank | `link <https://www.microsoft.com/en-us/research/project/mslr/>`__ | 2,270,296 | 137 | {S1,S2,S3} as train set, {S5} as test set |
+ +-----------+-----------------------+---------------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
+ | Expo | Binary classification | `link <https://community.amstat.org/jointscsg-section/dataexpo/dataexpo2009>`__ | 11,000,000 | 700 | last 1,000,000 samples were used as test set |
+ +-----------+-----------------------+---------------------------------------------------------------------------------+-------------+----------+----------------------------------------------+
+ | Allstate | Binary classification | `link <https://www.kaggle.com/c/ClaimPredictionChallenge>`__ | 13,184,290 | 4228 | last 1,000,000 samples were used as test set |
+ +-----------+-----------------------+---------------------------------------------------------------------------------+-------------+----------+----------------------------------------------+

Environment
^^^^^^^^^^^
2 changes: 1 addition & 1 deletion docs/Features.rst
@@ -291,7 +291,7 @@ References

.. _On Grouping for Maximum Homogeneity: https://www.tandfonline.com/doi/abs/10.1080/01621459.1958.10501479

- .. _Optimization of collective communication operations in MPICH: https://www.mcs.anl.gov/~thakur/papers/ijhpca-coll.pdf
+ .. _Optimization of collective communication operations in MPICH: https://web.cels.anl.gov/~thakur/papers/ijhpca-coll.pdf

.. _A Communication-Efficient Parallel Algorithm for Decision Tree: http://papers.nips.cc/paper/6381-a-communication-efficient-parallel-algorithm-for-decision-tree

6 changes: 3 additions & 3 deletions docs/GPU-Performance.rst
@@ -194,17 +194,17 @@ following article:

Huan Zhang, Si Si and Cho-Jui Hsieh. `GPU Acceleration for Large-scale Tree Boosting`_. SysML Conference, 2018.

- .. _link1: https://archive.ics.uci.edu/ml/datasets/HIGGS
+ .. _link1: https://archive.ics.uci.edu/dataset/280/higgs

.. _link2: https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/binary.html

.. _link3: https://www.kaggle.com/c/bosch-production-line-performance/data

.. _link4: https://webscope.sandbox.yahoo.com/catalog.php?datatype=c

- .. _link5: http://research.microsoft.com/en-us/projects/mslr/
+ .. _link5: https://www.microsoft.com/en-us/research/project/mslr/

- .. _link6: http://stat-computing.org/dataexpo/2009/
+ .. _link6: https://community.amstat.org/jointscsg-section/dataexpo/dataexpo2009

.. _0bb4a82: https://github.com/microsoft/LightGBM/commit/0bb4a82

2 changes: 1 addition & 1 deletion docs/Installation-Guide.rst
@@ -950,7 +950,7 @@ gcc

.. _RDMA: https://en.wikipedia.org/wiki/Remote_direct_memory_access

- .. _MS MPI: https://docs.microsoft.com/en-us/message-passing-interface/microsoft-mpi-release-notes
+ .. _MS MPI: https://learn.microsoft.com/en-us/message-passing-interface/microsoft-mpi-release-notes

.. _Open MPI: https://www.open-mpi.org/

4 changes: 2 additions & 2 deletions docs/Parallel-Learning-Guide.rst
@@ -518,7 +518,7 @@ See `the mars documentation`_ for usage examples.

.. _the Dask DataFrame documentation: https://docs.dask.org/en/latest/dataframe.html

- .. _the Dask prediction example: https://github.com/microsoft/lightgbm/tree/master/examples/python-guide/dask/prediction.py
+ .. _the Dask prediction example: https://github.com/microsoft/LightGBM/blob/master/examples/python-guide/dask/prediction.py

.. _the Dask worker documentation: https://distributed.dask.org/en/stable/worker-memory.html

@@ -536,7 +536,7 @@

.. _lightgbm_ray: https://github.com/ray-project/lightgbm_ray

- .. _Ray: https://ray.io/
+ .. _Ray: https://www.ray.io/

.. _the lightgbm_ray documentation: https://docs.ray.io/en/latest/tune/api_docs/integration.html#lightgbm-tune-integration-lightgbm

16 changes: 8 additions & 8 deletions docs/Parameters.rst
@@ -119,7 +119,7 @@ Core Parameters

- ranking application

- - ``lambdarank``, `lambdarank <https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf>`__ objective. `label_gain <#label_gain>`__ can be used to set the gain (weight) of ``int`` label and all values in ``label`` must be smaller than number of elements in ``label_gain``
+ - ``lambdarank``, `lambdarank <https://proceedings.neurips.cc/paper_files/paper/2006/file/af44c4c56f385c43f2529f9b1b018f6a-Paper.pdf>`__ objective. `label_gain <#label_gain>`__ can be used to set the gain (weight) of ``int`` label and all values in ``label`` must be smaller than number of elements in ``label_gain``

- ``rank_xendcg``, `XE_NDCG_MART <https://arxiv.org/abs/1911.09798>`__ ranking objective function, aliases: ``xendcg``, ``xe_ndcg``, ``xe_ndcg_mart``, ``xendcg_mart``
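
An illustrative aside (not part of this commit): a minimal sketch of configuring the ``lambdarank`` objective and ``label_gain`` described above through the Python API, using hypothetical data and gain values.

.. code:: python

    import numpy as np
    import lightgbm as lgb

    # Hypothetical ranking data: 100 documents, 10 features,
    # integer relevance labels in {0, ..., 4}.
    X = np.random.rand(100, 10)
    y = np.random.randint(0, 5, size=100)
    group = [20] * 5  # five queries of 20 documents each

    train_set = lgb.Dataset(X, label=y, group=group)
    params = {
        "objective": "lambdarank",
        # every label value must be smaller than len(label_gain)
        "label_gain": [0, 1, 3, 7, 15],
        "metric": "ndcg",
        "verbose": -1,
    }
    booster = lgb.train(params, train_set, num_boost_round=10)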

@@ -536,15 +536,15 @@ Learning Control Parameters

- ``basic``, the most basic monotone constraints method. It does not slow the library at all, but over-constrains the predictions

- - ``intermediate``, a `more advanced method <https://hal.archives-ouvertes.fr/hal-02862802/document>`__, which may slow the library very slightly. However, this method is much less constraining than the basic method and should significantly improve the results
+ - ``intermediate``, a `more advanced method <https://hal.science/hal-02862802/document>`__, which may slow the library very slightly. However, this method is much less constraining than the basic method and should significantly improve the results

- - ``advanced``, an `even more advanced method <https://hal.archives-ouvertes.fr/hal-02862802/document>`__, which may slow the library. However, this method is even less constraining than the intermediate method and should again significantly improve the results
+ - ``advanced``, an `even more advanced method <https://hal.science/hal-02862802/document>`__, which may slow the library. However, this method is even less constraining than the intermediate method and should again significantly improve the results

- ``monotone_penalty`` :raw-html:`<a id="monotone_penalty" title="Permalink to this parameter" href="#monotone_penalty">&#x1F517;&#xFE0E;</a>`, default = ``0.0``, type = double, aliases: ``monotone_splits_penalty``, ``ms_penalty``, ``mc_penalty``, constraints: ``monotone_penalty >= 0.0``

- used only if ``monotone_constraints`` is set

- - `monotone penalty <https://hal.archives-ouvertes.fr/hal-02862802/document>`__: a penalization parameter X forbids any monotone splits on the first X (rounded down) level(s) of the tree. The penalty applied to monotone splits on a given depth is a continuous, increasing function of the penalization parameter
+ - `monotone penalty <https://hal.science/hal-02862802/document>`__: a penalization parameter X forbids any monotone splits on the first X (rounded down) level(s) of the tree. The penalty applied to monotone splits on a given depth is a continuous, increasing function of the penalization parameter

- if ``0.0`` (the default), no penalization is applied
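
A minimal sketch (not part of this commit) of how the monotone-constraint options above could be combined in a parameter dictionary; the per-feature directions and values are hypothetical.

.. code:: python

    # Hypothetical constraints: feature 0 increasing, feature 1 decreasing,
    # feature 2 unconstrained.
    params = {
        "objective": "regression",
        "monotone_constraints": [1, -1, 0],
        "monotone_constraints_method": "advanced",
        "monotone_penalty": 2.0,  # forbid monotone splits on the first 2 tree levels
    }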

@@ -564,7 +564,7 @@

- **Note**: the forced split logic will be ignored, if the split makes gain worse

- - see `this file <https://github.com/microsoft/LightGBM/tree/master/examples/binary_classification/forced_splits.json>`__ as an example
+ - see `this file <https://github.com/microsoft/LightGBM/blob/master/examples/binary_classification/forced_splits.json>`__ as an example

- ``refit_decay_rate`` :raw-html:`<a id="refit_decay_rate" title="Permalink to this parameter" href="#refit_decay_rate">&#x1F517;&#xFE0E;</a>`, default = ``0.9``, type = double, constraints: ``0.0 <= refit_decay_rate <= 1.0``

@@ -770,7 +770,7 @@ Dataset Parameters

- ``enable_bundle`` :raw-html:`<a id="enable_bundle" title="Permalink to this parameter" href="#enable_bundle">&#x1F517;&#xFE0E;</a>`, default = ``true``, type = bool, aliases: ``is_enable_bundle``, ``bundle``

- - set this to ``false`` to disable Exclusive Feature Bundling (EFB), which is described in `LightGBM: A Highly Efficient Gradient Boosting Decision Tree <https://papers.nips.cc/paper/6907-lightgbm-a-highly-efficient-gradient-boosting-decision-tree>`__
+ - set this to ``false`` to disable Exclusive Feature Bundling (EFB), which is described in `LightGBM: A Highly Efficient Gradient Boosting Decision Tree <https://papers.nips.cc/paper_files/paper/2017/hash/6449f44a102fde848669bdd9eb6b76fa-Abstract.html>`__

- **Note**: disabling this may cause slow training speed for sparse datasets
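
An illustrative sketch (not part of this commit) of constructing a Dataset with Exclusive Feature Bundling disabled; the sparse matrix is hypothetical.

.. code:: python

    import numpy as np
    import scipy.sparse as sp
    import lightgbm as lgb

    X = sp.random(1000, 50, density=0.05, format="csr")  # hypothetical sparse features
    y = np.random.rand(1000)

    # enable_bundle=False turns off EFB when this Dataset is constructed.
    train_set = lgb.Dataset(X, label=y, params={"enable_bundle": False})
    booster = lgb.train({"objective": "regression", "verbose": -1}, train_set, num_boost_round=10)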

@@ -894,7 +894,7 @@ Dataset Parameters

- ``.json`` file should contain an array of objects, each containing the word ``feature`` (integer feature index) and ``bin_upper_bound`` (array of thresholds for binning)

- - see `this file <https://github.com/microsoft/LightGBM/tree/master/examples/regression/forced_bins.json>`__ as an example
+ - see `this file <https://github.com/microsoft/LightGBM/blob/master/examples/regression/forced_bins.json>`__ as an example

- ``save_binary`` :raw-html:`<a id="save_binary" title="Permalink to this parameter" href="#save_binary">&#x1F517;&#xFE0E;</a>`, default = ``false``, type = bool, aliases: ``is_save_binary``, ``is_save_binary_file``
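
A small sketch (not part of this commit) of writing a forced-bins file in the format described for ``forcedbins_filename`` above; the feature indices and thresholds are hypothetical.

.. code:: python

    import json

    # Array of objects, each with an integer feature index and an array
    # of bin upper bounds for that feature.
    forced_bins = [
        {"feature": 0, "bin_upper_bound": [0.1, 0.5, 0.9]},
        {"feature": 2, "bin_upper_bound": [10.0, 100.0]},
    ]
    with open("forced_bins.json", "w") as f:
        json.dump(forced_bins, f)

    params = {"forcedbins_filename": "forced_bins.json"}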

@@ -961,7 +961,7 @@ Predict Parameters

- produces ``#features + 1`` values where the last value is the expected value of the model output over the training data

- - **Note**: if you want to get more explanation for your model's predictions using SHAP values like SHAP interaction values, you can install `shap package <https://github.com/slundberg/shap>`__
+ - **Note**: if you want to get more explanation for your model's predictions using SHAP values like SHAP interaction values, you can install `shap package <https://github.com/shap>`__

- **Note**: unlike the shap package, with ``predict_contrib`` we return a matrix with an extra column, where the last column is the expected value
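
A minimal sketch (not part of this commit) of retrieving the per-feature contributions described above via the Python API's ``pred_contrib`` flag; the data is hypothetical.

.. code:: python

    import numpy as np
    import lightgbm as lgb

    X = np.random.rand(200, 5)
    y = np.random.rand(200)
    booster = lgb.train({"objective": "regression", "verbose": -1},
                        lgb.Dataset(X, label=y), num_boost_round=10)

    contrib = booster.predict(X, pred_contrib=True)
    # One column per feature plus a final column holding the expected value
    # of the model output over the training data.
    print(contrib.shape)  # (200, 6)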

2 changes: 1 addition & 1 deletion docs/Quick-Start.rst
@@ -85,4 +85,4 @@ Examples

.. _LibSVM: https://www.csie.ntu.edu.tw/~cjlin/libsvm/

- .. _Expo data: http://stat-computing.org/dataexpo/2009/
+ .. _Expo data: https://community.amstat.org/jointscsg-section/dataexpo/dataexpo2009
