Commit 8d0f2bf

[doc] Add more detailed explanations for advanced objectives (dmlc#10283)
Co-authored-by: Jiaming Yuan <jm.yuan@outlook.com>
1 parent 2266db1
File tree: 7 files changed (+760, -4 lines)

R-package/R/xgb.train.R

Lines changed: 12 additions & 0 deletions

@@ -102,6 +102,18 @@
 #' It might be useful, e.g., for modeling total loss in insurance, or for any outcome that might be
 #' \href{https://en.wikipedia.org/wiki/Tweedie_distribution#Applications}{Tweedie-distributed}.}
 #' }
+#'
+#' For custom objectives, one should pass a function taking as input the current predictions (as a numeric
+#' vector or matrix) and the training data (as an `xgb.DMatrix` object), returning a list with elements
+#' `grad` and `hess`. These should be numeric vectors or matrices with the number of rows matching the
+#' number of rows in the training data (the same shape as the predictions passed as input to the function).
+#' For multi-valued custom objectives, they should have shape `[nrows, ntargets]`. Note that negative values
+#' of the Hessian will be clipped, so one might consider using the expected Hessian (Fisher information) if the
+#' objective is non-convex.
+#'
+#' See the tutorials \href{https://xgboost.readthedocs.io/en/stable/tutorials/custom_metric_obj.html}{
+#' Custom Objective and Evaluation Metric} and \href{https://xgboost.readthedocs.io/en/stable/tutorials/advanced_custom_obj}{
+#' Advanced Usage of Custom Objectives} for more information about custom objectives.
 #' }
 #' \item \code{base_score} the initial prediction score of all instances, global bias. Default: 0.5
 #' \item{ \code{eval_metric} evaluation metrics for validation data.
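The `grad`/`hess` contract documented above can be sketched in Python (the R interface follows the same shape rules). This is a minimal, hypothetical squared-error objective; for illustration the labels are passed as a plain NumPy array, whereas a real custom objective would read them from the `xgb.DMatrix` (`getinfo(dtrain, "label")` in R, `dtrain.get_label()` in Python):

```python
import numpy as np


def squared_error_obj(predt: np.ndarray, labels: np.ndarray):
    """Hypothetical custom objective for the loss 0.5 * (predt - y)^2."""
    # Gradient of the loss w.r.t. the prediction is (predt - y);
    # the second derivative (Hessian diagonal) is constant 1.
    grad = predt - labels
    hess = np.ones_like(predt)
    # Both outputs have the same shape as the predictions: one value per
    # row of the training data (or [nrows, ntargets] for multi-valued
    # objectives). All Hessian values here are positive, so no clipping
    # would occur.
    return grad, hess


grad, hess = squared_error_obj(
    np.array([1.0, 2.0, 3.0]), np.array([1.5, 2.0, 0.0])
)
```
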

R-package/man/xgb.train.Rd

Lines changed: 12 additions & 0 deletions (generated file; diff not rendered)

demo/guide-python/custom_softmax.py

Lines changed: 6 additions & 3 deletions

@@ -6,7 +6,8 @@
 XGBoost returns transformed prediction for multi-class objective function. More details
 in comments.

-See :doc:`/tutorials/custom_metric_obj` for detailed tutorial and notes.
+See :doc:`/tutorials/custom_metric_obj` and :doc:`/tutorials/advanced_custom_obj` for
+detailed tutorial and notes.

 '''

@@ -39,7 +40,9 @@ def softmax(x):


 def softprob_obj(predt: np.ndarray, data: xgb.DMatrix):
-    '''Loss function. Computing the gradient and approximated hessian (diagonal).
+    '''Loss function. Computing the gradient and an upper bound on the
+    Hessian with a diagonal structure for XGBoost (note that this is
+    not the true Hessian).
     Reimplements the `multi:softprob` inside XGBoost.

     '''

@@ -61,7 +64,7 @@ def softprob_obj(predt: np.ndarray, data: xgb.DMatrix):

     eps = 1e-6

-    # compute the gradient and hessian, slow iterations in Python, only
+    # compute the gradient and hessian upper bound, slow iterations in Python, only
     # suitable for demo. Also the one in native XGBoost core is more robust to
     # numeric overflow as we don't do anything to mitigate the `exp` in
     # `softmax` here.
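For context on what the reworded docstring describes, the gradient and diagonal Hessian upper bound for the softprob objective can be sketched as follows. This is a simplified, NumPy-only version: the actual demo takes an `xgb.DMatrix` and reads labels from it, and the helper name here is hypothetical. For cross-entropy over softmax probabilities, the gradient w.r.t. a raw margin is `p - 1` for the true class and `p` otherwise; `2 * p * (1 - p)` is used as a diagonal bound on the Hessian, clipped away from zero:

```python
import numpy as np


def softmax(x: np.ndarray) -> np.ndarray:
    # Subtract the max before exponentiating for numeric stability.
    e = np.exp(x - np.max(x))
    return e / np.sum(e)


def softprob_grad_hess(predt: np.ndarray, labels: np.ndarray, eps: float = 1e-6):
    """Hypothetical sketch of the multi:softprob gradient statistics.

    predt: raw margins, shape [nrows, nclasses]
    labels: integer class ids, shape [nrows]
    """
    nrows, nclasses = predt.shape
    grad = np.zeros((nrows, nclasses))
    hess = np.zeros((nrows, nclasses))
    for i in range(nrows):
        p = softmax(predt[i, :])
        for c in range(nclasses):
            # Gradient of cross-entropy w.r.t. the margin of class c.
            grad[i, c] = p[c] - 1.0 if c == labels[i] else p[c]
            # Diagonal Hessian upper bound, clipped away from zero so
            # XGBoost never sees a vanishing second-order weight.
            hess[i, c] = max(2.0 * p[c] * (1.0 - p[c]), eps)
    return grad, hess


# With zero margins every class has probability 1/3, so the gradient of
# the true class is 1/3 - 1 = -2/3 and the bound is 2 * 1/3 * 2/3 = 4/9.
grad, hess = softprob_grad_hess(np.zeros((2, 3)), np.array([0, 2]))
```

Each row of `grad` sums to zero, as the softmax probabilities sum to one.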
