Merge pull request #48 from orchardbirds/add_log_cosh
add log cosh error closes #47
orchardbirds committed Mar 16, 2021
2 parents 96d7c87 + 62599e4 commit 85de733
Showing 5 changed files with 99 additions and 1 deletion.
28 changes: 28 additions & 0 deletions bokbokbok/eval_metrics/regression/regression_eval_metrics.py
@@ -61,3 +61,31 @@ def root_mean_squared_log_error(yhat, dtrain, XGBoost=XGBoost):
return 'RMSLE', float(np.sqrt(np.sum(elements) / len(y))), False

return root_mean_squared_log_error


def LogCoshMetric(XGBoost=False):
"""
Calculates the Log Cosh Error
Args:
XGBoost (Bool): Set to True if using XGBoost; LightGBM is assumed by default.
Note that you should also set `maximize=False` in the XGBoost train function.
"""
def log_cosh_error(yhat, dtrain, XGBoost=XGBoost):
"""
Log Cosh Error.
yhat: Predictions
dtrain: The XGBoost / LightGBM dataset
XGBoost (Bool): If XGBoost is to be implemented
"""

y = dtrain.get_label()
elements = np.log(np.cosh(yhat - y))
if XGBoost:
return 'LogCosh', float(np.sum(elements) / len(y))
else:
return 'LogCosh', float(np.sum(elements) / len(y)), False

return log_cosh_error
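
For context (not part of this commit), a minimal usage sketch of the new eval metric follows. It assumes LightGBM is installed and that the metric is importable from the module shown above; the import path and data are illustrative only.

import lightgbm as lgb
import numpy as np
from bokbokbok.eval_metrics.regression import LogCoshMetric  # assumed import path

X, y = np.random.rand(100, 5), np.random.rand(100)
train_set = lgb.Dataset(X, y)

booster = lgb.train(
    params={"objective": "regression", "verbose": -1},
    train_set=train_set,
    valid_sets=[train_set],
    feval=LogCoshMetric(),  # returns ('LogCosh', value, False), as LightGBM expects
)

With XGBoost, pass `LogCoshMetric(XGBoost=True)` instead and set `maximize=False`, as the docstring notes.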
56 changes: 56 additions & 0 deletions bokbokbok/loss_functions/regression/regression_loss_functions.py
@@ -58,3 +58,59 @@ def squared_log_loss(
return grad, hess

return squared_log_loss


def LogCoshLoss():
"""
"""

def _gradient(yhat, dtrain):
"""Compute the log cosh gradient.
Args:
yhat (np.array): Predictions
dtrain: The XGBoost / LightGBM dataset
Returns:
log cosh gradient
"""

y = dtrain.get_label()
return np.tanh(yhat - y)

def _hessian(yhat, dtrain):
"""Compute the log cosh hessian.
Args:
yhat (np.array): Predictions
dtrain: The XGBoost / LightGBM dataset
Returns:
log cosh Hessian
"""

y = dtrain.get_label()
return 1. / np.power(np.cosh(yhat - y), 2)

def log_cosh_loss(
yhat,
dtrain
):
"""
Calculate gradient and hessian for log cosh loss.
Args:
yhat (np.array): Predictions
dtrain: The XGBoost / LightGBM dataset
Returns:
grad: log cosh loss gradient
hess: log cosh loss Hessian
"""
grad = _gradient(yhat, dtrain)

hess = _hessian(yhat, dtrain)

return grad, hess

return log_cosh_loss
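
Again for context (not part of this commit), a minimal sketch of wiring the loss into training, with the same assumed import paths; in LightGBM releases of this era the callable objective is passed via `fobj` (newer releases accept it through `params["objective"]` instead).

import lightgbm as lgb
import numpy as np
from bokbokbok.loss_functions.regression import LogCoshLoss  # assumed import path
from bokbokbok.eval_metrics.regression import LogCoshMetric  # assumed import path

X, y = np.random.rand(200, 5), np.random.rand(200)
train_set = lgb.Dataset(X, y)

booster = lgb.train(
    params={"verbose": -1},
    train_set=train_set,
    valid_sets=[train_set],
    fobj=LogCoshLoss(),     # supplies (grad, hess) each boosting round
    feval=LogCoshMetric(),  # reports the matching LogCosh eval metric
)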
13 changes: 13 additions & 0 deletions docs/derivations/log_cosh.md
@@ -0,0 +1,13 @@
## Log Cosh Error

The equation for Log Cosh Error is:

<img src="https://latex.codecogs.com/svg.latex?L_{LC}&space;=&space;\log(\cosh(\hat{y}&space;-&space;y))" title="L_{LC} = \log(\cosh(\hat{y} - y))" />

Differentiating with respect to the prediction, and using that the derivative of log(cosh(x)) is tanh(x), we obtain the Gradient:

<img src="https://latex.codecogs.com/svg.latex?G_{LC}&space;=&space;\tanh(\hat{y}&space;-&space;y)" title="G_{LC} = \tanh(\hat{y} - y)" />

Differentiating once more, and using that the derivative of tanh(x) is 1/cosh^2(x), we obtain the Hessian:

<img src="https://latex.codecogs.com/svg.latex?H_{LC}&space;=&space;\frac{1}{\cosh^{2}(\hat{y}&space;-&space;y)}" title="H_{LC} = \frac{1}{\cosh^{2}(\hat{y} - y)}" />
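
As a quick sanity check (not part of the committed docs), the Gradient and Hessian above can be compared against finite differences; the snippet below is illustrative only.

import numpy as np

y = np.array([0.0, 1.5, 1.0])
yhat = np.array([0.5, 2.0, -1.0])
eps = 1e-5

def loss(p):
    return np.log(np.cosh(p - y))   # per-sample L_LC

grad = np.tanh(yhat - y)            # analytic Gradient
hess = 1.0 / np.cosh(yhat - y) ** 2 # analytic Hessian

# central finite differences should agree with the analytic expressions
num_grad = (loss(yhat + eps) - loss(yhat - eps)) / (2 * eps)
num_hess = (loss(yhat + eps) - 2 * loss(yhat) + loss(yhat - eps)) / eps ** 2

assert np.allclose(grad, num_grad, atol=1e-8)
assert np.allclose(hess, num_hess, atol=1e-4)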
2 changes: 1 addition & 1 deletion docs/derivations/sle.md
@@ -1,6 +1,6 @@
## Squared Log Error

The equations for Squared Log Error is:
The equation for Squared Log Error is:

<img src="https://latex.codecogs.com/svg.latex?L_{SLE}&space;=&space;\frac{1}{2}(\log(\hat{y}&space;&plus;&space;1)&space;-&space;\log(y&space;&plus;&space;1))^{2}" title="L_{SLE} = \frac{1}{2}(\log(\hat{y} + 1) - \log(y + 1))^{2}" />

1 change: 1 addition & 0 deletions mkdocs.yml
@@ -17,6 +17,7 @@ nav:
- Weighted Cross Entropy: derivations/wce.md
- Focal Loss: derivations/focal.md
- Squared Log Error: derivations/sle.md
- Log Cosh Error: derivations/log_cosh.md

plugins:
- mkdocstrings:
