include 1st weight in L2 loss
rasbt committed Oct 13, 2020 · 1 parent 9ab8db7 · commit 0b1e431
Showing 2 changed files with 2 additions and 2 deletions.
docs/sources/CHANGELOG.md (1 addition, 1 deletion)
@@ -30,7 +30,7 @@ The CHANGELOG for the current development version is available at

##### Bug Fixes

- - -
+ - The loss computed in `LogisticRegression` for logging purposes didn't include the L2 penalty for the first weight in the weight vector (which is not the bias unit). Since this loss was used only for logging and the gradient remained correct, model fitting was unaffected. ([#741](https://github.com/rasbt/mlxtend/pull/741))


### Version 0.17.3 (07-27-2020)
Expand Down
mlxtend/classifier/logistic_regression.py (1 addition, 1 deletion)
@@ -146,7 +146,7 @@ def predict_proba(self, X):
      def _logit_cost(self, y, y_val):
          logit = -y.dot(np.log(y_val)) - ((1 - y).dot(np.log(1 - y_val)))
          if self.l2_lambda:
-             l2 = self.l2_lambda / 2.0 * np.sum(self.w_[1:]**2)
+             l2 = self.l2_lambda / 2.0 * np.sum(self.w_**2)
              logit += l2
          return logit
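The effect of the one-line change can be sketched in isolation. This is a minimal, standalone illustration (the weight values and `l2_lambda` below are made up, not from mlxtend): since mlxtend's `LogisticRegression` stores the bias separately from the weight vector `w_`, slicing `w_[1:]` wrongly dropped the first real weight from the logged L2 term.

```python
import numpy as np

# Illustrative values only; w_ stands in for the model's weight vector
# (the bias unit is stored separately, so no element here is the bias).
w_ = np.array([0.5, -1.0, 2.0])
l2_lambda = 0.1

# Before the fix: the first weight w_[0] is excluded from the penalty.
l2_old = l2_lambda / 2.0 * np.sum(w_[1:] ** 2)

# After the fix: every weight contributes to the penalty.
l2_new = l2_lambda / 2.0 * np.sum(w_ ** 2)

print(l2_old)  # 0.25
print(l2_new)  # 0.2625
```

The two values differ by `l2_lambda / 2 * w_[0]**2`, which is exactly the term the old code omitted; only the logged loss value changes, since the gradient used for the weight updates was already correct.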

