
include 1st weight in L2 loss #741

Merged
merged 1 commit into master from regularize on Oct 13, 2020

Conversation

@rasbt (Owner) commented on Oct 13, 2020

Description

The loss that LogisticRegression computes for logging purposes did not include the L2 penalty for the first weight in the weight vector (this weight is not the bias unit). Since this loss value is only used for logging and the gradient was already correct, the change has no effect on the fitted model.
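
For context, a minimal sketch of the kind of bug involved (the function and variable names below are illustrative, not mlxtend's actual internals): when the bias is stored separately from the weight vector, a penalty term written as `w[1:]` skips a real feature weight rather than the bias.

```python
import numpy as np

def logistic_loss(y, y_pred, w, l2_lambda):
    """Cross-entropy loss with an L2 penalty (illustrative sketch).

    `w` is assumed to hold only the feature weights, with the bias
    stored separately, so the penalty must cover all of `w`.
    """
    cross_entropy = -np.mean(
        y * np.log(y_pred) + (1.0 - y) * np.log(1.0 - y_pred)
    )
    # Buggy variant skipped the first weight:
    #   l2_penalty = l2_lambda * np.sum(w[1:] ** 2)
    l2_penalty = l2_lambda * np.sum(w ** 2)
    return cross_entropy + l2_penalty
```

Because the penalty enters only this logged value and not the gradient computation, the fix changes the reported loss but not the learned weights.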

rasbt merged commit 71942e1 into master on Oct 13, 2020
rasbt deleted the regularize branch on November 12, 2020 at 17:32