
[MRG+1] use class_weight through sample_weight in LogisticRegression with liblinear #6817

Merged
merged 1 commit into scikit-learn:master on May 25, 2016

@TomDLT
Member

TomDLT commented May 24, 2016

I forgot to update #5008 after we merged #5274 (cf. #5008 (comment)).

It makes the class_weight behavior consistent across all solvers of LogisticRegression, by merging the class_weight into the sample_weight.

Fixes #6795
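An illustrative sketch (not the PR's actual code) of the idea: fold a class_weight mapping into per-sample weights, so that every solver, including liblinear, sees the same effective weighting through the sample_weight path. The helper name `fold_class_weight` is hypothetical.

```python
def fold_class_weight(y, class_weight, sample_weight=None):
    """Multiply each sample's weight by the weight of its class."""
    if sample_weight is None:
        sample_weight = [1.0] * len(y)
    # Classes absent from the mapping keep an implicit weight of 1.0.
    return [sw * class_weight.get(label, 1.0)
            for label, sw in zip(y, sample_weight)]

weights = fold_class_weight([0, 1, 1, 0], {0: 0.5, 1: 2.0})
```

With this folding done up front, the solver itself only ever needs to support sample_weight.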

"solver cannot handle multiclass with "
"class_weight of type dict. Use the lbfgs, "
"newton-cg or sag solvers or set "
"class_weight='balanced'")

@agramfort

agramfort May 25, 2016

Member

this is not true anymore?

@TomDLT

TomDLT May 25, 2016

Member

No. Since the liblinear solver now handles sample_weight (#5274), we can make the behavior identical for all of LogisticRegression's solvers by merging the class_weight into the sample_weight.

@agramfort agramfort changed the title from [MRG] use class_weight through sample_weight in LogisticRegression with liblinear to [MRG+1] use class_weight through sample_weight in LogisticRegression with liblinear May 25, 2016

@agramfort

Member

agramfort commented May 25, 2016

+1 for merge

for class_weight in ({0: 0.1, 1: 0.2, 2: 0.5}, 'balanced'):
    for n_classes in [3, 2]:
        if n_classes == 2 and class_weight != 'balanced':
            class_weight.pop(2)

@jnothman

jnothman May 25, 2016

Member

to avoid bugs if this were extended, can we please do class_weight = class_weight.copy() first?

@TomDLT

TomDLT May 25, 2016

Member

I have changed the loops to avoid modifications on class_weight.
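A minimal sketch of the restructuring being discussed, not the PR's actual test code: build a fresh copy of the shared dict on each iteration before dropping the third class, so that `pop` never mutates state that later iterations depend on.

```python
base_weight = {0: 0.1, 1: 0.2, 2: 0.5}
seen = []
for n_classes in [3, 2]:
    class_weight = dict(base_weight)  # fresh copy each iteration
    if n_classes == 2:
        class_weight.pop(2)           # safe: only the copy is mutated
    seen.append(class_weight)
```

If this loop were extended later, the shared `base_weight` would still be intact for every new branch.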

@jnothman

jnothman May 25, 2016

Member

I have changed the loops to avoid modifications on class_weight.

Yes, of course that's the sensible way to do it!

@@ -947,7 +947,8 @@ def test_warm_start():
                          solver=solver,
                          random_state=42, max_iter=100,
                          fit_intercept=fit_intercept)
        clf.fit(X, y)
        with ignore_warnings():

@jnothman

jnothman May 25, 2016

Member

what warning are we ignoring?

@TomDLT

TomDLT May 25, 2016

Member
  • first fit:
    ConvergenceWarning for SAG solver
  • second fit:
    ConvergenceWarning for SAG solver and UserWarning for Newton-CG solver (which actually should be a ConvergenceWarning)

@jnothman

jnothman May 25, 2016

Member

Seeing as ConvergenceWarning subclasses UserWarning, it would be nice to see that changed.
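The point can be demonstrated with a small self-contained sketch; the `ConvergenceWarning` below is a stand-in class mirroring scikit-learn's hierarchy, where ConvergenceWarning derives from UserWarning. A blanket filter on UserWarning therefore also silences the subclass, while emitting the specific class still lets callers who filter on ConvergenceWarning target exactly these warnings.

```python
import warnings

# Stand-in with the same hierarchy as scikit-learn's ConvergenceWarning.
class ConvergenceWarning(UserWarning):
    pass

with warnings.catch_warnings(record=True) as caught:
    # Ignoring the base class also ignores the subclass, so nothing
    # is recorded below.
    warnings.simplefilter("ignore", UserWarning)
    warnings.warn("newton-cg failed to converge", ConvergenceWarning)
```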

@jnothman

Member

jnothman commented May 25, 2016

Otherwise, LGTM

@jnothman

Member

jnothman commented May 25, 2016

would you like to add to whats_new, then squash?

@TomDLT

Member

TomDLT commented May 25, 2016

Done. I also added ConvergenceWarning in newton-cg solver.

@jnothman

Member

jnothman commented May 25, 2016

Let's do it.

@jnothman jnothman merged commit af171b8 into scikit-learn:master May 25, 2016

2 of 4 checks passed:
  • ci/circleci: CircleCI is running your tests
  • continuous-integration/appveyor/pr: Waiting for AppVeyor build to complete
  • continuous-integration/travis-ci/pr: The Travis CI build passed
  • coverage/coveralls: Coverage decreased (-0.0007%) to 94.488%

@TomDLT TomDLT deleted the TomDLT:logistic_class_weight branch Dec 20, 2016
