
Non-negative coefficients with lars #9837

Closed
anoopamagarwal opened this Issue Sep 27, 2017 · 10 comments

anoopamagarwal commented Sep 27, 2017

I am running the following piece of code:

import numpy as np
from sklearn.linear_model import Lars
import sklearn

flag = True
print(sklearn.__version__)
while flag:
    X = np.random.randn(10, 5)
    Y = np.random.randn(10)
    l = Lars(fit_intercept=False, positive=True)
    l.fit(X, Y)
    print(l.coef_)
    for i in l.coef_:
        if i < 0:
            flag = False

I am getting the following output:

0.19.0
[ 0.00000000e+00 0.00000000e+00 5.21722148e-01 3.10743837e-01
4.18778571e-04]
[ 0. 0. 0. 0.45541234 0.09542552]
[ 0.80960144 0.35327211 0.17399586 0.16972765 -0.02813728]

with a negative coefficient in the last row, despite positive=True.
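As an aside, the per-coefficient loop above can be replaced by a single NumPy reduction. A minimal sketch, using the coefficient vector from the last line of the output above:

```python
import numpy as np

# Coefficient vector from the last line of the output above.
coef = np.array([0.80960144, 0.35327211, 0.17399586, 0.16972765, -0.02813728])

# Vectorized replacement for the `for i in l.coef_` loop:
# True when any coefficient violates the positivity constraint.
has_negative = bool((coef < 0).any())
print(has_negative)  # → True
```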

jnothman commented Sep 27, 2017

anoopamagarwal commented Sep 27, 2017

Code snippet, now seeded for reproducibility:

import numpy as np
from sklearn.linear_model import Lars
import sklearn

flag = True
print(sklearn.__version__)
s = 100
while flag:
    np.random.seed(s)
    X = np.random.randn(10, 5)
    Y = np.random.randn(10)
    l = Lars(fit_intercept=False, positive=True)
    l.fit(X, Y)
    print(l.coef_)
    for i in l.coef_:
        if i < 0:
            flag = False
    s += 1

Output

0.19.0
[ 0.18760379  0.25412609  0.          0.          0.        ]
[ 0.          0.07212829  0.19939359  0.          0.0600625 ]
[ 0.          0.09812548  0.30880926  0.          0.21128258]
[ 0.          0.          0.          0.          0.25207725]
[ 0.21952703  0.07202002  0.44614742  0.          0.        ]
[ 0.03364774  0.          0.04648766  0.          0.        ]
[  2.66874987e-04   7.74587735e-01   0.00000000e+00   5.53611627e-01
   8.67075723e-01]
[ 0.          0.22650594  0.05970705  0.          0.24344585]
[ 0.06457963  0.          0.          0.03992725  0.        ]
[ 0.         -0.02069482  0.          0.47615768  1.25238536]

jnothman commented Sep 27, 2017

And what is s at the end?

anoopamagarwal commented Sep 27, 2017

@jnothman The random seed is incremented at every iteration.

anoopamagarwal commented Sep 27, 2017

@jnothman Sorry, I didn't understand your question earlier. s = 109 at the breaking point.

Code snippet

import numpy as np
from sklearn.linear_model import Lars
import sklearn

flag = True
print(sklearn.__version__)
s = 100
while flag:
    np.random.seed(s)
    X = np.random.randn(10, 5)
    Y = np.random.randn(10)
    l = Lars(fit_intercept=False, positive=True)
    l.fit(X, Y)
    print(l.coef_)
    for i in l.coef_:
        if i < 0:
            print(s)
            flag = False
    s += 1

Output

0.19.0
[ 0.18760379  0.25412609  0.          0.          0.        ]
[ 0.          0.07212829  0.19939359  0.          0.0600625 ]
[ 0.          0.09812548  0.30880926  0.          0.21128258]
[ 0.          0.          0.          0.          0.25207725]
[ 0.21952703  0.07202002  0.44614742  0.          0.        ]
[ 0.03364774  0.          0.04648766  0.          0.        ]
[  2.66874987e-04   7.74587735e-01   0.00000000e+00   5.53611627e-01
   8.67075723e-01]
[ 0.          0.22650594  0.05970705  0.          0.24344585]
[ 0.06457963  0.          0.          0.03992725  0.        ]
[ 0.         -0.02069482  0.          0.47615768  1.25238536]
109


jnothman commented Sep 27, 2017
So the minimal reproducing example is:

import numpy as np
from sklearn.linear_model import Lars

np.random.seed(109)
X = np.random.randn(10, 5)
Y = np.random.randn(10)
l = Lars(fit_intercept=False, positive=True)
l.fit(X, Y)
print(l.coef_)
assert np.all(l.coef_ >= 0)  # fails: one coefficient is negative

Thanks. I can reproduce, indeed.
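Until the LARS path handles the constraint correctly, one possible workaround for obtaining genuinely non-negative least-squares coefficients on this data is scipy.optimize.nnls, an active-set NNLS solver; this is an alternative solver, not the Lars API. A hedged sketch on the same seeded data:

```python
import numpy as np
from scipy.optimize import nnls

# Same seeded data as the minimal reproducing example above.
np.random.seed(109)
X = np.random.randn(10, 5)
Y = np.random.randn(10)

# nnls solves  min ||X w - Y||_2  subject to  w >= 0  (Lawson-Hanson),
# so the sign constraint holds by construction.
coef, residual = nnls(X, Y)
print(coef)
assert (coef >= 0).all()
```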


jnothman commented Sep 27, 2017

And the bug appears to have been present since positive was introduced.


VinodKumarLogan commented Nov 1, 2017

Hi @jnothman, can I work on this issue?


jnothman commented Nov 1, 2017

If you're confident you can debug it, you're more than welcome to!


agramfort commented Dec 4, 2017

see #10248
