
# Non-negative coefficients with lars #9837

Closed
opened this issue Sep 27, 2017 · 10 comments
### anoopamagarwal commented Sep 27, 2017 • edited by TomDLT

I am running this piece of code:

```python
import numpy as np
from sklearn.linear_model import Lars
import sklearn

flag = True
print(sklearn.__version__)
while flag:
    X = np.random.randn(10, 5)
    Y = np.random.randn(10)
    l = Lars(fit_intercept=False, positive=True)
    l.fit(X, Y)
    print(l.coef_)
    for i in l.coef_:
        if i < 0:
            flag = False
```

and I am getting the following output, with negative coefficients:

```
0.19.0
[  0.00000000e+00   0.00000000e+00   5.21722148e-01   3.10743837e-01   4.18778571e-04]
[ 0.          0.          0.          0.45541234  0.09542552]
[ 0.80960144  0.35327211  0.17399586  0.16972765 -0.02813728]
```

### jnothman commented Sep 27, 2017

Thanks for the report. Please make this do `np.random.seed(i)` for iteration `i`, and report which `i` it breaks on, so that the issue is easily replicated.

### anoopamagarwal commented Sep 27, 2017 • edited by TomDLT

Code snippet:

```python
import numpy as np
from sklearn.linear_model import Lars
import sklearn

flag = True
print(sklearn.__version__)
s = 100
while flag:
    np.random.seed(s)
    X = np.random.randn(10, 5)
    Y = np.random.randn(10)
    l = Lars(fit_intercept=False, positive=True)
    l.fit(X, Y)
    print(l.coef_)
    for i in l.coef_:
        if i < 0:
            flag = False
    s += 1
```

Output:

```
0.19.0
[ 0.18760379  0.25412609  0.          0.          0.        ]
[ 0.          0.07212829  0.19939359  0.          0.0600625 ]
[ 0.          0.09812548  0.30880926  0.          0.21128258]
[ 0.          0.          0.          0.          0.25207725]
[ 0.21952703  0.07202002  0.44614742  0.          0.        ]
[ 0.03364774  0.          0.04648766  0.          0.        ]
[  2.66874987e-04   7.74587735e-01   0.00000000e+00   5.53611627e-01   8.67075723e-01]
[ 0.          0.22650594  0.05970705  0.          0.24344585]
[ 0.06457963  0.          0.          0.03992725  0.        ]
[ 0.         -0.02069482  0.          0.47615768  1.25238536]
```

### jnothman commented Sep 27, 2017

And what is `s` at the end?

### anoopamagarwal commented Sep 27, 2017

@jnothman The random seed is incremented at every iteration.

### anoopamagarwal commented Sep 27, 2017

@jnothman Sorry, I didn't understand your question earlier. `s = 109` at the breaking point. Code snippet:

```python
import numpy as np
from sklearn.linear_model import Lars
import sklearn

flag = True
print(sklearn.__version__)
s = 100
while flag:
    np.random.seed(s)
    X = np.random.randn(10, 5)
    Y = np.random.randn(10)
    l = Lars(fit_intercept=False, positive=True)
    l.fit(X, Y)
    print(l.coef_)
    for i in l.coef_:
        if i < 0:
            print(s)
            flag = False
    s += 1
```

Output:

```
0.19.0
[ 0.18760379  0.25412609  0.          0.          0.        ]
[ 0.          0.07212829  0.19939359  0.          0.0600625 ]
[ 0.          0.09812548  0.30880926  0.          0.21128258]
[ 0.          0.          0.          0.          0.25207725]
[ 0.21952703  0.07202002  0.44614742  0.          0.        ]
[ 0.03364774  0.          0.04648766  0.          0.        ]
[  2.66874987e-04   7.74587735e-01   0.00000000e+00   5.53611627e-01   8.67075723e-01]
[ 0.          0.22650594  0.05970705  0.          0.24344585]
[ 0.06457963  0.          0.          0.03992725  0.        ]
[ 0.         -0.02069482  0.          0.47615768  1.25238536]
109
```

### jnothman commented Sep 27, 2017

So the minimal reproducing example is:

```python
import numpy as np
from sklearn.linear_model import Lars

np.random.seed(109)
X = np.random.randn(10, 5)
Y = np.random.randn(10)
l = Lars(fit_intercept=False, positive=True)
l.fit(X, Y)
print(l.coef_)
assert np.all(l.coef_ >= 0)
```

Thanks. I can reproduce, indeed.

### jnothman commented Sep 27, 2017

 And the bug appears to have been present since `positive` was introduced.
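For anyone who needs strictly non-negative coefficients in the meantime, a direct non-negative least-squares solve (e.g. `scipy.optimize.nnls`) sidesteps the LARS path entirely. This is a workaround sketch, not the fix for `Lars` itself; it drops the path-following behaviour and solves the plain NNLS problem on the same data:

```python
import numpy as np
from scipy.optimize import nnls

# Same data as the minimal reproducing example above.
np.random.seed(109)
X = np.random.randn(10, 5)
Y = np.random.randn(10)

# nnls solves  min ||X w - Y||_2  subject to  w >= 0,
# so the constraint is enforced by construction.
coef, residual_norm = nnls(X, Y)
print(coef)
assert np.all(coef >= 0)
```

Unlike `Lars(positive=True)`, this gives no regularization path, so it is only a drop-in replacement when the final non-negative fit is all that is needed.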

### VinodKumarLogan commented Nov 1, 2017

 Hi @jnothman, Can I work on this issue?

### jnothman commented Nov 1, 2017

 If you're confident you can debug it, you're more than welcome to!

### agramfort commented Dec 4, 2017

 see #10248