
Inconsistent results on different computers #166

Open
odedbd opened this issue Sep 3, 2017 · 28 comments

odedbd commented Sep 3, 2017

Hi,

I have recently updated the commit I am working with to #160. The commit I used to work with before the update was 30abe96 (Mar 31, 2016).

After updating, I noticed that in some cases the model fit by the Earth estimator can differ between two computers running the exact same script, even when both computers have identical virtual environments, as indicated by "pip freeze" generating the exact same output.

The trained estimator is consistent when the script is run repeatedly on the same computer. Also, the trained estimator can be consistent between two different computers, but this is not guaranteed. So far I have been unable to find any consistent difference between computers where the trained estimator ends up different, except perhaps for a difference between i7 and i5 CPUs. I have no idea whether this could be relevant in any way.

This inconsistency does not occur when using the 30abe96 commit.

I was able to replicate this by slightly modifying one of the examples from the py-earth documentation, as shown below. I have turned verbose output on, which shows that there are small differences between runs of this code on the two computers.

As you can see in the outputs from computer A and computer B below (both running Windows 8.1), there are a lot of similarities, but also some significant differences. The numbers calculated in the forward pass are identical for the first five terms but begin to differ, usually slightly, from the 6th term onward. The backward pass has different numbers for all the terms, and the selected iterations differ between the two cases, 21 vs. 24.

The final model basis displayed below also shows differences. On a private dataset (which I cannot share), the forward pass was identical, and the differences only began showing on the backward pass.

I'd be grateful for any ideas on what might cause such an inconsistency in the trained estimator. Thank you for your help.

The modified example code:

import numpy
from pyearth import Earth
from matplotlib import pyplot

# Create some fake data
numpy.random.seed(0)
m = 1000
n = 10
X = 80 * numpy.random.uniform(size=(m, n)) - 4 * numpy.pi
y = (X[:, 3] * (numpy.abs(X[:, 6] - numpy.pi) + 15 * numpy.random.gamma(2, size=m))
     + 2 * (X[:, 2] - 2) - 1 * (X[:, 7] + 3) ** 2)

# Fit an Earth model
model = Earth(verbose=1, max_terms=30, max_degree=1, allow_missing=False,
              penalty=3, endspan_alpha=None, endspan=None, minspan_alpha=None,
              minspan=None, thresh=0.0001, zero_tol=None, min_search_points=None,
              check_every=None, allow_linear=True, use_fast=True, fast_K=20,
              fast_h=1, smooth=None, enable_pruning=True)
model.fit(X, y)

# Print the model
print(model.trace())
print(model.summary())

The output I receive on computer A:

Beginning forward pass
---------------------------------------------------------------------------
iter  parent  var  knot  mse             terms  gcv          rsq    grsq   
---------------------------------------------------------------------------
0     -       -    -     4950518.324962  1      4960434.233  0.000  0.000  
1     0       7    311   2737418.709755  3      2770565.758  0.447  0.441  
2     0       3    -1    1022465.093864  4      1040071.158  0.793  0.790  
3     0       6    320   754542.772780  6      775335.527  0.848  0.844  
4     0       7    200   740470.232168  8      768647.106  0.850  0.845  
5     0       7    554   729236.351357  10     764757.622  0.853  0.846  
6     0       6    255   722135.459949  12     765126.178  0.854  0.846  
7     0       8    -1    720170.334629  13     766986.419  0.855  0.845  
8     0       8    405   717642.698760  15     772243.396  0.855  0.844  
9     0       8    661   714963.572360  17     777403.874  0.856  0.843  
10    0       2    -1    712990.074603  18     779315.897  0.856  0.843  
11    0       2    514   710075.084704  20     784308.080  0.857  0.842  
12    0       3    217   708491.421267  22     790848.617  0.857  0.841  
13    0       3    434   705014.947214  24     795348.884  0.858  0.840  
14    0       3    672   702076.013034  26     800513.337  0.858  0.839  
15    0       3    604   700045.813445  28     806790.441  0.859  0.837  
16    0       8    331   698728.157244  30     813986.866  0.859  0.836  
17    0       8    981   695971.429421  32     819597.703  0.859  0.835  
-------------------------------------------------------------------------
Stopping Condition 0: Reached maximum number of terms
Beginning pruning pass
------------------------------------------------------
iter  bf  terms  mse        gcv         rsq    grsq   
------------------------------------------------------
0     -   32     697629.17  821549.912  0.859  0.834  
1     7   31     696049.83  815260.480  0.859  0.836  
2     19  30     696087.18  810910.243  0.859  0.837  
3     20  29     696098.55  806564.867  0.859  0.837  
4     27  28     696093.12  802235.036  0.859  0.838  
5     24  27     696100.61  797954.747  0.859  0.839  
6     15  26     696100.61  793700.130  0.859  0.840  
7     9   25     696100.61  789479.451  0.859  0.841  
8     10  24     696100.61  785292.349  0.859  0.842  
9     13  23     696044.23  781075.201  0.859  0.843  
10    30  22     696071.59  776985.068  0.859  0.843  
11    28  21     696100.61  772928.979  0.859  0.844  
12    23  20     696100.61  768872.680  0.859  0.845  
13    16  19     696310.24  765078.566  0.859  0.846  
14    21  18     697463.55  762345.015  0.859  0.846  
15    26  17     698870.46  759905.290  0.859  0.847  
16    1   16     701506.92  758810.593  0.858  0.847  
17    18  15     704209.63  757788.296  0.858  0.847  
18    17  14     706125.72  755924.315  0.857  0.848  
19    14  13     710259.13  756430.911  0.857  0.848  
20    12  12     711204.04  753543.977  0.856  0.848  
21    3   11     716678.64  755451.425  0.855  0.848  
22    22  10     717283.22  752222.248  0.855  0.848  
23    29  9      722648.68  753983.483  0.854  0.848  
24    31  8      724040.81  751592.497  0.854  0.848  
25    4   7      730640.99  754594.851  0.852  0.848  
26    5   6      749443.92  770096.162  0.849  0.845  
27    8   5      770625.58  787863.245  0.844  0.841  
28    2   4      924877.06  940802.731  0.813  0.810  
29    11  3      1202850.60  1217415.760  0.757  0.755  
30    25  2      2977555.15  2998507.979  0.399  0.396  
31    6   1      4950518.32  4960434.233  0.000  0.000  
--------------------------------------------------------
Selected iteration: 24
Forward Pass
---------------------------------------------------------------------------
iter  parent  var  knot  mse             terms  gcv          rsq    grsq   
---------------------------------------------------------------------------
0     -       -    -     4950518.324962  1      4960434.233  0.000  0.000  
1     0       7    311   2737418.709755  3      2770565.758  0.447  0.441  
2     0       3    -1    1022465.093864  4      1040071.158  0.793  0.790  
3     0       6    320   754542.772780   6      775335.527   0.848  0.844  
4     0       7    200   740470.232168   8      768647.106   0.850  0.845  
5     0       7    554   729236.351357   10     764757.622   0.853  0.846  
6     0       6    255   722135.459949   12     765126.178   0.854  0.846  
7     0       8    -1    720170.334629   13     766986.419   0.855  0.845  
8     0       8    405   717642.698760   15     772243.396   0.855  0.844  
9     0       8    661   714963.572360   17     777403.874   0.856  0.843  
10    0       2    -1    712990.074603   18     779315.897   0.856  0.843  
11    0       2    514   710075.084704   20     784308.080   0.857  0.842  
12    0       3    217   708491.421267   22     790848.617   0.857  0.841  
13    0       3    434   705014.947214   24     795348.884   0.858  0.840  
14    0       3    672   702076.013034   26     800513.337   0.858  0.839  
15    0       3    604   700045.813445   28     806790.441   0.859  0.837  
16    0       8    331   698728.157244   30     813986.866   0.859  0.836  
17    0       8    981   695971.429421   32     819597.703   0.859  0.835  
---------------------------------------------------------------------------
Stopping Condition 0: Reached maximum number of terms
Pruning Pass
--------------------------------------------------------
iter  bf  terms  mse         gcv          rsq    grsq   
--------------------------------------------------------
0     -   32     697629.17   821549.912   0.859  0.834  
1     7   31     696049.83   815260.480   0.859  0.836  
2     19  30     696087.18   810910.243   0.859  0.837  
3     20  29     696098.55   806564.867   0.859  0.837  
4     27  28     696093.12   802235.036   0.859  0.838  
5     24  27     696100.61   797954.747   0.859  0.839  
6     15  26     696100.61   793700.130   0.859  0.840  
7     9   25     696100.61   789479.451   0.859  0.841  
8     10  24     696100.61   785292.349   0.859  0.842  
9     13  23     696044.23   781075.201   0.859  0.843  
10    30  22     696071.59   776985.068   0.859  0.843  
11    28  21     696100.61   772928.979   0.859  0.844  
12    23  20     696100.61   768872.680   0.859  0.845  
13    16  19     696310.24   765078.566   0.859  0.846  
14    21  18     697463.55   762345.015   0.859  0.846  
15    26  17     698870.46   759905.290   0.859  0.847  
16    1   16     701506.92   758810.593   0.858  0.847  
17    18  15     704209.63   757788.296   0.858  0.847  
18    17  14     706125.72   755924.315   0.857  0.848  
19    14  13     710259.13   756430.911   0.857  0.848  
20    12  12     711204.04   753543.977   0.856  0.848  
21    3   11     716678.64   755451.425   0.855  0.848  
22    22  10     717283.22   752222.248   0.855  0.848  
23    29  9      722648.68   753983.483   0.854  0.848  
24    31  8      724040.81   751592.497   0.854  0.848  
25    4   7      730640.99   754594.851   0.852  0.848  
26    5   6      749443.92   770096.162   0.849  0.845  
27    8   5      770625.58   787863.245   0.844  0.841  
28    2   4      924877.06   940802.731   0.813  0.810  
29    11  3      1202850.60  1217415.760  0.757  0.755  
30    25  2      2977555.15  2998507.979  0.399  0.396  
31    6   1      4950518.32  4960434.233  0.000  0.000  
--------------------------------------------------------
Selected iteration: 24
Earth Model
-------------------------------------
Basis Function  Pruned  Coefficient  
-------------------------------------
(Intercept)     No      72582.3      
h(x7-35.0812)   Yes     None         
h(35.0812-x7)   No      -42.4157     
x3              Yes     None         
h(x6-4.22723)   No      -1065.73     
h(4.22723-x6)   No      1113.05      
h(x7-1.00788)   No      -87.8541     
h(1.00788-x7)   Yes     None         
h(x7-53.3666)   No      -77.5691     
h(53.3666-x7)   Yes     None         
h(x6-66.34)     Yes     None         
h(66.34-x6)     No      -1094.65     
x8              Yes     None         
h(x8-51.3231)   Yes     None         
h(51.3231-x8)   Yes     None         
h(x8+6.2673)    Yes     None         
h(-6.2673-x8)   Yes     None         
x2              Yes     None         
h(x2+9.81187)   Yes     None         
h(-9.81187-x2)  Yes     None         
h(x3-66.4284)   Yes     None         
h(66.4284-x3)   Yes     None         
h(x3-63.2394)   Yes     None         
h(63.2394-x3)   Yes     None         
h(x3-65.8573)   Yes     None         
h(65.8573-x3)   No      -54.4066     
h(x3-62.6728)   Yes     None         
h(62.6728-x3)   Yes     None         
h(x8+11.7519)   Yes     None         
h(-11.7519-x8)  Yes     None         
h(x8+10.5919)   Yes     None         
h(-10.5919-x8)  Yes     None         
-------------------------------------
MSE: 724040.8075, GCV: 751592.4974, RSQ: 0.8537, GRSQ: 0.8485

The output I receive on computer B:

Beginning forward pass
---------------------------------------------------------------------------
iter  parent  var  knot  mse             terms  gcv          rsq    grsq   
---------------------------------------------------------------------------
0     -       -    -     4950518.324962  1      4960434.233  0.000  0.000  
1     0       7    311   2737418.709755  3      2770565.758  0.447  0.441  
2     0       3    -1    1022465.093864  4      1040071.158  0.793  0.790  
3     0       6    320   754542.772780  6      775335.527  0.848  0.844  
4     0       7    200   740470.232168  8      768647.106  0.850  0.845  
5     0       7    554   729236.351357  10     764757.622  0.853  0.846  
6     0       6    255   722316.228852  12     765317.709  0.854  0.846  
7     0       8    -1    720316.871586  13     767142.482  0.854  0.845  
8     0       8    405   717697.402167  15     772302.261  0.855  0.844  
9     0       8    661   715006.680289  17     777450.747  0.856  0.843  
10    0       2    -1    713035.003056  18     779365.005  0.856  0.843  
11    0       2    514   710128.220766  20     784366.771  0.857  0.842  
12    0       3    217   708664.271302  22     791041.559  0.857  0.841  
13    0       3    434   705142.169448  24     795492.407  0.858  0.840  
14    0       3    672   702191.404756  26     800644.908  0.858  0.839  
15    0       3    604   700187.073599  28     806953.241  0.859  0.837  
16    0       8    331   698867.906750  30     814149.668  0.859  0.836  
17    0       8    981   696100.611195  32     819749.831  0.859  0.835  
-------------------------------------------------------------------------
Stopping Condition 0: Reached maximum number of terms
Beginning pruning pass
------------------------------------------------------
iter  bf  terms  mse        gcv         rsq    grsq   
------------------------------------------------------
0     -   32     696100.61  819749.831  0.859  0.835  
1     16  31     696100.61  815319.957  0.859  0.836  
2     29  30     696100.61  810925.893  0.859  0.837  
3     9   29     696100.61  806567.256  0.859  0.837  
4     2   28     696100.61  802243.666  0.859  0.838  
5     20  27     696100.61  797954.747  0.859  0.839  
6     4   26     696100.61  793700.130  0.859  0.840  
7     17  25     696100.61  789479.451  0.859  0.841  
8     23  24     696100.61  785292.349  0.859  0.842  
9     27  23     696100.61  781138.470  0.859  0.843  
10    3   22     696100.61  777017.461  0.859  0.843  
11    14  21     696100.61  772928.979  0.859  0.844  
12    30  20     696100.61  768872.680  0.859  0.845  
13    15  19     696310.24  765078.566  0.859  0.846  
14    18  18     697149.10  762001.321  0.859  0.846  
15    21  17     698309.49  759295.329  0.859  0.847  
16    26  16     699773.34  756935.402  0.859  0.847  
17    13  15     703223.30  756726.922  0.858  0.847  
18    7   14     706929.75  756785.047  0.857  0.847  
19    12  13     711449.91  757699.103  0.856  0.847  
20    31  12     712267.50  754670.755  0.856  0.848  
21    28  11     714429.27  753080.363  0.856  0.848  
22    19  10     719584.21  754635.325  0.855  0.848  
23    22  9      724988.15  756424.389  0.854  0.848  
24    24  8      725762.72  753379.932  0.853  0.848  
25    10  7      732099.50  756101.172  0.852  0.848  
26    8   6      742110.69  762560.854  0.850  0.846  
27    5   5      758739.65  775711.443  0.847  0.844  
28    6   4      879885.89  895036.848  0.822  0.820  
29    11  3      1116514.74  1130034.475  0.774  0.772  
30    25  2      2804385.57  2824119.811  0.434  0.431  
31    1   1      4950518.32  4960434.233  0.000  0.000  
--------------------------------------------------------
Selected iteration: 21
Forward Pass
---------------------------------------------------------------------------
iter  parent  var  knot  mse             terms  gcv          rsq    grsq   
---------------------------------------------------------------------------
0     -       -    -     4950518.324962  1      4960434.233  0.000  0.000  
1     0       7    311   2737418.709755  3      2770565.758  0.447  0.441  
2     0       3    -1    1022465.093864  4      1040071.158  0.793  0.790  
3     0       6    320   754542.772780   6      775335.527   0.848  0.844  
4     0       7    200   740470.232168   8      768647.106   0.850  0.845  
5     0       7    554   729236.351357   10     764757.622   0.853  0.846  
6     0       6    255   722316.228852   12     765317.709   0.854  0.846  
7     0       8    -1    720316.871586   13     767142.482   0.854  0.845  
8     0       8    405   717697.402167   15     772302.261   0.855  0.844  
9     0       8    661   715006.680289   17     777450.747   0.856  0.843  
10    0       2    -1    713035.003056   18     779365.005   0.856  0.843  
11    0       2    514   710128.220766   20     784366.771   0.857  0.842  
12    0       3    217   708664.271302   22     791041.559   0.857  0.841  
13    0       3    434   705142.169448   24     795492.407   0.858  0.840  
14    0       3    672   702191.404756   26     800644.908   0.858  0.839  
15    0       3    604   700187.073599   28     806953.241   0.859  0.837  
16    0       8    331   698867.906750   30     814149.668   0.859  0.836  
17    0       8    981   696100.611195   32     819749.831   0.859  0.835  
---------------------------------------------------------------------------
Stopping Condition 0: Reached maximum number of terms

Pruning Pass
--------------------------------------------------------
iter  bf  terms  mse         gcv          rsq    grsq   
--------------------------------------------------------
0     -   32     696100.61   819749.831   0.859  0.835  
1     16  31     696100.61   815319.957   0.859  0.836  
2     29  30     696100.61   810925.893   0.859  0.837  
3     9   29     696100.61   806567.256   0.859  0.837  
4     2   28     696100.61   802243.666   0.859  0.838  
5     20  27     696100.61   797954.747   0.859  0.839  
6     4   26     696100.61   793700.130   0.859  0.840  
7     17  25     696100.61   789479.451   0.859  0.841  
8     23  24     696100.61   785292.349   0.859  0.842  
9     27  23     696100.61   781138.470   0.859  0.843  
10    3   22     696100.61   777017.461   0.859  0.843  
11    14  21     696100.61   772928.979   0.859  0.844  
12    30  20     696100.61   768872.680   0.859  0.845  
13    15  19     696310.24   765078.566   0.859  0.846  
14    18  18     697149.10   762001.321   0.859  0.846  
15    21  17     698309.49   759295.329   0.859  0.847  
16    26  16     699773.34   756935.402   0.859  0.847  
17    13  15     703223.30   756726.922   0.858  0.847  
18    7   14     706929.75   756785.047   0.857  0.847  
19    12  13     711449.91   757699.103   0.856  0.847  
20    31  12     712267.50   754670.755   0.856  0.848  
21    28  11     714429.27   753080.363   0.856  0.848  
22    19  10     719584.21   754635.325   0.855  0.848  
23    22  9      724988.15   756424.389   0.854  0.848  
24    24  8      725762.72   753379.932   0.853  0.848  
25    10  7      732099.50   756101.172   0.852  0.848  
26    8   6      742110.69   762560.854   0.850  0.846  
27    5   5      758739.65   775711.443   0.847  0.844  
28    6   4      879885.89   895036.848   0.822  0.820  
29    11  3      1116514.74  1130034.475  0.774  0.772  
30    25  2      2804385.57  2824119.811  0.434  0.431  
31    1   1      4950518.32  4960434.233  0.000  0.000  
--------------------------------------------------------
Selected iteration: 21

Earth Model
-------------------------------------
Basis Function  Pruned  Coefficient  
-------------------------------------
(Intercept)     No      4696.33      
h(x7-35.0812)   No      -62.1822     
h(35.0812-x7)   Yes     None         
x3              Yes     None         
h(x6-4.22723)   Yes     None         
h(4.22723-x6)   No      43.8207      
h(x7-1.00788)   No      -36.4057     
h(1.00788-x7)   Yes     None         
h(x7-53.3666)   No      -56.807      
h(53.3666-x7)   Yes     None         
h(x6-66.34)     No      -976.209     
h(66.34-x6)     No      -28.1518     
x8              Yes     None         
h(x8-51.3231)   Yes     None         
h(51.3231-x8)   Yes     None         
h(x8+6.2673)    Yes     None         
h(-6.2673-x8)   Yes     None         
x2              Yes     None         
h(x2+9.81187)   Yes     None         
h(-9.81187-x2)  No      -246.073     
h(x3-66.4284)   Yes     None         
h(66.4284-x3)   Yes     None         
h(x3-63.2394)   No      261.51       
h(63.2394-x3)   Yes     None         
h(x3-65.8573)   No      -1016.35     
h(65.8573-x3)   No      -53.5466     
h(x3-62.6728)   Yes     None         
h(62.6728-x3)   Yes     None         
h(x8+11.7519)   Yes     None         
h(-11.7519-x8)  Yes     None         
h(x8+10.5919)   Yes     None         
h(-10.5919-x8)  Yes     None         
-------------------------------------
MSE: 714429.2661, GCV: 753080.3626, RSQ: 0.8557, GRSQ: 0.8482

odedbd commented Sep 3, 2017

UPDATE:

I have also run the above script within a Linux Docker container on computer B and got yet another set of results from that.

I have also tested on a few other Windows systems (versions 8 and 10), and all of them gave the same results as computer B. All of these systems had identical virtual environments installed.

Docker results:

Beginning forward pass
---------------------------------------------------------------------------
iter  parent  var  knot  mse             terms  gcv          rsq    grsq
---------------------------------------------------------------------------
0     -       -    -     4950518.324962  1      4960434.233  0.000  0.000
1     0       7    311   2737418.709755  3      2770565.758  0.447  0.441
2     0       3    -1    1022465.093864  4      1040071.158  0.793  0.790
3     0       6    320   754542.772780  6      775335.527  0.848  0.844
4     0       7    200   740470.232168  8      768647.106  0.850  0.845
5     0       7    554   729236.351357  10     764757.622  0.853  0.846
6     0       6    255   722316.228852  12     765317.709  0.854  0.846
7     0       8    -1    720316.871586  13     767142.482  0.854  0.845
8     0       8    405   717697.402167  15     772302.261  0.855  0.844
9     0       8    661   715006.680289  17     777450.747  0.856  0.843
10    0       2    -1    713035.003056  18     779365.005  0.856  0.843
11    0       2    514   710128.220766  20     784366.771  0.857  0.842
12    0       3    217   708664.271302  22     791041.559  0.857  0.841
13    0       3    434   705142.169448  24     795492.407  0.858  0.840
14    0       3    672   702191.404756  26     800644.908  0.858  0.839
15    0       3    604   700187.073599  28     806953.241  0.859  0.837
16    0       8    331   698867.906750  30     814149.668  0.859  0.836
17    0       8    981   696100.611195  32     819749.831  0.859  0.835
-------------------------------------------------------------------------
Stopping Condition 0: Reached maximum number of terms
Beginning pruning pass
------------------------------------------------------
iter  bf  terms  mse        gcv         rsq    grsq
------------------------------------------------------
0     -   32     696116.41  819768.441  0.859  0.835
1     25  31     696092.61  815310.581  0.859  0.836
2     21  30     696057.43  810875.585  0.859  0.837
3     3   29     696068.76  806530.354  0.859  0.837
4     8   28     696059.83  802196.669  0.859  0.838
5     28  27     696100.61  797954.747  0.859  0.839
6     30  26     696100.61  793700.130  0.859  0.840
7     23  25     696093.48  789471.362  0.859  0.841
8     13  24     696091.01  785281.519  0.859  0.842
9     10  23     696100.61  781138.470  0.859  0.843
10    7   22     696100.61  777017.461  0.859  0.843
11    19  21     696100.61  772928.979  0.859  0.844
12    16  20     696100.61  768872.680  0.859  0.845
13    15  19     696310.24  765078.566  0.859  0.846
14    20  18     697463.55  762345.015  0.859  0.846
15    26  17     698630.89  759644.806  0.859  0.847
16    2   16     701010.53  758273.650  0.858  0.847
17    18  15     703779.53  757325.471  0.858  0.847
18    17  14     705854.29  755633.745  0.857  0.848
19    14  13     709946.75  756098.226  0.857  0.848
20    12  12     710939.41  753263.592  0.856  0.848
21    29  11     716650.97  755422.260  0.855  0.848
22    31  10     718198.07  753181.664  0.855  0.848
23    4   9      724033.26  755428.101  0.854  0.848
24    24  8      731824.11  759671.976  0.852  0.847
25    22  7      734070.76  758137.064  0.852  0.847
26    9   6      744699.11  765220.608  0.850  0.846
27    5   5      761453.33  778485.818  0.846  0.843
28    6   4      882015.09  897202.712  0.822  0.819
29    11  3      1120537.16  1134105.597  0.774  0.771
30    27  2      2804385.57  2824119.811  0.434  0.431
31    1   1      4950518.32  4960434.233  0.000  0.000
--------------------------------------------------------
Selected iteration: 22
Forward Pass
---------------------------------------------------------------------------
iter  parent  var  knot  mse             terms  gcv          rsq    grsq
---------------------------------------------------------------------------
0     -       -    -     4950518.324962  1      4960434.233  0.000  0.000
1     0       7    311   2737418.709755  3      2770565.758  0.447  0.441
2     0       3    -1    1022465.093864  4      1040071.158  0.793  0.790
3     0       6    320   754542.772780   6      775335.527   0.848  0.844
4     0       7    200   740470.232168   8      768647.106   0.850  0.845
5     0       7    554   729236.351357   10     764757.622   0.853  0.846
6     0       6    255   722316.228852   12     765317.709   0.854  0.846
7     0       8    -1    720316.871586   13     767142.482   0.854  0.845
8     0       8    405   717697.402167   15     772302.261   0.855  0.844
9     0       8    661   715006.680289   17     777450.747   0.856  0.843
10    0       2    -1    713035.003056   18     779365.005   0.856  0.843
11    0       2    514   710128.220766   20     784366.771   0.857  0.842
12    0       3    217   708664.271302   22     791041.559   0.857  0.841
13    0       3    434   705142.169448   24     795492.407   0.858  0.840
14    0       3    672   702191.404756   26     800644.908   0.858  0.839
15    0       3    604   700187.073599   28     806953.241   0.859  0.837
16    0       8    331   698867.906750   30     814149.668   0.859  0.836
17    0       8    981   696100.611195   32     819749.831   0.859  0.835
---------------------------------------------------------------------------
Stopping Condition 0: Reached maximum number of terms

Pruning Pass
--------------------------------------------------------
iter  bf  terms  mse         gcv          rsq    grsq
--------------------------------------------------------
0     -   32     696116.41   819768.441   0.859  0.835
1     25  31     696092.61   815310.581   0.859  0.836
2     21  30     696057.43   810875.585   0.859  0.837
3     3   29     696068.76   806530.354   0.859  0.837
4     8   28     696059.83   802196.669   0.859  0.838
5     28  27     696100.61   797954.747   0.859  0.839
6     30  26     696100.61   793700.130   0.859  0.840
7     23  25     696093.48   789471.362   0.859  0.841
8     13  24     696091.01   785281.519   0.859  0.842
9     10  23     696100.61   781138.470   0.859  0.843
10    7   22     696100.61   777017.461   0.859  0.843
11    19  21     696100.61   772928.979   0.859  0.844
12    16  20     696100.61   768872.680   0.859  0.845
13    15  19     696310.24   765078.566   0.859  0.846
14    20  18     697463.55   762345.015   0.859  0.846
15    26  17     698630.89   759644.806   0.859  0.847
16    2   16     701010.53   758273.650   0.858  0.847
17    18  15     703779.53   757325.471   0.858  0.847
18    17  14     705854.29   755633.745   0.857  0.848
19    14  13     709946.75   756098.226   0.857  0.848
20    12  12     710939.41   753263.592   0.856  0.848
21    29  11     716650.97   755422.260   0.855  0.848
22    31  10     718198.07   753181.664   0.855  0.848
23    4   9      724033.26   755428.101   0.854  0.848
24    24  8      731824.11   759671.976   0.852  0.847
25    22  7      734070.76   758137.064   0.852  0.847
26    9   6      744699.11   765220.608   0.850  0.846
27    5   5      761453.33   778485.818   0.846  0.843
28    6   4      882015.09   897202.712   0.822  0.819
29    11  3      1120537.16  1134105.597  0.774  0.771
30    27  2      2804385.57  2824119.811  0.434  0.431
31    1   1      4950518.32  4960434.233  0.000  0.000
--------------------------------------------------------
Selected iteration: 22

Earth Model
-------------------------------------
Basis Function  Pruned  Coefficient
-------------------------------------
(Intercept)     No      68908.6
h(x7-35.0812)   No      -62.4031
h(35.0812-x7)   Yes     None
x3              Yes     None
h(x6-4.22723)   No      -1003.84
h(4.22723-x6)   No      1049.26
h(x7-1.00788)   No      -76.8895
h(1.00788-x7)   Yes     None
h(x7-53.3666)   Yes     None
h(53.3666-x7)   No      -36.2088
h(x6-66.34)     Yes     None
h(66.34-x6)     No      -1032.24
x8              Yes     None
h(x8-51.3231)   Yes     None
h(51.3231-x8)   Yes     None
h(x8+6.2673)    Yes     None
h(-6.2673-x8)   Yes     None
x2              Yes     None
h(x2+9.81187)   Yes     None
h(-9.81187-x2)  Yes     None
h(x3-66.4284)   Yes     None
h(66.4284-x3)   Yes     None
h(x3-63.2394)   No      338.365
h(63.2394-x3)   Yes     None
h(x3-65.8573)   No      -1092.3
h(65.8573-x3)   Yes     None
h(x3-62.6728)   Yes     None
h(62.6728-x3)   No      -53.7473
h(x8+11.7519)   Yes     None
h(-11.7519-x8)  Yes     None
h(x8+10.5919)   Yes     None
h(-10.5919-x8)  Yes     None
-------------------------------------
MSE: 718198.0700, GCV: 753181.6636, RSQ: 0.8549, GRSQ: 0.8482


jcrudy commented Sep 5, 2017

@odedbd In general, py-earth will produce slightly different results on different machines. I believe it's due to differences in the underlying BLAS and LAPACK subroutines, but I don't know for sure. Are these differences causing any problems for you?


odedbd commented Sep 11, 2017

@jcrudy Thank you for getting back to me on this. I am not sure whether these differences are causing problems so much as a concern. We test our processes, including training, as part of a continuous deployment setup to catch regression bugs. These differences cause tests of the training process to break, since the testing machine is not necessarily identical to the development machine. Do you have any idea which parts of the algorithm are most susceptible to such BLAS/LAPACK differences?

One idea I had was to test with the earth R package, to see whether it also displays these kinds of differences between machines.


jcrudy commented Sep 13, 2017

@odedbd I haven't done much to analyze where machine-dependent differences occur. A little searching, though, indicates that in general floating point calculations can't be relied upon to yield identical results across machines, and the differences can depend on many aspects of both hardware and software. See this, for example. One way to ensure that results are at least very similar across machines is to give py-earth an easy problem to solve. That is, a problem in which the predictors are not very correlated with each other. That should at least help. When I run tests, I often make my assertions around model performance, which is generally much less sensitive than the exact terms and coefficients.

I would be curious to know to what extent this occurs in the R package, so please post if you do decide to test it out.
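
A minimal sketch of that testing style (my own illustration, not code from the thread; the dataset and the 0.95 threshold are assumptions):

import numpy
from pyearth import Earth
from sklearn.metrics import r2_score

# An easy problem: nearly uncorrelated predictors, strong signal.
numpy.random.seed(0)
X = numpy.random.uniform(size=(500, 5))
y = 2 * X[:, 0] - X[:, 1] + numpy.random.normal(scale=0.1, size=500)

model = Earth(max_degree=1)
model.fit(X, y)

# Assert on predictive performance rather than on exact terms and
# coefficients, which are sensitive to machine-dependent rounding.
assert r2_score(y, model.predict(X)) > 0.95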


odedbd commented Sep 20, 2017

@jcrudy I will update if we decide to test the R package, thanks.


ficerr commented Jul 30, 2018

Just wondering if anyone has any updates or ideas on resolving this issue; it would be great to know.


jcrudy commented Jul 31, 2018

@ficerr No updates, except to say that this should probably be considered expected behavior. Is there a problem it's causing for you? Perhaps I could suggest a workaround.

jcrudy added the wontfix label on Jul 31, 2018

ficerr commented Aug 1, 2018

@jcrudy Yes, it is an issue, as the difference in results from one PC to another is dramatic; it is not like a rounding error in the 10th decimal place. I am now trying to use rpy2 to run the R version of Earth and just use those results in Python, but I am not sure how much longer that will take, because there must be some communication overhead between Python and R, and, more importantly, I have not yet checked whether the results are the same across different PCs. What workaround do you mean? It would be great to know.


jcrudy commented Aug 1, 2018

@ficerr When you say that the difference is dramatic, is that a difference in the basis functions, the coefficients, or the predictions? While it is theoretically possible, given the stepwise nature of the algorithm, for small numerical differences to produce large differences in prediction, I've never seen it happen. If it's happening to you, I would be interested to know more about your data set if possible. I am also interested to know whether the R package gives you a similar problem. It's possible you've uncovered some bug or instability specific to py-earth, in which case it could be fixed.

In terms of speed, the overhead of communicating with R may be outweighed by the speed difference between the R package and py-earth, so your rpy2 implementation may actually be faster. However, it depends on your usage.

Regarding workarounds, it depends on specifically what differences you're seeing and what your goals are. I only mean that if I know those things, I may be able to suggest something (but maybe not).


ficerr commented Aug 2, 2018

@jcrudy Across different computers: basis functions are the same, and knots are the same (at least up to the 3rd decimal place; I do not know where to find the exact knot values) after the forward pass, but the coefficients are different and the selected basis functions (after the pruning pass) are different, i.e. the results of the pruning pass are not the same across different PCs.

Across Python and R: basis functions, knots, coefficients, and predictions are very different between the models, although I am not 100% sure the default settings are the same in Python and R (I called Earth in R the same way you did in https://gist.github.com/jcrudy/7221551).

R gives the same results across different PCs, so I am using the R solution now; however, it is much slower because of the communication overhead between Python and R. I will try to make my code around it run faster, but I doubt it will help, since profiling tells me that the longest part is the communication between Python and R (calling the R Earth function from Python takes up to 10 times longer than calling the Python Earth function).

My data are just two real-valued vectors, x and y, with 500 rows each. mean(x)=-1.30, std(x)=1.50, mean(y)=52.20, std(y)=4.30.


jcrudy commented Aug 3, 2018

@ficerr @odedbd found a similar instability with the pruning pass in #179. One workaround is to skip the pruning pass entirely, and instead put the Earth model in a scikit-learn Pipeline with a Lasso or similar model to select the best terms to use. See #159 for an example from @Fish-Soup.
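
A minimal sketch of that workaround (my own illustration, not the code from #159; the Lasso alpha is an assumption):

from pyearth import Earth
from sklearn.linear_model import Lasso
from sklearn.pipeline import Pipeline

# Skip py-earth's pruning pass entirely and let an L1 penalty select
# among the basis functions produced by the forward pass.
pipe = Pipeline([
    ('earth', Earth(max_degree=1, enable_pruning=False)),
    ('lasso', Lasso(alpha=1.0)),
])
# pipe.fit(X, y); pipe.predict(X)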

I currently consider this instability in the pruning pass to be not worth fixing, as so far I have not seen a case where it results in a model with substantially worse performance. However, you may have found that case. Can you tell me whether the predictions and predictive performance of your models differ much between machines? If so, I may change my mind (however, it still may be a long time before I can get around to attempting a fix).

I am a little concerned that the R model gave very different predictions from the Python model. Generally, predictions should not differ too much between implementations. Of course, there are adjustable parameters that may differ. If there's anything more you can share about that, I'd be interested.


ficerr commented Aug 7, 2018

@jcrudy Well, the effect of that instability depends on how sensitive you want your model to be. Regarding the differences across computers: for instance, sometimes I get a very small difference in the slopes of the lines, 1-2% or so, but sometimes it is 10% or more. For some data the MSE, GCV, RSQ, and GRSQ are almost identical or even identical (at least to the 4th decimal place), but sometimes there is a 2-3% difference. As a result, on different computers the fitted lines of the model (in the 2D case), and thus the forecasts, can look quite different, which makes results inconsistent across computers and impossible to reproduce (you use the same data and the same parameters but obtain different results). Imagine that instead of one line between knots at x=2 and x=5 with a slope of 0.8, a different pruning pass selects another knot in between (say at x=3), and instead of one line with slope 0.8 you obtain two lines with, say, slopes 25 and -30.

Actually, the most annoying part is that although the Earth model in R is stable and can be used from Python, it takes up to 10 times longer to run (due to the communication overhead between Python and R) than the Python one, as I mentioned earlier. So for anyone who needs to run the model many times on large datasets, the processing-time difference becomes too constraining.

So, thanks a lot for your effort and it would be really great if you could look at that instability issue.


jcrudy commented Aug 7, 2018

@ficerr Okay, I'm convinced it is at least desirable to fix this. Changing this from wontfix to bug. It will probably not get fixed by me any time soon, unfortunately. If anyone reading this wants to attempt it, I would be happy to advise.

jcrudy added the bug label and removed the wontfix label on Aug 7, 2018
@johnmrudolph

I also noticed this issue when fitting a model on my work vs. home computers. I've been working off the original example that @odedbd used. I'm able to reproduce the "bug" that gives different results from the pruning pass onward. I'm testing this in two environments: (1) my base Windows and (2) Windows Subsystem for Linux, both with Python 3.6.

I think the difference in the results of the pruning pass and the selected Earth model is being driven by numpy.linalg.lstsq in _pruning.pyx:

beta, mse_ = np.linalg.lstsq(B[:, 0:(basis_size)], weighted_y)[0:2]

beta, mse_ = np.linalg.lstsq(

It looks like numpy.linalg.lstsq will produce different results depending on machine precision if an explicit rcond is not provided. I can get py-earth to give consistent results if I set rcond to a coarse value like rcond=1e-3, but I assume the final model is less accurate if the precision is being truncated.
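
A small self-contained illustration of that sensitivity (my own sketch, not py-earth code; the matrix and thresholds are made up):

import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=100)
# Two nearly collinear columns plus an intercept make B ill-conditioned.
B = np.column_stack([a, a + 1e-10 * rng.normal(size=100), np.ones(100)])
y = B @ np.array([1.0, 2.0, 3.0]) + 0.01 * rng.normal(size=100)

# The default cutoff keeps the tiny singular value, so the solution is
# dominated by noise and can differ between machines; a coarser rcond
# zeroes it and yields a stable (truncated) solution.
beta_default = np.linalg.lstsq(B, y, rcond=None)[0]
beta_coarse = np.linalg.lstsq(B, y, rcond=1e-3)[0]
print(beta_default)
print(beta_coarse)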


jcrudy commented Jan 21, 2020

@johnmrudolph Thank you for this. If you get a chance, since you've got a reproducible test case in hand, could you see if the problem still exists in the v0.2dev branch? To install the branch, you should do:

git clone git@github.com:scikit-learn-contrib/py-earth.git
cd py-earth
git checkout v0.2dev
python setup.py --cythonize install

Results on this branch might be different because I changed the lstsq function from numpy to scipy. However, I don't have a test case.

@johnmrudolph

I pulled the v0.2dev branch and reran the test. I'm still getting different results in the pruning pass and the selected model. SciPy's lstsq has a different default rcond implementation, but it still appears to give different results in the pruning pass. As with numpy's lstsq, if I trim the precision then the Earth models produce the same results in the pruning pass and model selection.

I'll dig a little more into how the scipy lstsq estimates the default precision in both of the environments I'm testing. I'll also take a look at the input matrices to see if there are any differences in the values going into scipy lstsq. Any other ideas?


AresEkb commented Jun 20, 2020

Here is a test script: https://github.com/AresEkb/pyearth_issue166 It gives very different results on Windows and Linux machines (384 vs. -704 for the last period). I run such a script on thousands of similar datasets. On the Windows machine the predictions seem better; on the Linux machine a significant number of predictions are below zero. The problem seems to be with the intercept: on the Linux machine it is close to zero.

@johnmrudolph

Revisiting this issue. The inconsistent results between machines look to be caused by the least squares step in the pruning pass. The likely culprit is that the bx matrix generated by the forward pass is ill-conditioned, which makes the numpy least squares solution unstable across machines.

As I mentioned earlier in the thread, a quick and dirty way to fix this is to set the lstsq rcond (or cond in the SciPy implementation) to a coarse value like 1e-3, which forces small singular values of the bx matrix to 0 and makes it easier for lstsq to find a consistent solution. That said, allowing the user to set the rcond parameter is not a good fix, because the user has no way of knowing whether the bx matrix generated by the forward pass will be near-singular. The lstsq methods in numpy/scipy both assume that the coefficient matrix is full-rank. There is some good background discussion on this, and on how R's lm function handles it differently, in: Differences in Linear Regression in R and Python [closed]

A better solution would be to "fix" the bx matrix before fitting the least squares method in the pruning step, to improve stability. I'm not sure what the best approach would be. It looks like the R earth package has a routine in the underlying C code, called RegressAndFix, which drops any collinear terms in the bx matrix before running the backward pass. Alternatively, we could try to regularize or scale the bx matrix to see whether this makes the lstsq solution more stable. This would be a significant change to the existing py-earth MARS implementation, so I'm not sure it's worth pursuing.
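
For illustration, a sketch of one possible approach (my own, under the stated assumptions; RegressAndFix itself works differently in detail): use a column-pivoted QR factorization to identify numerically dependent columns of the basis matrix and drop them before the least squares fit.

import numpy as np
from scipy.linalg import qr

def independent_columns(B, tol=1e-7):
    """Indices of columns of B that are numerically linearly independent."""
    # Column pivoting orders columns so the diagonal of R is (roughly)
    # decreasing; near-dependent columns give near-zero diagonal entries.
    _, R, piv = qr(B, mode='economic', pivoting=True)
    diag = np.abs(np.diag(R))
    keep = diag > tol * diag[0]
    return np.sort(piv[keep])

# Usage: fit least squares only on the well-conditioned columns, e.g.
# cols = independent_columns(bx)
# beta = np.linalg.lstsq(bx[:, cols], y, rcond=None)[0]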


jcrudy commented Jun 29, 2020

@johnmrudolph Thank you for all this info. While in some cases the pruning pass results differ across machines, the example posted by @AresEkb seems to differ in the forward pass. I do think near-singular matrices at various steps may be the issue. The forward pass has the zero_tol argument as a threshold to detect linear dependence, and its default value may be too small. @AresEkb I would be interested to know whether adjusting zero_tol affects your example.

Apologies to everyone here for my relative absence from this thread and this problem. I have had little time available for this stuff lately, and that's likely to continue for a while. I would welcome any experiments or pull requests for this issue, and would be happy to advise on implementation details if anyone wants to work on it.


AresEkb commented Jun 29, 2020

@jcrudy Thanks a lot! I added information on a run with zero_tol=1e-7 to the README.md. It gives the same results on both machines now. It also seems that the quality of the model is better now.

For zero_tol=None
MSE: 2480.9588, GCV: 4410.5933, RSQ: 0.9703, GRSQ: 0.9556

For zero_tol=1e-7
MSE: 30.0431, GCV: 53.4099, RSQ: 0.9996, GRSQ: 0.9995
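
For reference, the only change involved relative to the script at the top of the thread is the zero_tol argument (a minimal sketch; the other parameters are left at their defaults):

from pyearth import Earth

# Explicit threshold for detecting linear dependence in the forward pass.
model = Earth(verbose=1, max_degree=1, zero_tol=1e-7)
model.fit(X, y)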

@johnmrudolph

@jcrudy I would be interested in trying a few experiments to attempt to resolve the inconsistent results in the pruning step. If you have any thoughts on the preferred approach for handling the near-singular matrices (assuming that this is the cause), let me know. Otherwise I will probably trial a few different methods and report back on which appears to be the best approach; this will probably take some time.

I didn't realize that @AresEkb was getting different results in the forward pass too, but it looks like that was resolved by setting a different zero_tol value. Does the potential for inconsistency in the forward pass still warrant additional investigation?


jcrudy commented Jun 29, 2020

@johnmrudolph That's awesome. I think the general direction proposed in your previous comment is right. The forward pass uses an iterative update strategy based on the QR decomposition to update the least squares solution when adding new terms to the model. A similar strategy, and much of the same code, could (I think) be used in the pruning pass, which would also allow zero_tol to be used in the same way. This would be a major change to the code, and it would certainly be less work to just modify the current pruning pass to handle (almost) linearly dependent columns better, but just in case, I'll share my notes on the math of the forward pass. Note that there's a mistake in the definition of the vector b. It should be b_i = pi * max(X_{i, c} - \phi, 0) instead of b_i = max(pi * X_{i, c} - \phi, 0). If you decide to look at these and need some additional context, let me know. There's plenty of stuff that's probably missing from those notes.

@AresEkb Thank you very much for those results. Based on those, I think we can consider this problem solved for the forward pass (at least until someone discovers a data set where the problem persists with the higher zero_tol). Remaining work might be to find the optimal zero_tol default value and update the code (and possibly test cases) accordingly. I'll add an issue for that. @johnmrudolph if you have any interest, please feel free to take over that issue.

notes_on_forward_pass.pdf


jcrudy commented Jun 29, 2020

@johnmrudolph FYI, I just had a quick look at those notes. Only the very first part is relevant. The really important part is that you can update a QR decomposition using Householder reflections, and that the lower-right component of the R factor gives the error of the least squares problem. The code for building and updating the QR factorization is in _qr.pyx, and there are tests in test_qr.py. Again, it's not necessary to even look at that stuff; I just want to point you at it in case you decide to investigate that route.
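
A small numpy illustration of that last point (my own sketch; py-earth's _qr.pyx maintains the factorization incrementally rather than recomputing it): for a QR factorization of the augmented matrix [B | y], the magnitude of the bottom-right entry of R equals the residual norm of the least squares problem.

import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(size=(50, 4))
y = rng.normal(size=50)

# QR of the augmented matrix [B | y]; the last diagonal entry of R is
# (up to sign) the norm of the least squares residual.
R = np.linalg.qr(np.column_stack([B, y]), mode='r')
sse_from_qr = R[-1, -1] ** 2

# Compare with the residual sum of squares reported by lstsq.
_, res, _, _ = np.linalg.lstsq(B, y, rcond=None)
assert np.isclose(sse_from_qr, res[0])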

@johnmrudolph

Thanks for the background information, @jcrudy. I'll take a closer look at the .pyx files that handle the forward pass and QR factorization. I've mostly focused on the pruning step, so I will need to familiarize myself with this part of the code base. Your idea of repurposing the QR factorization seems like a good approach to start with. I'll check back after investigating whether it's a good path forward.

@johnmrudolph

Checking back in on this. I've had a chance to go through the forward pass in a little more detail to get a better understanding of how QR factorization is applied in that step. The general idea is that we could update the pruning pass to leverage QR factorization in a similar way to the forward pass, instead of using lstsq, which appears to yield inconsistent results when there are near-singular columns.

@jcrudy I could use a little more guidance on how to leverage the existing code base for solving the least squares problem using UpdatingQT. From _knot_search.pyx, it seems we might be able to repurpose SingleWeightDependentData and SingleOutcomeDependentData to compute the SSE, since SingleOutcomeDependentData already has a built-in method to compute SSE using the QR factorization from UpdatingQT, but maybe there is a good reason we can't go that route.

Maybe there is a more direct way to compute the SSE from UpdatingQT, but I'm having trouble figuring out how we could implement this, or where in the existing code base this method for solving the least squares problem resides, if it isn't in _knot_search.pyx.


SandraSukarieh commented Nov 17, 2022

I have the same issue, but with a different method, so the zero_tol change won't help. I am using sklearn's LinearRegression for a time-series forecasting application:

regr = LinearRegression()
regr.fit(x, y)

Since there is no attribute to set the seed when the class object is initialized and the fit function is called, I am setting the NumPy random seed to a fixed value:

np.random.seed(123)

This seed is set globally in the first entry point of the application.

The problem is that the forecasting results differ between machines, even though we are running inside a Docker image that depends on a Pipfile, so all the dependencies have the same versions, and the Python version is the same as well (3.9):

numpy = "==1.22.3"
scikit-learn = "==1.1.1"

and from the Pipfile.lock:

"scipy": {
            "hashes": [
                "sha256:06d2e1b4c491dc7d8eacea139a1b0b295f74e1a1a0f704c375028f8320d16e31",
                "sha256:0d54222d7a3ba6022fdf5773931b5d7c56efe41ede7f7128c7b1637700409108",
                "sha256:1884b66a54887e21addf9c16fb588720a8309a57b2e258ae1c7986d4444d3bc0",
                "sha256:1a72d885fa44247f92743fc20732ae55564ff2a519e8302fb7e18717c5355a8b",
                "sha256:2318bef588acc7a574f5bfdff9c172d0b1bf2c8143d9582e05f878e580a3781e",
                "sha256:4db5b30849606a95dcf519763dd3ab6fe9bd91df49eba517359e450a7d80ce2e",
                "sha256:545c83ffb518094d8c9d83cce216c0c32f8c04aaf28b92cc8283eda0685162d5",
                "sha256:5a04cd7d0d3eff6ea4719371cbc44df31411862b9646db617c99718ff68d4840",
                "sha256:5b88e6d91ad9d59478fafe92a7c757d00c59e3bdc3331be8ada76a4f8d683f58",
                "sha256:68239b6aa6f9c593da8be1509a05cb7f9efe98b80f43a5861cd24c7557e98523",
                "sha256:83b89e9586c62e787f5012e8475fbb12185bafb996a03257e9675cd73d3736dd",
                "sha256:83c06e62a390a9167da60bedd4575a14c1f58ca9dfde59830fc42e5197283dab",
                "sha256:90453d2b93ea82a9f434e4e1cba043e779ff67b92f7a0e85d05d286a3625df3c",
                "sha256:abaf921531b5aeaafced90157db505e10345e45038c39e5d9b6c7922d68085cb",
                "sha256:b41bc822679ad1c9a5f023bc93f6d0543129ca0f37c1ce294dd9d386f0a21096",
                "sha256:c68db6b290cbd4049012990d7fe71a2abd9ffbe82c0056ebe0f01df8be5436b0",
                "sha256:cff3a5295234037e39500d35316a4c5794739433528310e117b8a9a0c76d20fc",
                "sha256:d01e1dd7b15bd2449c8bfc6b7cc67d630700ed655654f0dfcf121600bad205c9",
                "sha256:d644a64e174c16cb4b2e41dfea6af722053e83d066da7343f333a54dae9bc31c",
                "sha256:da8245491d73ed0a994ed9c2e380fd058ce2fa8a18da204681f2fe1f57f98f95",
                "sha256:fbc5c05c85c1a02be77b1ff591087c83bc44579c6d2bd9fb798bb64ea5e1a027"
            ],
            "markers": "python_version >= '3.8'",
            "version": "==1.9.3

The differences in the forecasted values are also significant for our application. Example:

  • Locally with Docker on Windows 10:
"balanceForecasting": [
        {
            "balanceDay": "2021-06-03",
            "balanceOut": 8499.004637178892
        },
        {
            "balanceDay": "2021-06-04",
            "balanceOut": 140277.5941793727
        },
        {
            "balanceDay": "2021-06-05",
            "balanceOut": 120714.62511521135
        },
        {
            "balanceDay": "2021-06-06",
            "balanceOut": 115484.57147830987
        }
]
  • CICD test pipeline on cloud environment (Azure) with the same Docker settings:
"balanceForecasting": [
        {
            "balanceDay": "2021-06-03",
            "balanceOut": 12693.62272062352,
        },
        {
            "balanceDay": "2021-06-04",
            "balanceOut": 41278.34888875554,
        },
        {
            "balanceDay": "2021-06-05",
            "balanceOut": 26454.121022592182,
        },
        {
            "balanceDay": "2021-06-06",
            "balanceOut": 33658.93825553794,
        }
]

As per my understanding, setting the seed to a specific number should ensure reproducibility, but is there anything I am missing here? Or do I need to set the seed in some other way?

Can anyone help with this?

@kevin-dietz

@SandraSukarieh This is a known bug. Others have suggested that setting rcond or zero_tol to a very low value like 1e-7 helps with the reproducibility problem. But this project is largely abandoned, so don't expect a fix for this issue.

@kikogomez

Could it have something to do with the processor's AVX and FMA instruction sets? The results on an older PC with AVX and a newer one with AVX2 differ. I have checked the numpy build info with np.show_config():

The numpy info in the older one states:

Supported SIMD extensions in this NumPy install:
baseline = SSE,SSE2,SSE3
found = SSSE3,SSE41,POPCNT,SSE42,AVX,F16C
not found = FMA3,AVX2,AVX512F,AVX512CD,AVX512_SKX,AVX512_CLX,AVX512_CNL,AVX512_ICL

While the newer one states:
Supported SIMD extensions in this NumPy install:
baseline = SSE,SSE2,SSE3
found = SSSE3,SSE41,POPCNT,SSE42,AVX,F16C,FMA3,AVX2
not found = AVX512F,AVX512CD,AVX512_SKX,AVX512_CLX,AVX512_CNL,AVX512_ICL

Both work with OpenBLAS, no MKL.
