I'm using numpy-1.16.6 with openblas-0.3.13, which causes a segmentation fault when importing. I'm not sure whether it is numpy- or openblas-related, but what I found is that openblas-0.3.13 requires much more virtual memory.
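One way to observe the difference, assuming Linux (this sketch reads `/proc/self/status`, which does not exist on other platforms), is to compare the process's virtual size before and after importing numpy:

```python
# Sketch: measure virtual memory (VmSize) growth caused by "import numpy".
# Linux-only: relies on the /proc/self/status pseudo-file.

def vm_size_kb():
    """Return the current process's virtual memory size in kB."""
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmSize:"):
                return int(line.split()[1])  # value is reported in kB
    raise RuntimeError("VmSize not found in /proc/self/status")

before = vm_size_kb()
import numpy  # noqa: E402 -- imported after the first measurement on purpose
after = vm_size_kb()
print(f"VmSize grew by {after - before} kB on 'import numpy'")
```

Running the same measurement under each OpenBLAS version should show the difference in virtual memory the two versions reserve.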
OpenBLAS versions after 0.3.10 default to using a larger buffer for multithreaded GEMM to accommodate larger problem sizes. This can be adjusted at build time via the BUFFERSIZE parameter; recent numpy releases make use of this (setting BUFFERSIZE=20 to restore the pre-0.3.10 behaviour), which is probably why you were asked to upgrade to 1.20.x in the numpy ticket
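For reference, a sketch of building OpenBLAS with the smaller buffer; BUFFERSIZE is the OpenBLAS build parameter mentioned above, while the install prefix here is illustrative:

```shell
# Build OpenBLAS with the pre-0.3.10 GEMM buffer size.
# BUFFERSIZE is an exponent-style parameter that sizes the per-thread
# GEMM work buffer; releases after 0.3.10 default to a larger value.
make BUFFERSIZE=20
make BUFFERSIZE=20 PREFIX=/opt/openblas install  # install path is illustrative
```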
(i.e. basically a duplicate of #2970 and numpy/numpy#18141)
Downgrading to openblas-0.3.10 also fixes this issue.

Also mentioned here: numpy/numpy#18631