
Segmentation fault with openblas >0.3.10 and numpy #3180

Closed
Vringe opened this issue Apr 13, 2021 · 2 comments

Comments

@Vringe

Vringe commented Apr 13, 2021

I'm using numpy-1.16.6 with openblas-0.3.13, which causes a segmentation fault on import.
I'm not sure whether this is numpy- or openblas-related, but what I found is that openblas-0.3.13 requires much more virtual memory.

Here is how I tested it:

$ ulimit -v
2097152
$ python -c "import numpy as np"
Segmentation fault
$ ulimit -v 3670016
$ python -c "import numpy as np"
$

Downgrading to openblas-0.3.10 also fixes this issue.

Also mentioned here: numpy/numpy#18631

@martin-frbg
Collaborator

martin-frbg commented Apr 13, 2021

OpenBLAS versions after 0.3.10 default to using a larger buffer for multithreaded GEMM in order to accommodate larger problem sizes. This can be adjusted at build time via the BUFFERSIZE parameter. Recent numpy releases make use of this (setting BUFFERSIZE=20 to restore the pre-0.3.10 behaviour), which is probably why you were asked to upgrade to 1.20.x in the numpy ticket
(i.e. basically a duplicate of #2970 and numpy/numpy#18141)
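For anyone hitting this who builds OpenBLAS from source, the smaller pre-0.3.11 GEMM buffer mentioned above can be restored via the BUFFERSIZE make parameter. A minimal sketch (the install prefix is illustrative; adjust to your setup):

```shell
# Rebuild OpenBLAS with BUFFERSIZE=20, which restores the
# pre-0.3.11 GEMM buffer size and the lower virtual-memory footprint.
make clean
make BUFFERSIZE=20

# Install to a hypothetical prefix; point your numpy build at it afterwards.
make BUFFERSIZE=20 PREFIX=/opt/openblas install
```

Alternatively, upgrading to numpy 1.20.x pulls in wheels already built this way, so no manual rebuild is needed.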

@Vringe
Author

Vringe commented Apr 13, 2021

Thank you. I can confirm that setting BUFFERSIZE to 20 fixes the issue.

@Vringe Vringe closed this as completed Apr 13, 2021