fftpack segfault with big arrays (Trac #1714) #2233

Closed
scipy-gitbot opened this issue Apr 25, 2013 · 6 comments · Fixed by #10507
Labels: defect · Migrated from Trac · scipy.fftpack

@scipy-gitbot

Original ticket http://projects.scipy.org/scipy/ticket/1714 on 2012-08-01 by trac user serbanul, assigned to @cournape.

Hello!

I use scipy 0.10.1 with numpy 1.6.2 on a Slackware 13.37 32-bit box (gcc 4.5.2, glibc 2.13, python 2.6.6) and experience segfaults when using big arrays with fftpack's fft2. This does not happen when the array is obviously too big to fit in memory (nominally 4 GB); in that case I receive a MemoryError exception, as expected. It happens when the size of the array goes beyond a certain limit, which for 64-bit floats is approximately 4.55e7 elements. Please find the relevant code snippet and the backtrace for such an array size in the attached files. At the end of the backtrace file there is also an example of the case in which a MemoryError exception is raised.

Best regards,

Serban Udrea
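
The attached fft_err.py is not reproduced here; the following is only a minimal sketch of the kind of call the report describes, assuming a 2-D float64 array whose total size is just above the ~4.55e7-element threshold mentioned above. The shape and the direct use of scipy.fftpack.fft2 are assumptions for illustration, not the original script:

```python
# Hypothetical reproduction sketch -- NOT the attached fft_err.py.
# Builds a float64 array slightly above the ~4.55e7-element threshold
# from the report and runs scipy.fftpack.fft2 on it.
import numpy as np
from scipy import fftpack

nx, ny = 46000, 1000          # ~4.6e7 elements, ~368 MB as float64
a = np.random.rand(nx, ny)    # float64 input
A = fftpack.fft2(a)           # the report says a segfault occurs around this size
print(A.shape, A.dtype)
```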

@scipy-gitbot (Author)

Attachment added by trac user serbanul on 2012-08-01: fft_err.py

@scipy-gitbot (Author)

Attachment added by trac user serbanul on 2012-08-01: fft_err_BT.txt

@scipy-gitbot (Author)

@rgommers wrote on 2012-08-10

4.55e7 elements is 2.9Gb for float64, so should raise a MemoryError too.

@scipy-gitbot (Author)

Milestone changed to Unscheduled by @rgommers on 2012-08-10

@scipy-gitbot (Author)

trac user serbanul wrote on 2012-08-10

Replying to [comment:1 rgommers]:

4.55e7 elements is 2.9Gb for float64, so should raise a MemoryError too.

4.55e7 is an approximation; I didn't check the limit very precisely. The fact is that with nx=22750 and ny=1000 there is neither a segmentation fault nor a MemoryError. With these values the final matrix has 45,502,000 elements (because of the two glue rows). The computation runs for a very long time; I did not have the patience to wait for it to finish and killed the process after about 30 minutes. During this time, top showed one of the machine's CPU cores running at 100% and the memory used by the Python process staying practically constant at about 64.4%, which is approx. 2.38 GB given that top reports approx. 3.69 GB of total memory.

Best regards,

Serban
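
For reference, the element count quoted above is consistent with an array of shape (2*nx + 2, ny), i.e. nx rows plus their mirror plus the two "glue" rows; the exact shape is an inference from the numbers given, not stated in the thread:

```python
# Assumed shape (2*nx + 2, ny), inferred from the figures in the comment.
nx, ny = 22750, 1000
n_elements = (2 * nx + 2) * ny      # 45_502_000 elements, matching the count above
n_bytes = n_elements * 8            # float64 -> 364_016_000 bytes (~347 MiB per copy)
print(n_elements, n_bytes)
```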

@mreineck (Contributor) commented on Jul 22, 2019

Most likely no one is interested in this any more, but I have a likely explanation for the problem: FFTPACK has an internal limit (at most 15) on the number of passes it can make over the data, so transforms that require more than 15 passes cause memory corruption, with all kinds of weird results (crashes, silently wrong results, ...).
The biggest length up to which fftpack transforms are guaranteed to work is (2**1)*(3**14) = 9565938. Beyond that value things may go wrong.
SciPy and NumPy currently have no checks for this.
Starting with numpy 1.17 and scipy 1.4 things will hopefully work better, since fftpack will be replaced by pocketfft, which does not have a hard limit on the number of passes.
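
As a rough illustration of the limit described above, the sketch below counts how many passes FFTPACK would make for a given transform length, assuming a greedy factorization over 4, 2, 3, 5 and then larger odd candidates. This is an assumed model meant only to illustrate the 15-pass idea, not a reproduction of SciPy's exact code path:

```python
# Approximate FFTPACK pass count for a transform of length n, assuming a
# greedy factorization order of 4, 2, 3, 5, then increasing odd candidates.
def fftpack_pass_count(n):
    factors = []
    for f in (4, 2, 3, 5):
        while n % f == 0:
            factors.append(f)
            n //= f
    cand = 7
    while n > 1:
        while n % cand == 0:
            factors.append(cand)
            n //= cand
        cand += 2
    return len(factors)

print(fftpack_pass_count(2 * 3**14))   # 15 -> still within the stated limit
print(fftpack_pass_count(2 * 3**15))   # 16 -> exceeds the 15-pass limit
```

Under this model, lengths whose factor count stays at or below 15 are safe, while lengths needing more passes are the ones at risk of the corruption described above.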
