
MemoryError while processing GUPPI .fits files #37

Open
nategarver-daniels opened this issue Jul 11, 2016 · 3 comments

Comments

@nategarver-daniels

I'm probably doing something wrong, but I've been trying to run burst_guppi on some 820 MHz GUPPI search data that I have. The file size is ~11 GB, and I inevitably get a MemoryError after the code runs for 20 minutes or so. Program output is:

INFO:__main__:Rank 0, file guppi_57568_AzEl_180.021_070.000_0001_0036.fits.
INFO:burst_search.dedisperse:Using 524288 effective channels with a memory footprint of 25600.0 MB/s
INFO:__main__:Doing actions: save_plot_dm,print
INFO:burst_search.manager:Processing block 1.
INFO:burst_search.manager:Dispersion index 2.0, spectral index 0.0.
Traceback (most recent call last):
  File "/opt/pulsar/src/burst_search/scripts/burst_guppi", line 153, in <module>
    Searcher.process_all()
  File "build/bdist.linux-x86_64/egg/burst_search/manager.py", line 342, in process_all
  File "build/bdist.linux-x86_64/egg/burst_search/manager.py", line 312, in process_next_block
  File "burst_search/dedisperse.pyx", line 301, in burst_search.dedisperse.DMTransform.__call__ (burst_search/dedisperse.c:5832)
MemoryError

I suspect there's something simple that I'm not doing.
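For what it's worth, the logged footprint looks self-consistent. A quick back-of-the-envelope check (my assumptions, not anything from the burst_search source) shows why 524288 channels at full time resolution is so expensive:

```python
# Sanity check on the logged figure:
# "Using 524288 effective channels with a memory footprint of 25600.0 MB/s"
# Assumptions (mine, not from the burst_search source): samples are
# 4-byte floats, and "MB" in the log means 10**6 bytes.

n_chan = 524288                    # effective channels from the log
bytes_per_sample = 4               # assumed float32
footprint_bytes_per_s = 25600.0e6  # from the log

# Implied time resolution of the dedispersion buffer:
sample_rate_hz = footprint_bytes_per_s / (n_chan * bytes_per_sample)
print(round(sample_rate_hz))  # → 12207, i.e. roughly 82 microsecond sampling
```

At ~25.6 GB per second of data, even a modest search block blows past typical node memory, which is consistent with the MemoryError above.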

@kiyo-masui
Owner

Try using the 'scrunch' functionality. Right now it is up-sampling the frequency axis to 524288 channels to reach the maximum DM. With scrunching it will instead down-sample the time axis for the higher DMs (where the signal is time-smeared anyhow).
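A minimal sketch of the idea (a hypothetical helper, not burst_search's actual implementation): for the high-DM part of the search, average adjacent time samples before transforming, since dispersion smearing there exceeds the native time resolution anyway. Each scrunch by 2 halves the memory needed for that DM range.

```python
import numpy as np

def scrunch_time(data, factor):
    """Down-sample the time axis of a (nchan, ntime) array by averaging
    `factor` adjacent samples. Hypothetical illustration of the scrunch
    concept, not part of burst_search's API."""
    nchan, ntime = data.shape
    ntime -= ntime % factor  # drop trailing samples that don't fill a group
    return data[:, :ntime].reshape(nchan, ntime // factor, factor).mean(axis=-1)

# The scrunched block holds half as many time samples, so the high-DM
# transform works on half the data:
data = np.random.rand(1024, 4096).astype(np.float32)
scrunched = scrunch_time(data, 2)
print(scrunched.shape)  # → (1024, 2048)
```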

@kiyo-masui
Owner

Let me know if this works for you. To close this issue, I suppose I should put a note about memory requirements, and how to get around them, in the README. Your thoughts?

@nategarver-daniels
Author

Kiyo,

I think you can close the issue. Making a note about the memory requirements would be good, though. I tried the --scrunch option but ended up with a different error (possibly related to compiling the package). I also tried limiting the DM range to just a few DMs but was still getting memory errors. I'm trying to sort out whether it's a system issue or whether I really need to spread things out over many nodes with MPI (to reduce the memory pressure on each individual node).

-Nate

