[joss-review] memory requirement of example run #22

Closed
zonca opened this issue Apr 24, 2024 · 3 comments

zonca commented Apr 24, 2024

The first step of the analysis requires about 20 GB of RAM; it would be useful to specify this in the docs. I ran with 15 GB and the process was killed (probably due to running out of memory).

I'll also estimate memory requirement for the second step and add it here.

jburba (Collaborator) commented May 7, 2024

I've updated the "Test Dataset" docs under usage.rst with RAM requirements (b9af6b5). When I ran the test dataset on a slurm system, the matrix construction step required ~17 GB of RAM according to the MaxRSS column in the slurm database, and the power spectrum analysis required ~12 GB. The ~17 GB requirement for matrix construction is probably why your job was killed with 15 GB of RAM but succeeded with 20 GB.
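For anyone reproducing this measurement: MaxRSS can be queried with `sacct -j <jobid> --format=JobID,MaxRSS`, which reports values with a unit suffix. A minimal sketch for converting those strings to GiB (the suffix handling below reflects my understanding that slurm's K/M/G/T suffixes denote binary units; treat it as an assumption, not part of the BayesEoR docs):

```python
def maxrss_to_gib(maxrss):
    """Convert a slurm MaxRSS string (e.g. '17000000K' or '16G') to GiB.

    Assumes slurm's K/M/G/T suffixes denote binary units (KiB, MiB, ...).
    """
    units = {"K": 1024, "M": 1024**2, "G": 1024**3, "T": 1024**4}
    s = maxrss.strip().upper()
    if s and s[-1] in units:
        n_bytes = float(s[:-1]) * units[s[-1]]
    else:
        n_bytes = float(s)  # assume a bare value is already in bytes
    return n_bytes / 1024**3
```

With this, a MaxRSS of `17000000K` works out to roughly 16.2 GiB, consistent with the ~17 GB figure above to within unit-convention slop.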

zonca (Author) commented May 7, 2024

It seems the docs website is not updating automatically:
https://bayeseor.readthedocs.io/en/latest/usage.html#running-the-power-spectrum-analysis

Since you've also estimated the RAM requirement for the power spectrum analysis, I think it would be a good idea to add that number in as well.
Do you also know how much GPU memory is needed?

jburba (Collaborator) commented May 8, 2024

Ah, yes. I had to manually rebuild the docs (I haven't automated the docs building yet). The documentation should be updated now.

Regarding GPU memory, I believe the routine we are using in MAGMA is capable of partial I/O, so it should be able to run on a wide range of hardware. If necessary, I can re-run the test data analysis and use e.g. nvidia-smi to get some information on GPU memory usage.
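One way to capture that number during a run: `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits` prints one integer (MiB of used memory) per GPU per line. A small helper to parse that output (the `sample` parameter is a convenience I've added so the parsing can be exercised without a GPU; it is not part of nvidia-smi or BayesEoR):

```python
import subprocess

def gpu_mem_used_mib(sample=None):
    """Return used GPU memory in MiB, one entry per GPU.

    Parses the output of
      nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits
    which is one bare integer (MiB) per line. Pass captured output via
    `sample` to parse it directly instead of invoking nvidia-smi.
    """
    if sample is None:
        sample = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
    return [int(line) for line in sample.split() if line.strip()]
```

Polling this in a loop (or via `nvidia-smi --loop=1`) alongside the analysis would give a rough peak-usage figure to quote in the docs.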

zonca closed this as completed May 8, 2024