Huge Memory consumption #3
Try `--filter_sTF 1 --filter_sStart 3 --filter_sEnd 4 --suffix_label 0`; `--genus` and `--corr` should not affect memory. If there is still a memory problem, it is probably the memory needed to create the huge images: the input should not be affecting memory, but image creation is.
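Put together, the suggested low-memory invocation might look like the sketch below. This is an assumption for illustration only: `script.py` is a placeholder, since the actual entry point is not named in the thread; the flag values are the ones quoted above.

```shell
# Hypothetical entry point; flag values taken from the comment above.
# --genus and --corr are omitted because they should not affect memory.
python script.py \
    --filter_sTF 1 \
    --filter_sStart 3 \
    --filter_sEnd 4 \
    --suffix_label 0
```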
Thanks, I am trying the options now and will keep you up to date.
How much memory did you have allocated in Slurm?
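For reference, Slurm memory is requested per job with `#SBATCH --mem` (or per core with `--mem-per-cpu`). A hedged sketch of a batch-script header, with placeholder values rather than a recommendation for this tool:

```shell
#!/bin/bash
#SBATCH --job-name=coverage-images   # placeholder job name
#SBATCH --mem=64G                    # example request; adjust to the observed peak usage
#SBATCH --cpus-per-task=1
```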
@Colorstorm I haven't tried the script out, but I saw it use up to 480 GB of RAM on one server this morning.
Konstantinos' working command, run on Ubuntu (Ubuntu on Windows 10!) on an 8 GB desktop with the toy dataset from GitHub:
The huge RAM use was due to the coverage window, and not the bam.txt files, being used as input.
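This diagnosis suggests the general remedy of streaming the coverage track in fixed-size windows instead of materializing it all at once. A minimal illustrative sketch (hypothetical code, not the project's actual implementation):

```python
from itertools import islice

def coverage_windows(values, size):
    """Yield successive fixed-size windows from an iterable of per-base
    coverage values, so only one window is held in memory at a time."""
    it = iter(values)
    while True:
        window = list(islice(it, size))
        if not window:
            return
        yield window

# Usage: stream a (potentially huge) coverage track in windows of 4 values
# and summarize each window, keeping peak memory bounded by the window size.
means = [sum(w) / len(w) for w in coverage_windows(range(12), 4)]
print(means)  # → [1.5, 5.5, 9.5]
```

Because `coverage_windows` is a generator, the full track is never loaded; `range(12)` stands in for a real per-base coverage source.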
Hi,
I have tried to run some tests and had a lot of trouble with the huge amount of memory that is needed.
@konnosif which settings do you use?
Maybe we could make the settings with the lowest memory usage the default and add a note that other settings take a lot of memory.