Memory Issue on [INFO] Merge chunked contigs vcf files #45
Also: is it possible to resume this step?
Hi, it should be an out-of-memory issue; here are some suggestions for your reference:
We will also improve the memory usage of this step. Hope it helps!
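One mitigation that comes up later in this thread is lowering the thread count, which reduces how many chunks are processed (and held in memory) at once. A minimal sketch, assuming a typical `run_clair3.sh` invocation; the paths, model directory, and output name below are placeholders:

```shell
# Use half the available cores to lower peak memory (floor of 1).
THREADS=$(( $(nproc) / 2 ))
[ "$THREADS" -lt 1 ] && THREADS=1
echo "Using ${THREADS} threads"

# Placeholder invocation -- adjust paths/model to your setup:
# run_clair3.sh \
#   --bam_fn=input.bam --ref_fn=ref.fa \
#   --platform=ont --model_path="${MODEL_DIR}" \
#   --threads="${THREADS}" --output=clair3_out
```

Halving the threads roughly halves the number of concurrent chunk workers, trading wall-clock time for a smaller memory peak.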
Thank you for the response! Will try these suggestions.
It seems that your link is not visible; you might find some base-caller version details in the BAM header. Actually, you could base-call the data again; we suggest you use our ONT default Guppy3-4 model (in your model path). Hope it helps!
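The base-caller and its version are typically recorded in the `@PG` lines of the BAM header. A small sketch using samtools; the helper name and `input.bam` path are placeholders:

```shell
# Print @PG header lines, which usually name the basecaller
# (e.g. guppy) and its version in the VN: tag. Requires samtools.
show_basecaller() {
  samtools view -H "$1" | grep '^@PG'
}
# Usage: show_basecaller input.bam
```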
Hey guys! Again, the suggestion to reduce threads is what we opted for, at least for the time being, and we had some success with it. However, I'm now seeing an issue in another run that gives the error: Too many open files. Any thoughts on this?
In your running environment, please run ulimit -n to check the open file limit. Your error was triggered by an open file limit that is not high enough.
Awesome, thanks for the suggestion! For anyone who may encounter this issue: I found this thread the most helpful for debugging why my root user's limits were increased but my normal user's were not: https://superuser.com/questions/1200539/cannot-increase-open-file-limit-past-4096-ubuntu
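For reference, the open-file limit can be inspected and raised per shell with the `ulimit` builtin before launching Clair3; making the change persistent (and the root-vs-normal-user distinction mentioned above) is covered by the linked thread. A sketch:

```shell
# Soft limit: the effective cap on open files for this process.
# Hard limit: the ceiling a non-root user may raise the soft limit to.
SOFT=$(ulimit -Sn)
HARD=$(ulimit -Hn)
echo "soft=${SOFT} hard=${HARD}"

# Raise the soft limit for this shell before running Clair3, e.g.:
# ulimit -n 65536   # must not exceed the hard limit
```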
Hi, the new release (v0.1-r6) reduced the memory footprint in merging VCFs, among other improvements.
Hi guys,
I'm running Clair3 and have encountered this issue twice now. I'm not sure what's going on.
I'm running on a virtual machine with 64 GB of memory, on a disk with ~85 GB free.
The BAM I am processing is around 90 GB.
Here is the run log:
run_clair3.log
Any info appreciated,
Thanks!