"Too many open files" error #100
Comments
This is not directly related to Nextflow. That error is reported by the sambamba tool when it hits the operating system's limit on open files. The easiest solution is to increase that limit with ulimit.

I'm closing the issue. Feel free to comment if you need more explanation.
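For anyone hitting the same problem, a minimal sketch of checking and raising the per-process open-file limit in the shell that launches the job (the 4096 value is purely illustrative, not a figure from this thread):

$ ulimit -Sn         # current soft limit on open file descriptors
$ ulimit -Hn         # hard limit (the ceiling for the soft limit)
$ ulimit -n 4096     # raise the soft limit for this shell and its children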
Unfortunately, our ulimit is set to unlimited.
That is weird. Are you sure you are checking that on the cluster computing nodes?
I found the following link, which may be helpful to solve it: biod/sambamba#177
Yes, we check ulimit on every node of the cluster.
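(Note that the limit reported on a login node can differ from the one Grid Engine enforces inside a job. A minimal sketch of probing the limits a job actually gets, assuming a standard SGE qsub; the flags and file name here are illustrative:)

$ echo 'ulimit -Sn; ulimit -Hn' | qsub -cwd -j y -o limits.log
$ cat limits.log     # once the job finishes, shows the limits inside the job environment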
Yes, this confirms that it's a problem with sambamba (I mean the biod/sambamba#177 issue).
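If the limit cannot be raised on the compute nodes, the linked sambamba issue also discusses making markdup create fewer temporary files. A hedged sketch using sambamba markdup's documented --overflow-list-size option (the value, paths, and file names are illustrative only):

$ sambamba markdup --overflow-list-size=600000 --tmpdir=./markdup_tmp input.bam marked.bam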
When running Nextflow on a Grid Engine cluster, I always get a "too many open files" error partway through the run. Detailed log as follows:
$ cat .command.log
finding positions of the duplicate reads in the file...
sambamba-markdup: Cannot open or create file './sambamba-pid6283-ztgh/sorted.50.bam' : Too many open files
Is it possibly related to Nextflow?