nodup bam #59
Remove any FASTQ definition (…)

I'm getting an error:

```
[error] WorkflowManagerActor Workflow a7b66b90-77f3-47b9-afa6-d874bd8e21ab failed (during ExecutingWorkflowState): Job atac.spr:2:1 exited with return code 1 which has not been declared as a valid return code. See 'continueOnReturnCode' runtime attribute for more details. {
}
```
Did you download the genome database for the test samples?

Sorry, that was the wrong output. This is my current one:
We will look into this problem. Please do not …
Thanks. I also realized I sent the old run.sh. This is the current one:

```
#!/bin/bash
export _JAVA_OPTIONS="-Xms256M -Xmx16384M -XX:ParallelGCThreads=1"
source activate encode-atac-seq-pipeline
```
Please attach a tar ball of all logs for debugging. When you created this issue, you should have seen instructions for making a tar ball. Is there a particular reason that the two memory settings differ (…)?

No good reason. I'll remove -Xmx16384M in the future.
Your TAG-ALIGN looks trivially small. Is something wrong with your BAM inputs?
I don't think the bam file is the issue. It was generated by your previous pipeline and gave a reasonably sized tag-align:

```
-rw-r--r-- 1 kirsty.jamieson shen 7026944483 Sep 22 12:51 KJ064_R1_TRIM.trim.bam
```
Can you upload an input JSON for that previous pipeline run (the one that starts from FASTQs)?

I used the json you provided. Yes, it's paired-end.

Could you send me a json file that you used for a successful run starting from a bam?
@kirstyjamieson: Here is an example input JSON file which starts from …

I tested pipelines starting from FASTQs and NODUP_BAMs. Both worked fine and produced the same BED outputs (…).
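For readers following along without the attachment, a minimal sketch of an input JSON that starts from filtered (nodup) BAMs might look like the following. Key names here (`atac.nodup_bams`, `atac.genome_tsv`, `atac.pipeline_type`) are assumptions based on the pipeline's input specification and can differ across pipeline versions, and all file paths are placeholders — copy the template shipped with your installed version rather than this sketch.

```json
{
    "atac.title" : "Example starting from nodup BAMs",
    "atac.pipeline_type" : "atac",
    "atac.genome_tsv" : "/path/to/genome/hg38.tsv",
    "atac.paired_end" : true,

    "atac.nodup_bams" : [
        "/path/to/rep1.nodup.bam",
        "/path/to/rep2.nodup.bam"
    ]
}
```

The key point is that the FASTQ keys are absent and the replicate BAMs are listed instead, so the pipeline skips alignment and filtering and resumes from the BAM-to-TAG-ALIGN conversion step.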
I took the nodup bams generated from the successful run when I started at your subsampled fastq, and that worked. I get mixed results when I start with my own tag-align files. For one pair it succeeded; for another pair it failed at the very end with: "SingleWorkflowRunnerActor received Failure message: connection exception: closed java.sql.SQLNonTransientConnectionException: connection exception: closed." I did manage to get idr peaks before it failed. I also tried 3 replicates with tag-aligns and got a different failure, much earlier on: "WorkflowManagerActor Workflow 72c2bf95-1c83-49e4-9e6f-e48151ac76bb failed (during ExecutingWorkflowState): cromwell.backend.standard.StandardAsyncExecutionActor$$anon$2: Failed to evaluate job outputs". So the pipeline seems to work well with subsampled files but fails with my much larger files.
Let's stick to a single failure case (starting from your own large nodup_bam files). You got a good-sized NODUP_BAM and TAG-ALIGN when you started the pipeline from FASTQs, but you failed to get the same TAG-ALIGN when you started from the NODUP_BAM. Please upload your input JSON files for those two cases.
Those files were generated with the previous pipeline.

I have processed >100 large samples from ENCODE and they all worked fine. Please increase the memory in your job submission shell script first. Also remove all resource settings from your input JSON file; those resources are too small for your large samples. Next time please use a template (…).
I tried increasing memory and used the default parameters from the template. I am still getting mixed results.
@kirstyjamieson: Please pick one failed sample and upload an error log, tar ball, and input JSON for it.
run_induced_nodupbam.sh.txt
induced_nodupbam.json
The pipeline failed to convert your BAMs into BEDs. Let's do some line-by-line debugging. Can you also share your BAM files (…)?
Yeah, I can't get to the beds. I shared the files on Google Drive.

Output (…)
I replicated this error. It turns out that your BAM file does not have any paired-ended reads. Please try with single-ended settings:

```
{
    ...
    "atac.paired_end" : false,
    ...
}
```
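A quick way to confirm this kind of problem before re-running the whole pipeline: bit 0x1 of the SAM FLAG field (column 2) marks a read as part of a paired-end template, so you can count paired reads in the text output of `samtools view your.bam`. A minimal sketch follows — the two inline records are made-up stand-ins for `samtools view` output, and a BAM with the symptom above would report 0 paired reads:

```shell
# Pipe SAM records (e.g. from `samtools view your.bam`) into awk and
# count reads whose FLAG (field 2) has bit 0x1 set, i.e. paired reads.
# FLAG % 2 == 1 is equivalent to testing FLAG & 0x1 and works in any awk.
printf 'r1\t99\tchr1\t100\t60\t4M\t=\t300\t250\tACGT\tIIII\nr2\t0\tchr1\t500\t60\t4M\t*\t0\t0\tACGT\tIIII\n' |
  awk '{ if ($2 % 2 == 1) paired++; total++ }
       END { printf "%d/%d reads paired\n", paired + 0, total }'
# prints: 1/2 reads paired
```

`samtools flagstat your.bam` reports the same information (the "paired in sequencing" line) without any scripting, if samtools is on your PATH.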
Yes, you are right! I had a typo in my script from the original pipeline where I generated the bam, so it only recognized the first read of the pair. Thank you for your help!
Hi Jin,
Could you provide an example .json file for starting with nodup bam instead of fastq?
Thanks,
Kirsty