
[ERROR coverm::bam_generator] The STDERR for the minimap2 part #148

Open
WoCer2019 opened this issue Dec 27, 2022 · 8 comments

@WoCer2019

Hi, I used the nice tool CoverM to calculate the coverage of MAGs, but I get an error. Please give me some advice, thanks!

Code:
coverm genome --coupled Reads/BPA_1.fastq Reads/BPA_2.fastq Reads/DBP_1.fastq Reads/DBP_2.fastq Reads/DEP_1.fastq Reads/DEP_2.fastq Reads/DOP_1.fastq Reads/DOP_2.fastq Reads/GLU_1.fastq Reads/GLU_2.fastq Reads/NP_1.fastq Reads/NP_2.fastq Reads/S29.raw_1.fastq Reads/S29.raw_2.fastq Reads/S30.raw_1.fastq Reads/S30.raw_2.fastq Reads/S31.raw_1.fastq Reads/S31.raw_2.fastq Reads/S32.raw_1.fastq Reads/S32.raw_2.fastq Reads/S33.raw_1.fastq Reads/S33.raw_2.fastq Reads/S34.raw_1.fastq Reads/S34.raw_2.fastq Reads/S35.raw_1.fastq Reads/S35.raw_2.fastq Reads/Seed_1.fastq Reads/Seed_2.fastq Reads/TC_1.fastq Reads/TC_2.fastq -d dereplicated/output/dereplicated_genomes -x fa -m rpkm -o MAG.abundance.csv -t 80

log file:
nohup.out.txt

@wwood
Owner

wwood commented Dec 27, 2022 via email

@WoCer2019
Author

Thanks for your reply! I used the same read files to calculate contig coverage, and that succeeded. So if the read files had a problem, the contig coverage results should not have been produced either. I find this very strange: why can contig coverage be calculated but not MAG coverage?

Code (calculate contig):
coverm contig --coupled Reads/BPA_1.fastq Reads/BPA_2.fastq Reads/DBP_1.fastq Reads/DBP_2.fastq Reads/DEP_1.fastq Reads/DEP_2.fastq Reads/DOP_1.fastq Reads/DOP_2.fastq Reads/GLU_1.fastq Reads/GLU_2.fastq Reads/NP_1.fastq Reads/NP_2.fastq Reads/S29.raw_1.fastq Reads/S29.raw_2.fastq Reads/S30.raw_1.fastq Reads/S30.raw_2.fastq Reads/S31.raw_1.fastq Reads/S31.raw_2.fastq Reads/S32.raw_1.fastq Reads/S32.raw_2.fastq Reads/S33.raw_1.fastq Reads/S33.raw_2.fastq Reads/S34.raw_1.fastq Reads/S34.raw_2.fastq Reads/S35.raw_1.fastq Reads/S35.raw_2.fastq Reads/Seed_1.fastq Reads/Seed_2.fastq Reads/TC_1.fastq Reads/TC_2.fastq -r all.contig.fa --min-read-percent-identity 95 --min-read-aligned-percent 75 -m rpkm -o contig.abundance.csv -t 80

@wwood
Owner

wwood commented Dec 27, 2022 via email

@WoCer2019
Author

OK, I will check the number of reads. Meanwhile, I look forward to your follow-up reply. Thanks!

@nvpatin

nvpatin commented Apr 28, 2023

Hi, was this issue ever resolved? I'm getting the same error when trying to map interleaved reads to a contigs FASTA. According to the log file, the BAM files are successfully generated and cached in the provided directory, but the mapping calculations then fail with the error below. Also, the BAM cache directory contains only two BAM files, out of the >60 expected.

[2023-04-28T20:38:44Z ERROR coverm::bam_generator] Failed to correctly find or parse BAM file at "/tmp/coverm_fifo.pWJk6n4hDNHa/foo.pipe": unable to open SAM/BAM/CRAM file at /tmp/coverm_fifo.pWJk6n4hDNHa/foo.pipe

And here is my full command:

coverm genome \
  --reference eCruises_2018-2021_MAGs_HQ_dRep_concat.fa \
  -s "_" \
  -m relative_abundance \
  --interleaved *fq.gz \
  --min-read-aligned-percent 0.75 \
  --min-read-percent-identity 0.95 \
  --min-covered-fraction 0 \
  -x fa -t 10 -o Flyer2018_MAG_relabund.tsv \
  --bam-file-cache-directory Flyer2018_MAG_bams &> log-relabund.txt

@wwood
Owner

wwood commented Apr 28, 2023

Hi @nvpatin, that seems like a completely different issue. Maybe you ran out of RAM?

Can you please check? If that isn't it, please post another issue with more details, i.e. the whole output.

For this issue, we need to know the read numbers in each file; it looks like corrupted input read files.
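One quick way to gather the read counts wwood asks for (a sketch, not part of CoverM): a FASTQ record is four lines, so a file's read count is its line count divided by four, and paired `_1`/`_2` files should report the same number. The demo file below is hypothetical throwaway data.

```shell
# Print the number of FASTQ records (4 lines each) in a file.
count_reads() {
  echo $(( $(wc -l < "$1") / 4 ))
}

# Demo with a throwaway two-record FASTQ (hypothetical data):
printf '@r1\nACGT\n+\nIIII\n@r2\nACGT\n+\nIIII\n' > /tmp/demo_1.fastq
count_reads /tmp/demo_1.fastq   # prints 2
```

Running `count_reads` over every `Reads/*_1.fastq` and `Reads/*_2.fastq` pair and comparing the two numbers would quickly reveal a truncated or corrupted file.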

@nvpatin

nvpatin commented Apr 30, 2023

Thanks for the quick response. I think it's because I ran out of temporary storage space during the mapping process. I removed the --bam-file-cache-directory parameter and it worked fine, but the bam files were not saved so I guess I'll have to generate them anew if I want to run coverage calculations.

If I'm running this job in a loop over 50+ metagenomes, is there any way to specify that files in /tmp are deleted after each alignment and/or mapping job? I'm happy to open a new issue if you prefer, but maybe it's an easy (or impossible) fix.
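One way to get the per-job cleanup asked about above, without any change to CoverM, is to give each iteration its own temporary directory via the TMPDIR environment variable and remove it when that sample finishes. This is a sketch; the glob and the commented-out coverm call are placeholders for the actual invocation.

```shell
# Per-sample loop: fresh TMPDIR for each run, removed afterwards.
for fq in *_interleaved.fq.gz; do
  export TMPDIR="$(mktemp -d)"      # private scratch dir for this sample
  # coverm genome --interleaved "$fq" ... (same flags as above)
  rm -rf "$TMPDIR"                  # reclaim the space before the next sample
done
```

This bounds temporary-space usage to one sample's worth of intermediate files at a time.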

@wwood
Owner

wwood commented Apr 30, 2023

Hi,

I can't remember actually, what files are kept that you would like to see deleted?

Maybe the shortest path to a solution for you is to specify TMPDIR - see the FAQ.
ben
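The TMPDIR suggestion above amounts to something like the following sketch; the path is hypothetical and should point at a volume with plenty of free space.

```shell
# Point TMPDIR at a roomier volume before running coverm (path is hypothetical).
export TMPDIR="$HOME/coverm_scratch"
mkdir -p "$TMPDIR"
# coverm genome ... (temporary files now land under "$TMPDIR" instead of /tmp)
```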
