
Memory issue #117

Open
metanav opened this issue Jun 22, 2018 · 1 comment

metanav commented Jun 22, 2018

Command line: /usr/local/bin/spades.py -1 /app/data/R1.fastq -2 /app/data/R2.fastq -o /app/data/output/spades_assembly/read_correction --only-error-correction

System information:
SPAdes version: 3.12.0
Python version: 2.7.15
OS: Linux-3.10.0-514.10.2.el7.x86_64-x86_64-with-Ubuntu-18.04-bionic

I am getting the following error:
0:17:11.642 3G / 18G ERROR K-mer Counting (kmer_data.cpp : 353) The reads contain too many k-mers to fit into available memory. You need approx. 303.601GB of free RAM to assemble your dataset

I tried with the default number of threads, "-t 16", and "-t 32", but got the same error.
The params.txt shows "Memory limit (in Gb): 250".

My Linux box has 1 TB of memory. How can I pass the needed memory parameter to Unicycler, or set it somewhere?
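
For reference, the "250" in params.txt is SPAdes' own default RAM limit in GB; when spades.py is run directly, as in the command above, that limit can be raised with SPAdes' -m/--memory option. A minimal sketch, with 1000 GB chosen purely as an example value:

spades.py -1 /app/data/R1.fastq -2 /app/data/R2.fastq -o /app/data/output/spades_assembly/read_correction --only-error-correction -m 1000

The open question is how to get Unicycler to pass such an option through when it launches SPAdes.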


rrwick commented Jul 2, 2018

You'd have to modify Unicycler's SPAdes call in its source code. These are the relevant lines. It would need --memory 1000 or something like that added on. Alternatively, you could run error correction separately and then run Unicycler with --no_correct.
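
A rough sketch of that second option (the output paths and corrected-read filenames below are placeholders; SPAdes writes its corrected reads into a corrected/ subdirectory of the output folder, and the exact gzipped filenames will differ):

# 1) Run SPAdes error correction on its own, with a higher RAM limit (-m is in GB)
spades.py -1 R1.fastq -2 R2.fastq -o read_correction --only-error-correction -m 1000

# 2) Assemble the corrected reads with Unicycler, skipping its own correction step
unicycler -1 read_correction/corrected/R1.corrected.fastq.gz -2 read_correction/corrected/R2.corrected.fastq.gz -o unicycler_out --no_correct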

But why does it need so much memory in the first place?! How big is your read set? How big is the genome you're assembling? Don't forget that Unicycler is really just for bacterial isolates, so if you're trying to assemble a large genome, you'll probably run into lots of other problems besides SPAdes' memory usage.

Ryan
