Error Reading in Hash Table after building db #84
Comments
I also get a hash table problem with Kraken2 installed via Homebrew. Classification runs fine with custom databases I build (unlike for Stravrosnco), but with minikraken2_v1_8GB and minikraken2_v2_8GB I get this error. Classification with minikraken2_v2_8GB worked just fine with Kraken2 installed from source. I would also appreciate any suggestions. Thank you!
Hello, I have the same issue with kraken2-2.0.8-beta.
Have you fixed your issue? Thanks a lot!
@jfourquet2 just a note that I came across this issue while troubleshooting a kraken2 analysis, so maybe this will help. First, the --fastq-input argument appears to be no longer used in kraken2. Second, the kraken2 'classify' step reads the entire hash table into memory, so you need to give it enough memory to do so; otherwise it gets killed for running out of available memory and throws the 'Loading database information...classify: Error reading in hash table' error.
I am also experiencing this issue. Has there been any progress toward a resolution? I'm trying to load a ~64 GB hash file (hash.k2d) on a machine with >400 GB of memory, and Kraken2 reports the same error. Kraken2 version: 2.0.8-beta.
Yes - I was able to solve this problem by running Kraken2 on a server with more RAM. Thank you for the feedback! If I had used only 1 thread, might I have been able to run this locally (when no high-performance cluster is available)?
How many of these issues are on PBS, SGE, or SLURM?
Hi, I use version 2.1.1 and run it on a server with more RAM, and I still get the same error. Could anyone help solve it? Best,
@Danny-Science I have the same problem. Were you able to solve this? Thanks! |
I am using a 16 GB machine with the minikraken DB, which is 8 GB. Why am I getting this error if I have more than enough memory?
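For marginal cases like this (an 8 GB table on a 16 GB machine, where the OS and other processes eat into the headroom), one hedged option to try is kraken2's `--memory-mapping` flag, which maps the database from disk rather than reading it fully into RAM; classification is slower, but the process should not be killed for exceeding memory. A sketch with placeholder paths (confirm the flag with `kraken2 --help` on your version):

```shell
# Hedged sketch: classify without loading hash.k2d fully into RAM.
# --memory-mapping maps the file from disk instead (slower, but avoids
# the out-of-memory kill). Database path and input file are placeholders.
cmd="kraken2 --memory-mapping --db /path/to/minikraken_8GB --output kraken2.out sample.fastq"
echo "$cmd"
```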
Kraken fails inconsistently with this error: `Loading database information...classify: Error reading in hash table`, which seems to be memory-related: DerrickWood/kraken2#84. I do not really have a good estimate of how much memory is required. Let's start with 10 GB, check with @ENGYS data, and increase if needed.
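When classify runs under a scheduler such as SLURM, the job's memory request has to cover the hash table, or the process is killed and surfaces this same error. A hedged sketch of a submission script (paths are placeholders, and the 10 GB figure is illustrative; it must exceed the size of hash.k2d plus some headroom):

```shell
# Hedged sketch: write a SLURM batch script requesting enough memory
# for the hash table. Submit with: sbatch classify.sbatch
cat > classify.sbatch <<'EOF'
#!/bin/bash
#SBATCH --job-name=kraken2-classify
#SBATCH --cpus-per-task=8
#SBATCH --mem=10G

kraken2 --db /path/to/minikraken2_v2_8GB \
        --threads "$SLURM_CPUS_PER_TASK" \
        --report kraken2.report --output kraken2.out \
        reads.fastq
EOF
```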
Hello,
I have been trying to follow the steps to generate a kraken2 db of all bacteria. I've run the commands
```
kraken2-build --threads 8 --download-taxonomy --db all_bac
kraken2-build --download-library "bacteria" --db all_bac/ --threads 8
kraken2-build --build --db all_bac/ --threads 8 --max-db-size 8589934592
```
This resulted in:
```
Creating sequence ID to taxonomy ID map (step 1)...
Sequence ID to taxonomy ID map already present, skipping map creation.
Estimating required capacity (step 2)...
Estimated hash table requirement: 31314263768 bytes
Specifying lower maximum hash table size of 8589934592 bytes
Capacity estimation complete. [9m42.202s]
Building database files (step 3)...
Taxonomy parsed and converted.
CHT created with 14 bits reserved for taxid.
Completed processing of 29226 sequences, 55998669238 bp
Writing data to disk... complete.
Database files completed. [13m54.035s]
Database construction complete. [Total: 23m36.338s]
```
However when I try to classify using:
```
kraken2 --db all_bac/ /mnt/c/Data/minION/e_coli/analysis/MS1/FAH73738_dc131eb88b18432f748a865caf3119fecb4e17d8_ms1_0.fastq
```
Or even run:
```
kraken2-inspect --db all_bac/
```
I get the error message:
```
dump_table: Error reading in hash table
```
This is using Kraken version 2.0.7-beta.
Any suggestions on what to do to resolve this? I also experienced the same error using the prebuilt MiniKraken V1 and V2 databases.
Thank you very much!