Filter.seqs / Screen.seqs - sequences are not the same length error #160

Closed
mothur-westcott opened this issue Sep 10, 2015 · 8 comments

Comments

@mothur-westcott (Contributor) commented Sep 10, 2015

Rare occurrence. Need to take a closer look.

@mothur-westcott added this to the Version 1.37.0 milestone Sep 10, 2015
@mothur-westcott (Contributor, Author) commented Nov 16, 2015

Two reports involve use of a summary file; not sure if this is the only way the bug occurs. The number of processors has no effect.

screen.seqs(fasta=seqs.trim.contigs.good.unique.align, count=seqs.trim.contigs.good.count_table, summary=seqs.trim.contigs.good.unique.summary, optimize=end-start-maxhomop, criteria=95, minlength=400)

@mothur-westcott (Contributor, Author) commented Nov 16, 2015

When screen.seqs is given a summary file, it overwrites that file with the new screened summary. Could this be causing issues on a rerun? Either way, we should change it.
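
The non-clobbering direction could be as simple as deriving a distinct output name instead of reusing the input path. A minimal C++ sketch of that idea (a hypothetical helper, not mothur's actual code; the ".good" infix mirrors mothur's usual output naming):

```cpp
#include <iostream>
#include <string>

// Hypothetical helper: turn "seqs.unique.summary" into
// "seqs.unique.good.summary" so the screened summary never
// overwrites the summary file the user passed in.
std::string screenedSummaryName(const std::string& summaryFile) {
    const std::string ext = ".summary";
    if (summaryFile.size() >= ext.size() &&
        summaryFile.compare(summaryFile.size() - ext.size(), ext.size(), ext) == 0) {
        return summaryFile.substr(0, summaryFile.size() - ext.size()) + ".good" + ext;
    }
    return summaryFile + ".good";  // fallback for unexpected extensions
}

int main() {
    std::cout << screenedSummaryName("seqs.trim.contigs.good.unique.summary") << "\n";
    // prints: seqs.trim.contigs.good.unique.good.summary
}
```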

@mothur-westcott (Contributor, Author) commented Nov 17, 2015

One user reported low hard drive space as the cause of the issue.

@mothur-westcott (Contributor, Author) commented Nov 30, 2015

A RAM issue caused the problem for another user: the process failed without raising an error and appended unprocessed reads. She was a Windows user. Investigate adding the GetLastError() function: https://msdn.microsoft.com/de-de/library/windows/desktop/ms679360(v=vs.85).aspx
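
A minimal sketch of that direction, assuming Win32 file I/O (hypothetical, not mothur's actual write path): surface GetLastError() when a write fails or comes back short, instead of continuing silently.

```cpp
// Windows-only sketch: report the Win32 error behind a failed or short
// write instead of silently appending unprocessed reads.
#include <windows.h>
#include <iostream>
#include <string>

// Translate GetLastError() into readable text via FormatMessageA.
std::string lastWindowsError() {
    DWORD code = GetLastError();
    LPSTR buffer = nullptr;
    DWORD size = FormatMessageA(
        FORMAT_MESSAGE_ALLOCATE_BUFFER | FORMAT_MESSAGE_FROM_SYSTEM |
        FORMAT_MESSAGE_IGNORE_INSERTS,
        nullptr, code, 0, reinterpret_cast<LPSTR>(&buffer), 0, nullptr);
    std::string message = size ? std::string(buffer, size) : "unknown error";
    LocalFree(buffer);
    return "error " + std::to_string(code) + ": " + message;
}

// Fail loudly if WriteFile reports failure or writes fewer bytes than
// requested, e.g. ERROR_DISK_FULL or ERROR_NOT_ENOUGH_MEMORY.
bool checkedWrite(HANDLE file, const std::string& data) {
    DWORD written = 0;
    if (!WriteFile(file, data.data(), static_cast<DWORD>(data.size()),
                   &written, nullptr) || written != data.size()) {
        std::cerr << "[ERROR]: write failed, " << lastWindowsError() << "\n";
        return false;
    }
    return true;
}

int main() {
    HANDLE file = CreateFileA("screened.fasta.tmp", GENERIC_WRITE, 0, nullptr,
                              CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (file == INVALID_HANDLE_VALUE) {
        std::cerr << "[ERROR]: open failed, " << lastWindowsError() << "\n";
        return 1;
    }
    bool ok = checkedWrite(file, ">seq1\nACGT\n");
    CloseHandle(file);
    return ok ? 0 : 1;
}
```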

@mothur-westcott (Contributor, Author) commented Feb 29, 2016

Completed with 24d445d

@SophieS9 commented Apr 25, 2017

Hi! I'm working with a researcher who is experiencing this error with mothur (version 1.39.0) when running the following:

filter.seqs(fasta=Samples.makecontigsfile.trim.contigs.good.unique.good.align, vertical=T, trump=.)

They've tried with both 80 GB and 160 GB of memory, and after some time it crashes with the following error:

[ERROR]: Sequences are not all the same length, please correct.

I can't see where it's going wrong. I've attached the four log files produced so far; unfortunately, the last command doesn't appear in the log.

Any ideas welcome!!

mothur.1492684290.logfile.txt
mothur.1492690230.logfile.txt
mothur.1492690374.logfile.txt
mothur.1493110766.logfile.txt

@mothur-westcott (Contributor, Author) commented Apr 25, 2017

Have you tried this with our current version, 1.39.5? If you still have an error, can you try running the filter.seqs command in debug mode?

mothur > set.dir(debug=t)
mothur > filter.seqs(...)
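
If that doesn't narrow it down, a standalone length tally can pinpoint the ragged records directly. A minimal C++ sketch (not mothur's code; assumes a standard, possibly line-wrapped FASTA):

```cpp
// Tally the distinct sequence lengths in a FASTA alignment.
#include <fstream>
#include <iostream>
#include <map>
#include <string>

int main(int argc, char* argv[]) {
    if (argc != 2) { std::cerr << "usage: lengthcheck <fasta>\n"; return 1; }
    std::ifstream in(argv[1]);
    if (!in) { std::cerr << "cannot open " << argv[1] << "\n"; return 1; }

    std::map<std::size_t, std::size_t> counts;  // length -> #sequences
    std::size_t len = 0;
    bool inSeq = false;
    std::string line;
    while (std::getline(in, line)) {
        if (!line.empty() && line.back() == '\r') line.pop_back();  // CRLF
        if (!line.empty() && line[0] == '>') {  // new record: bank prior length
            if (inSeq) ++counts[len];
            len = 0;
            inSeq = true;
        } else {
            len += line.size();                 // sequences may wrap lines
        }
    }
    if (inSeq) ++counts[len];

    for (const auto& c : counts)
        std::cout << c.second << " sequence(s) of length " << c.first << "\n";
    // More than one output line means the alignment is ragged -- the
    // condition behind "Sequences are not all the same length".
    return counts.size() <= 1 ? 0 : 2;
}
```

Compile with g++ -std=c++11 and run it on the .align file; more than one reported length is exactly the condition filter.seqs rejects.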

@SophieS9 commented Apr 26, 2017

Thanks for getting back to me. Unfortunately we're stuck with version 1.39.0 because of the cluster system we work on, but if upgrading is going to be the only fix, I can push the admins to upgrade!

I've attached the log file from running the command in debug mode. It's mostly blank lines, but there is output at the end of the file. The majority of the sequences showed a length of 50,000, but five are different. Although the filter.seqs command finished, the filtered fasta file wasn't produced.

mothur.1493196199.logfile.txt

UPDATE: We've managed to resolve this. It seems to have been caused by hard-trimming the primer sequences off the ends of the reads. Using the raw data without trimming, we don't see this error.
