
only histo file outputs from smoove call #235

Open
Ana1335 opened this issue Dec 5, 2023 · 5 comments

Ana1335 commented Dec 5, 2023

Hello developers,
I have started running smoove. This is my command:

smoove call --outdir ${OUT_DIR} --exclude ${BED} --name ${SAMPLE_ID} --fasta ${FASTA} -p 1 --genotype ${INPUT_DIR}/${SAMPLE_ID}.rmdup1.bam

However, the only outputs I get are ${SAMPLE_ID}.histo files.
I think I am running the right command, but I do not get any VCF files.
Am I doing something wrong?

Thanks

brentp (Owner) commented Dec 5, 2023

hi, can you show the stderr and stdout from the command?
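
For reference, a minimal way to capture both streams into one log file (a sketch, assuming a bash shell and the same variables as the command above; the log file name is arbitrary):

# send stdout and stderr from the smoove run to a single log file
smoove call --outdir ${OUT_DIR} --exclude ${BED} --name ${SAMPLE_ID} --fasta ${FASTA} -p 1 --genotype ${INPUT_DIR}/${SAMPLE_ID}.rmdup1.bam > ${SAMPLE_ID}.smoove.log 2>&1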

Ana1335 (Author) commented Dec 6, 2023

Hi, thanks for your reply.
When I run the command above, I only get the following outputs (for now I am running on only three cell lines):

GCT-rmdup1.histo
RD-rmdup1.histo
SW982-rmdup1.histo

head GCT-rmdup1.histo
0	0.005097314201
1	0.006356269899
2	0.006353033509
3	0.006309342244
4	0.006260796395
5	0.006285069319

and this is the log file I get:

Current working directory: /oak/stanford/groups/analysis/shaghayegh/scripts/SR-scripts
Starting run at: Mon Dec  4 22:03:41 PST 2023

Job Array ID / Job ID: 37319724 / 37319725
This is job 1 out of 3 jobs.

[smoove] 2023/12/04 22:03:43 starting with version 0.2.8
[smoove] 2023/12/04 22:03:43 calculating bam stats for 1 bams
[smoove]: ([E]lumpy-filter) 2023/12/04 22:03:43 /bin/bash: lumpy_filter: command not found
[smoove]:2023/12/04 22:03:43 finished process: lumpy-filter (set -eu; lumpy_filter -f /oak/stanford/groups/emoding/sequencing/pipeline/indices/hg19.fa /oak/stanf) in user-time:1.787ms system-time:815µs
[smoove]:2023/12/04 22:03:43 error running command: set -eu; lumpy_filter -f /oak/stanford/groups/sequencing/pipeline/indices/hg19.fa /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/alignment-BWA/dedupped_BAMs_picard/GCT.rmdup1.bam /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.split.bam.tmp.bam /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.disc.bam.tmp.bam 2 && mv /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.split.bam.tmp.bam /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.split.bam && mv /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.disc.bam.tmp.bam /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.disc.bam && cp /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.split.bam /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.split.bam.orig.bam && cp /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.disc.bam /oak/stanford/groups/emoding/analysis/shaghayegh/shortreads-SV/results-smoove/GCT-rmdup1.disc.bam.orig.bam -> exit status 127
[smoove] 2023/12/04 22:03:52 done calculating bam stats
panic: exit status 127

goroutine 1 [running]:
github.com/brentp/smoove/lumpy.Lumpy(0x7ffc8f2b4ea0, 0x3, 0x7ffc8f2b4eac, 0x40, 0x7ffc8f2b4dd8, 0x4d, 0xc0002307e0, 0x1, 0x1, 0xc000117900, ...)
	/home/brentp/src/smoove/lumpy/lumpy.go:141 +0x587
github.com/brentp/smoove/lumpy.Main()
	/home/brentp/src/smoove/lumpy/lumpy.go:351 +0x29b
main.main()
	/home/brentp/src/smoove/cmd/smoove/smoove.go:121 +0x1c4
Job finished with exit code 2 at: Mon Dec  4 22:03:52 PST 2023

These are the only files I get so far by running the command above

brentp (Owner) commented Dec 6, 2023

Hi, you need to set things up so that lumpy_filter is on your $PATH.
Note this in the output:

 lumpy_filter: command not found

and the non-zero exit code. Once you have lumpy_filter and the other required programs, it should get further.
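
A quick way to check whether the binary is visible and, assuming a bioconda-based environment, to pull it in (the lumpy-sv package name is an assumption about where lumpy_filter is usually distributed):

# check whether lumpy_filter is on the PATH seen by the shell that runs smoove
command -v lumpy_filter || echo "lumpy_filter not found on PATH"

# assumption: in bioconda setups, lumpy_filter usually ships with the lumpy-sv package
conda install -c bioconda lumpy-sv

# running smoove with no arguments re-prints its dependency checklist
smoove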

Ana1335 (Author) commented Dec 6, 2023

Great, thanks

Ana1335 (Author) commented Dec 8, 2023

Hey @brentp. I am still dealing with running smoove and the "lumpy_filter: command not found" error. I installed smoove via conda, and the version I have installed is 0.2.8 (https://anaconda.org/bioconda/smoove). When I run smoove with no arguments, I see a list of tools, but it seems there is no lumpy_filter.

smoove version: 0.2.8

smoove calls several programs. Those with 'Y' are found on your $PATH. Only those with '*' are required.

 *[Y] bgzip [ sort   -> (compress) ->   index ]
 *[Y] gsort [(sort)  ->  compress   ->  index ]
 *[Y] tabix [ sort   ->  compress   -> (index)]
 *[Y] lumpy
 *[ ] lumpy_filter
 *[Y] samtools
 *[Y] svtyper
 *[Y] mosdepth [extra filtering of split and discordant files for better scaling]

  [Y] duphold [(optional) annotate calls with depth changes]
  [Y] svtools [only needed for large cohorts].

Available sub-commands are below. Each can be run with -h for additional help.

call        : call lumpy (and optionally svtyper)
merge       : merge and sort (using svtools) calls from multiple samples
genotype    : parallelize svtyper on an input VCF
paste       : square final calls from multiple samples (each with same number of variants)
plot-counts : plot counts of split, discordant reads before, after smoove filtering
annotate    : annotate a VCF with gene and quality of SV call
hipstr      : run hipSTR in parallel
duphold     : run duphold in parallel (this can be done by adding a flag to call or genotype)

Is it just an old version of the smoove conda package? Why is lumpy_filter not installed?
Thanks a lot in advance for your help.
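
One way to see what the conda environment actually provides (a sketch, assuming a conda installation and that lumpy itself is on the PATH, as the checklist above indicates):

# list lumpy-related packages installed in the active environment
conda list | grep -i lumpy

# look for lumpy_filter next to the lumpy binary that smoove does find
ls "$(dirname "$(command -v lumpy)")" | grep -i lumpy

If lumpy_filter is missing from that directory, installing the package that ships it (see the note above) into the same environment should flip the checklist entry to [Y].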
