Process get_software_versions crashes #42
Hi,
Could you try to add conda to your PATH, please? Also, I noticed a typo; I will check the code. Thanks!
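For reference, adding conda to the PATH usually looks something like this (a sketch only: the Miniconda install location below is an assumption, adjust it to wherever conda is installed on your system):

```shell
# Hypothetical install location - adjust to wherever conda lives on your system
export PATH="$HOME/miniconda3/bin:$PATH"

# Confirm the shell can now resolve conda
command -v conda || echo "conda still not on PATH - check the install path"
```

Putting the export in your shell startup file (e.g. ~/.bashrc) makes it persistent across sessions.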
Process get_software_versions terminated with an error exit status (1).
The typo was only in the help message. It has been fixed in the …
Thanks for such a fast response!
Max Fragment Size is null in the log, although I've run the script as follows: …
Do you think it could also be related to the crash?
For the typo, you just have to update your command line; I guess you previously used …
Regarding your error: to me, it means that it was not able to run …
Are you running this on a cluster? If yes, please check whether conda is available on all nodes, e.g. whether you need to run a …
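One way to check this is to ask a compute node directly rather than the login node. A sketch, assuming a SLURM scheduler (the scheduler is an assumption here; substitute your scheduler's equivalent of srun):

```shell
# If SLURM is available, ask a compute node whether it can resolve conda;
# otherwise fall back to a reminder message
if command -v srun >/dev/null 2>&1; then
    msg=$(srun --nodes=1 bash -lc 'command -v conda || echo "conda missing on $(hostname)"')
else
    msg="srun not found here - run the same check through your scheduler"
fi
echo "$msg"
```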
Yes, conda is available on every node.
The order of processes does not really matter.
Yes, that works.
$ nextflow run nf-core/hic --reads '/nfs/data/Hennighausen/p6/merged/final/sample1/*_R{1,2}.fastq.gz' --genome GRCm38 -profile conda --min_mapq '30' --restriction_site '^GATC' --ligation_site '^GATCGATC' --min_restriction_fragment_size '100' --max_restriction_fragment_size '100000' --min_insert_size '100' --max_insert_size '600' --rm_singleton --rm_dup --rm_multi --bins_size '2500,5000,10000,25000,500000,100000' -c nextflow_old.config --max_memory '10.GB' --max_time '2.d' --max_cpus 40 -resume
N E X T F L O W ~ version 19.10.0
Launching `nf-core/hic` [suspicious_montalcini] - revision: 481964d91c [master]
----------------------------------------------------
,--./,-.
___ __ __ __ ___ /,-._.--~'
|\ | |__ __ / ` / \ |__) |__ } {
| \| | \__, \__/ | \ |___ \`-._,-`-,
`._,._,'
nf-core/atacseq v1.1.0
----------------------------------------------------
WARN: Access to undefined parameter `max_restriction_framgnet_size` -- Initialise it to a default value eg. `params.max_restriction_framgnet_size = some_value`
Pipeline Release : master
Run Name : nf-core hic processing
Reads : /nfs/data/Hennighausen/p6/merged/final/sample1/*_R{1,2}.fastq.gz
splitFastq : false
Fasta Ref : s3://ngi-igenomes/igenomes//Mus_musculus/Ensembl/GRCm38/Sequence/WholeGenomeFasta/genome.fa
Restriction Motif : ^GATC
Ligation Motif : ^GATCGATC
DNase Mode : false
Remove Dup : true
Min MAPQ : 30
Min Fragment Size : 100
Max Fragment Size : null
Min Insert Size : 100
Max Insert Size : 600
Min CIS dist : true
Maps resolution : 1000000,500000
Max Memory : 10.GB
Max CPUs : 40
Max Time : 2.0
Output dir : /nfs/data/Hennighausen/results/
Working dir : /nfs/data/Hennighausen/work
Container Engine : null
Current home : /nfs/home/users/olgala
Current user : olgala
Current path : /nfs/data/Hennighausen
Script dir : /nfs/home/users/olgala/.nextflow/assets/nf-core/hic
Config Profile : conda
----------------------------------------------------
executor > local (6)
[4b/41bebc] process > get_software_versions [100%] 1 of 1, failed: 1 ✘
[4c/c6bdb1] process > makeChromSize (genome.fa) [100%] 1 of 1, failed: 1 ✘
[ce/d2444d] process > getRestrictionFragments (genome.fa [^GATC]) [100%] 1 of 1, failed: 1 ✘
[2d/8c704b] process > bowtie2_end_to_end (paired_R1) [100%] 2 of 2, failed: 2 ✘
[- ] process > trim_reads -
[- ] process > bowtie2_on_trimmed_reads -
[- ] process > merge_mapping_steps -
[- ] process > combine_mapped_files -
[- ] process > get_valid_interaction -
[- ] process > remove_duplicates -
[- ] process > merge_sample -
[- ] process > build_contact_maps -
[- ] process > run_ice -
[- ] process > generate_cool -
[- ] process > multiqc -
[68/11725f] process > output_documentation (1) [100%] 1 of 1, failed: 1 ✘
[nf-core/hic] Pipeline completed with errors
Error executing process > 'bowtie2_end_to_end (paired_R2)'
Caused by:
Process exceeded running time limit (2ms)
Command executed:
bowtie2 --rg-id BMG --rg SM:paired_R2 \
--very-sensitive -L 30 --score-min L,-0.6,-0.2 --end-to-end --reorder \
-p 4 \
-x Bowtie2Index/genome \
--un paired_R2_unmap.fastq \
-U paired_R2.fastq.gz | samtools view -F 4 -bS - > paired_R2.bam
Command exit status:
-
Command output:
(empty)
Work dir:
/nfs/data/Hennighausen/work/64/a93bf5aedc9e0bb958fc276619cbd1
Tip: view the complete command output by changing to the process work dir and entering the command `cat .command.out`
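As the tip suggests, the failing task's sandbox can be inspected directly. A sketch, using the work dir path from the error above (adjust the path for your own run):

```shell
# Every Nextflow task sandbox contains .command.sh (the exact script run)
# plus .command.log / .command.out with its captured output
workdir="/nfs/data/Hennighausen/work/64/a93bf5aedc9e0bb958fc276619cbd1"
for f in .command.sh .command.log .command.out; do
    if [ -f "$workdir/$f" ]; then
        echo "== $f =="
        cat "$workdir/$f"
    else
        echo "$f not found under $workdir"
    fi
done
```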
so it doesn't work :)
Any news on this thread?
Hi there, I have a similar problem that gives the same error. I downloaded the pipeline and singularity container using nf-core. My pipeline crashes because it can't find …
Thanks,
$ nextflow20 run /data/RDCO/code/nf-core/nf-core-hic-1.1.0/workflow/ -c /data/RDCO/code/hic/alt/biowulf.config -profile singularity,test -with-singularity /data/RDCO/code/nf-core/nf-core-hic-1.1.0.simg
N E X T F L O W ~ version 20.01.0
Launching `/data/RDCO/code/nf-core/nf-core-hic-1.1.0/workflow/main.nf` [backstabbing_torvalds] - revision: b4eae689fb
----------------------------------------------------
,--./,-.
___ __ __ __ ___ /,-._.--~'
|\ | |__ __ / ` / \ |__) |__ } {
| \| | \__, \__/ | \ |___ \`-._,-`-,
`._,._,'
nf-core/atacseq v1.1.0
----------------------------------------------------
WARN: Access to undefined parameter `max_restriction_framgnet_size` -- Initialise it to a default value eg. `params.max_restriction_framgnet_size = some_value`
Run Name : backstabbing_torvalds
Reads : *{1,2}.fastq.gz
splitFastq : false
Fasta Ref : https://github.com/nf-core/test-datasets/raw/hic/reference/W303_SGD_2015_JRIU00000000.fsa
Restriction Motif : A^AGCTT
Ligation Motif : AAGCTAGCTT
DNase Mode : false
Remove Dup : true
Min MAPQ : 0
Min Fragment Size : true
Max Fragment Size : null
Min Insert Size : true
Max Insert Size : true
Min CIS dist : true
Maps resolution : 1000000,500000
Max Memory : 125 GB
Max CPUs : 16
Max Time : 10d
Output dir : ./results
Working dir : /gpfs/gsfs11/users/RDCO/hicPipeTest/work
Container Engine : singularity
Container : /data/RDCO/code/nf-core/nf-core-hic-1.1.0.simg
Current home : /home/brickkm
Current user : brickkm
Current path : /data/RDCO/hicPipeTest
Script dir : /data/RDCO/code/nf-core/nf-core-hic-1.1.0/workflow
Config Profile : singularity,test
Config Description: NIH biowulf config for nf-core/configs.
Config Contact : Kevin Brick (@kevbrick)
----------------------------------------------------
executor > slurm (5)
[7e/22cd5f] process > get_software_versions [100%] 1 of 1, failed: 1 ✘
[0e/dd7d05] process > makeBowtie2Index [100%] 1 of 1, failed: 1 ✘
[25/d4dc0e] process > makeChromSize [100%] 1 of 1, failed: 1 ✘
[9e/6d280e] process > getRestrictionFragments [100%] 1 of 1, failed: 1 ✘
[- ] process > bowtie2_end_to_end -
[- ] process > trim_reads -
[- ] process > bowtie2_on_trimmed_reads -
[- ] process > merge_mapping_steps -
[- ] process > combine_mapped_files -
[- ] process > get_valid_interaction -
[- ] process > remove_duplicates -
[- ] process > merge_sample -
[- ] process > build_contact_maps -
[- ] process > run_ice -
[- ] process > generate_cool -
[- ] process > multiqc -
[5a/3c0e9f] process > output_documentation [100%] 1 of 1, failed: 1 ✘
[nf-core/hic] Pipeline completed with errors
WARN: To render the execution DAG in the required format it is required to install Graphviz -- See http://www.graphviz.org for more info.
Error executing process > 'output_documentation (1)'
Caused by:
Process `output_documentation (1)` terminated with an error exit status (127)
Command executed:
markdown_to_html.r output.md results_description.html
Command exit status:
127
Command output:
(empty)
Command error:
.command.sh: line 2: markdown_to_html.r: command not found
Work dir:
/gpfs/gsfs11/users/RDCO/hicPipeTest/work/5a/3c0e9f1f4a20a72f888d08d375d414
Tip: you can replicate the issue by changing to the process work dir and entering the command `bash .command.run`
Hi @kevbrick,
What is the pipeline …? Can you try running …? The …
These scripts are not included in the singularity containers; they are packaged within the pipeline code and mounted into the container. So it's not a container issue.
Thanks,
Phil
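A quick way to verify the point above locally is to check that the helper script ships with the pipeline code and is executable on the host side. A sketch only (the pipeline path below is an assumption; use wherever Nextflow actually unpacked the pipeline):

```shell
# Hypothetical location of the unpacked pipeline - adjust as needed
pipeline_dir="$HOME/.nextflow/assets/nf-core/hic"

# bin/ scripts are mounted into the container at run time, so they must
# exist and be executable on the host filesystem
script="$pipeline_dir/bin/markdown_to_html.r"
if [ -x "$script" ]; then
    echo "found and executable: $script"
else
    echo "missing or not executable: $script (if present, try chmod +x)"
fi
```

An exit status 127 ("command not found") from inside the container is consistent with the script not being on the PATH that the task sees.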
OK, the …
Line 990 in 481964d
Still, given that the pipeline code has come a long way since this issue was opened, I will close it. If it recurs with the latest release, we can reopen.
Hi @ewels, Sounds good, and thanks for the reply. I'll try the newer version at some stage ... |
Hi!
I am experiencing difficulties while trying to run nf-core/hic on my data, while the test run went completely fine. I would really appreciate it if you could help figure out what is going wrong here.
I'm attaching the log file: