Process get_software_versions crashes #42

Closed
Laolga opened this issue Nov 19, 2019 · 15 comments
@Laolga

Laolga commented Nov 19, 2019

Hi!
I am experiencing difficulties trying to run nf-core/hic on my data, although the test run went completely fine. I would really appreciate it if you could help me figure out what is going wrong here.

I'm attaching the log file:

N E X T F L O W  ~  version 19.10.0
Launching `nf-core/hic` [wise_hypatia] - revision: 481964d91c [master]
----------------------------------------------------
                                        ,--./,-.
        ___     __   __   __   ___     /,-._.--~'
  |\ | |__  __ /  ` /  \ |__) |__         }  {
  | \| |       \__, \__/ |  \ |___     \`-._,-`-,
                                        `._,._,'
  nf-core/atacseq v1.1.0
----------------------------------------------------
WARN: Access to undefined parameter `max_restriction_framgnet_size` -- Initialise it to a default value eg. `params.max_restriction_framgnet_size = some_value`
Pipeline Release  : master
Run Name          : wise_hypatia
Reads             : /nfs/data/Hennighausen/p6/merged/final/sample1/*_R{1,2}.fastq.gz
splitFastq        : false
Fasta Ref         : s3://ngi-igenomes/igenomes//Mus_musculus/Ensembl/GRCm38/Sequence/WholeGenomeFasta/genome.fa
Restriction Motif : ^GATC
Ligation Motif    : ^GATCGATC
DNase Mode        : false
Remove Dup        : true
Min MAPQ          : 30
Min Fragment Size : 100
Max Fragment Size : null
Min Insert Size   : 100
Max Insert Size   : 600
Min CIS dist      : true
Maps resolution   : 1000000,500000
Max Memory        : 10.GB
Max CPUs          : 40
Max Time          : 2.0
Output dir        : ./results
Working dir       : /nfs/data/Hennighausen/work
Container Engine  : null
Current home      : /nfs/home/users/olgala
Current user      : olgala
Current path      : /nfs/data/Hennighausen
Script dir        : /nfs/home/users/olgala/.nextflow/assets/nf-core/hic
Config Profile    : conda
----------------------------------------------------
[-        ] process > get_software_versions    -
[-        ] process > makeChromSize            -
[-        ] process > getRestrictionFragments  -
[-        ] process > bowtie2_end_to_end       -
[-        ] process > trim_reads               -
[-        ] process > bowtie2_on_trimmed_reads -
[-        ] process > merge_mapping_steps      -
[-        ] process > combine_mapped_files     -
[-        ] process > get_valid_interaction    -
[-        ] process > remove_duplicates        -
[-        ] process > merge_sample             -
[-        ] process > build_contact_maps       -
[-        ] process > run_ice                  -
[-        ] process > generate_cool            -
[-        ] process > multiqc                  -
[-        ] process > output_documentation     -
Creating Conda env: /nfs/home/users/olgala/.nextflow/assets/nf-core/hic/environment.yml [cache /nfs/data/Hennighausen/work/conda/nf-core-hic-1.1.0-b0d9faeab5a09c5a9485e611e955124d]

Staging foreign file: s3://ngi-igenomes/igenomes/Mus_musculus/Ensembl/GRCm38/Sequence/WholeGenomeFasta/genome.fa
Staging foreign file: s3://ngi-igenomes/igenomes/Mus_musculus/Ensembl/GRCm38/Sequence/Bowtie2Index


executor >  local (6)
[af/5f9e5c] process > get_software_versions          [  0%] 0 of 1
[2c/5c2ead] process > makeChromSize (genome.fa)      [  0%] 0 of 1
[32/3d0dff] process > getRestrictionFragments (ge... [  0%] 0 of 1
[35/16b000] process > bowtie2_end_to_end (paired_R1) [  0%] 0 of 2
[-        ] process > trim_reads                     -
[-        ] process > bowtie2_on_trimmed_reads       -
[-        ] process > merge_mapping_steps            -
[-        ] process > combine_mapped_files           -
[-        ] process > get_valid_interaction          -
[-        ] process > remove_duplicates              -
[-        ] process > merge_sample                   -
[-        ] process > build_contact_maps             -
[-        ] process > run_ice                        -
[-        ] process > generate_cool                  -
[-        ] process > multiqc                        -
[88/d216aa] process > output_documentation (1)       [  0%] 0 of 1
Error executing process > 'get_software_versions'

Caused by:
  Process `get_software_versions` terminated with an error exit status (1)

Command executed:

  echo 1.1.0 > v_pipeline.txt
  echo 19.10.0 > v_nextflow.txt
  bowtie2 --version > v_bowtie2.txt
  python --version > v_python.txt 2>&1
  samtools --version > v_samtools.txt
  multiqc --version > v_multiqc.txt
  scrape_software_versions.py &> software_versions_mqc.yaml

Command exit status:
  1

Command output:
  (empty)

Command wrapper:
  .command.run: line 260: activate: No such file or directory

Work dir:
  /nfs/data/Hennighausen/work/af/5f9e5ca56bf657ef850571d4f19ce5

Tip: view the complete command output by changing to the process work dir and entering the command `cat .command.out`

executor >  local (6)
[af/5f9e5c] process > get_software_versions          [100%] 1 of 1, failed: 1 ✘
[2c/5c2ead] process > makeChromSize (genome.fa)      [100%] 1 of 1, failed: 1 ✘
[32/3d0dff] process > getRestrictionFragments (ge... [100%] 1 of 1, failed: 1 ✘
[35/16b000] process > bowtie2_end_to_end (paired_R1) [100%] 2 of 2, failed: 2 ✘
[-        ] process > trim_reads                     -
[-        ] process > bowtie2_on_trimmed_reads       -
[-        ] process > merge_mapping_steps            -
[-        ] process > combine_mapped_files           -
[-        ] process > get_valid_interaction          -
[-        ] process > remove_duplicates              -
[-        ] process > merge_sample                   -
[-        ] process > build_contact_maps             -
[-        ] process > run_ice                        -
[-        ] process > generate_cool                  -
[-        ] process > multiqc                        -
[88/d216aa] process > output_documentation (1)       [100%] 1 of 1, failed: 1 ✘
Execution cancelled -- Finishing pending tasks before exit
@nservant
Collaborator

Hi,

 .command.run: line 260: activate: No such file or directory

Could you try adding conda to your PATH, please?

Then, I noticed a typo

params.max_restriction_framgnet_size

I will check in the code! Thanks

@apeltzer apeltzer changed the title Process get_software_versions terminated with an error exit status (1) Process get_software_versions crashes Nov 19, 2019
@nservant
Collaborator

The typo was only in the help message. It has been fixed in the dev branch.

@Laolga
Author

Laolga commented Nov 19, 2019

Thanks for such a fast response!
I believe conda was already in my PATH, though.
If the typo was only in the help message, what else could the issue be?

@Laolga
Author

Laolga commented Nov 19, 2019

Max Fragment Size is null in the log, although I ran the pipeline as follows:

nextflow run nf-core/hic --reads '/nfs/data/Hennighausen/p6/merged/final/sample1/*_R{1,2}.fastq.gz' --genome GRCm38 -profile conda --min_mapq '30' --restriction_site '^GATC' --ligation_site '^GATCGATC' --min_restriction_fragment_size '100' --max_restriction_fragment_size '100000' --min_insert_size '100' --max_insert_size '600' --rm_singleton --rm_dup --rm_multi --bins_size '2500,5000,10000,25000,500000,100000' -c nextflow.config --max_memory '10.GB' --max_time '2.d' --max_cpus 40

Do you think this could also be related to the crash?

@nservant
Collaborator

For the typo, you just have to update your command line. I guess you previously used
--max_restriction_framgnet_size NUMBER
Whereas it expects
--max_restriction_fragment_size NUMBER

Regarding your error

.command.run: line 260: activate: No such file or directory

To me, it means that Nextflow was not able to run `source activate` for your conda environment before running the process; that's why I suggested checking whether conda is in your PATH (and on all nodes, if you run the pipeline on a cluster).
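As a quick sanity check, here is a minimal sketch (not from the pipeline; `conda` and the legacy `activate` script are what Nextflow's conda support invokes) that verifies both are visible to a non-interactive shell, which is what `.command.run` uses:

```shell
# Minimal sketch: check that `conda` and the legacy `activate` script
# are on PATH in a non-interactive shell (what .command.run uses).
for tool in conda activate; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool"
  fi
done
```

On a cluster, run the same loop through the scheduler (e.g. `srun bash -c '...'` on SLURM) so the check happens on a compute node rather than only on the login node.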

@apeltzer
Member

Are you running this on a cluster? If yes, please check whether conda is available on all nodes - e.g. if you need to run a module activate tum/conda or similar, you may have to add a config for the TUM first that describes how to run/start jobs on your cluster.

@Laolga
Author

Laolga commented Nov 20, 2019

Yes, conda is available on every node.
Interestingly, the pipeline crashes in a different place every time, but with the same error. After I verified that conda is available everywhere, the crash happened three lines further on:

executor >  local (6)
[a8/07e98b] process > get_software_versions          [100%] 1 of 1, failed: 1 ✘
[8c/b8f27b] process > makeChromSize (genome.fa)      [100%] 1 of 1, failed: 1 ✘
[a5/d0fea7] process > getRestrictionFragments (ge... [100%] 1 of 1, failed: 1 ✘
[74/b1c6c9] process > bowtie2_end_to_end (paired_R1) [100%] 2 of 2, failed: 2 ✘
[-        ] process > trim_reads                     -
[-        ] process > bowtie2_on_trimmed_reads       -
[-        ] process > merge_mapping_steps            -
[-        ] process > combine_mapped_files           -
[-        ] process > get_valid_interaction          -
[-        ] process > remove_duplicates              -
[-        ] process > merge_sample                   -
[-        ] process > build_contact_maps             -
[-        ] process > run_ice                        -
[-        ] process > generate_cool                  -
[-        ] process > multiqc                        -
[53/a5f2ea] process > output_documentation (1)       [100%] 1 of 1, failed: 1 ✘
Execution cancelled -- Finishing pending tasks before exit
Error executing process > 'output_documentation (1)'

Caused by:
  Process `output_documentation (1)` terminated with an error exit status (1)

Command executed:

  markdown_to_html.r output.md results_description.html

Command exit status:
  1

Command output:
  (empty)

Command wrapper:
  .command.run: line 263: activate: No such file or directory

@nservant
Collaborator

The order of the processes does not really matter.
Can you try simply running with `-profile test` locally?

@Laolga
Author

Laolga commented Nov 20, 2019

Yes, that works.
With my data, run locally, the crash happens a bit further on:

$ nextflow run nf-core/hic --reads '/nfs/data/Hennighausen/p6/merged/final/sample1/*_R{1,2}.fastq.gz' --genome GRCm38  -profile conda --min_mapq '30' --restriction_site '^GATC' --ligation_site '^GATCGATC' --min_restriction_fragment_size '100' --max_restriction_fragment_size '100000' --min_insert_size '100' --max_insert_size '600' --rm_singleton --rm_dup --rm_multi --bins_size '2500,5000,10000,25000,500000,100000' -c nextflow_old.config --max_memory '10.GB' --max_time '2.d' --max_cpus 40 -resume

N E X T F L O W  ~  version 19.10.0
Launching `nf-core/hic` [suspicious_montalcini] - revision: 481964d91c [master]
----------------------------------------------------
                                        ,--./,-.
        ___     __   __   __   ___     /,-._.--~'
  |\ | |__  __ /  ` /  \ |__) |__         }  {
  | \| |       \__, \__/ |  \ |___     \`-._,-`-,
                                        `._,._,'
  nf-core/atacseq v1.1.0
----------------------------------------------------
WARN: Access to undefined parameter `max_restriction_framgnet_size` -- Initialise it to a default value eg. `params.max_restriction_framgnet_size = some_value`
Pipeline Release  : master
Run Name          : nf-core hic processing
Reads             : /nfs/data/Hennighausen/p6/merged/final/sample1/*_R{1,2}.fastq.gz
splitFastq        : false
Fasta Ref         : s3://ngi-igenomes/igenomes//Mus_musculus/Ensembl/GRCm38/Sequence/WholeGenomeFasta/genome.fa
Restriction Motif : ^GATC
Ligation Motif    : ^GATCGATC
DNase Mode        : false
Remove Dup        : true
Min MAPQ          : 30
Min Fragment Size : 100
Max Fragment Size : null
Min Insert Size   : 100
Max Insert Size   : 600
Min CIS dist      : true
Maps resolution   : 1000000,500000
Max Memory        : 10.GB
Max CPUs          : 40
Max Time          : 2.0
Output dir        : /nfs/data/Hennighausen/results/
Working dir       : /nfs/data/Hennighausen/work
Container Engine  : null
Current home      : /nfs/home/users/olgala
Current user      : olgala
Current path      : /nfs/data/Hennighausen
Script dir        : /nfs/home/users/olgala/.nextflow/assets/nf-core/hic
Config Profile    : conda
----------------------------------------------------
[-        ] process > get_software_versions    -
executor >  local (6)
[4b/41bebc] process > get_software_versions                       [100%] 1 of 1, failed: 1 ✘
[4c/c6bdb1] process > makeChromSize (genome.fa)                   [100%] 1 of 1, failed: 1 ✘
[ce/d2444d] process > getRestrictionFragments (genome.fa [^GATC]) [100%] 1 of 1, failed: 1 ✘
[2d/8c704b] process > bowtie2_end_to_end (paired_R1)              [100%] 2 of 2, failed: 2 ✘
[-        ] process > trim_reads                                  -
[-        ] process > bowtie2_on_trimmed_reads                    -
[-        ] process > merge_mapping_steps                         -
[-        ] process > combine_mapped_files                        -
[-        ] process > get_valid_interaction                       -
[-        ] process > remove_duplicates                           -
[-        ] process > merge_sample                                -
[-        ] process > build_contact_maps                          -
[-        ] process > run_ice                                     -
[-        ] process > generate_cool                               -
[-        ] process > multiqc                                     -
[68/11725f] process > output_documentation (1)                    [100%] 1 of 1, failed: 1 ✘
[nf-core/hic] Pipeline completed with errors
Error executing process > 'bowtie2_end_to_end (paired_R2)'

Caused by:
  Process exceeded running time limit (2ms)

Command executed:

  bowtie2 --rg-id BMG --rg SM:paired_R2 \
  --very-sensitive -L 30 --score-min L,-0.6,-0.2 --end-to-end --reorder \
  -p 4 \
  -x Bowtie2Index/genome \
  --un paired_R2_unmap.fastq \
  --un p-U paired_R2.fastq.gz | samtools view -F 4 -bS - > paired_R2.bam

Command exit status:
  -

Command output:
  (empty)

Work dir:
  /nfs/data/Hennighausen/work/64/a93bf5aedc9e0bb958fc276619cbd1

Tip: view the complete command output by changing to the process work dir and entering the command `cat .command.out`

@nservant
Collaborator

So it doesn't work :)
I'm almost sure there is something wrong with the conda environment...
Could you try again without -resume, and check whether the conda environment is created correctly?
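One hedged way to check this is a sketch like the following; the cache path is the one printed in the log above, so adjust it to your own work dir:

```shell
# Sketch: check that the conda env cache dir (path from the log above)
# exists and is non-empty; an interrupted build can leave it missing
# or half-populated.
envdir="/nfs/data/Hennighausen/work/conda/nf-core-hic-1.1.0-b0d9faeab5a09c5a9485e611e955124d"
if [ -d "$envdir" ] && [ -n "$(ls -A "$envdir" 2>/dev/null)" ]; then
  echo "env present"
else
  echo "env missing or empty"
fi
```

If the env is missing or empty, deleting the directory and re-running without `-resume` forces Nextflow to rebuild it.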

@nservant
Collaborator

Any news on this thread?

@kevbrick

kevbrick commented Mar 14, 2020

Hi there,

I have a similar problem that gives the same error. I downloaded the pipeline and singularity container using nf-core.

My pipeline crashes because it can't find markdown_to_html.r. Indeed, if I open a shell inside the container, I can't find that script. Is it simply that this file (and perhaps others) is missing from the container?

Thanks,
Kevin

$ nextflow20 run /data/RDCO/code/nf-core/nf-core-hic-1.1.0/workflow/ -c /data/RDCO/code/hic/alt/biowulf.config  -profile singularity,test -with-singularity /data/RDCO/code/nf-core/nf-core-hic-1.1.0.simg 

N E X T F L O W  ~  version 20.01.0
Launching `/data/RDCO/code/nf-core/nf-core-hic-1.1.0/workflow/main.nf` [backstabbing_torvalds] - revision: b4eae689fb
----------------------------------------------------
                                        ,--./,-.
        ___     __   __   __   ___     /,-._.--~'
  |\ | |__  __ /  ` /  \ |__) |__         }  {
  | \| |       \__, \__/ |  \ |___     \`-._,-`-,
                                        `._,._,'
  nf-core/atacseq v1.1.0
----------------------------------------------------
WARN: Access to undefined parameter `max_restriction_framgnet_size` -- Initialise it to a default value eg. `params.max_restriction_framgnet_size = some_value`
Run Name          : backstabbing_torvalds
Reads             : *{1,2}.fastq.gz
splitFastq        : false
Fasta Ref         : https://github.com/nf-core/test-datasets/raw/hic/reference/W303_SGD_2015_JRIU00000000.fsa
Restriction Motif : A^AGCTT
Ligation Motif    : AAGCTAGCTT
DNase Mode        : false
Remove Dup        : true
Min MAPQ          : 0
Min Fragment Size : true
Max Fragment Size : null
Min Insert Size   : true
Max Insert Size   : true
Min CIS dist      : true
Maps resolution   : 1000000,500000
Max Memory        : 125 GB
Max CPUs          : 16
Max Time          : 10d
Output dir        : ./results
Working dir       : /gpfs/gsfs11/users/RDCO/hicPipeTest/work
Container Engine  : singularity
Container         : /data/RDCO/code/nf-core/nf-core-hic-1.1.0.simg
Current home      : /home/brickkm
Current user      : brickkm
Current path      : /data/RDCO/hicPipeTest
Script dir        : /data/RDCO/code/nf-core/nf-core-hic-1.1.0/workflow
Config Profile    : singularity,test
Config Description: NIh biowulf config for nf-core/configs.
Config Contact    : Kevin Brick (@kevbrick)
----------------------------------------------------
executor >  slurm (5)
[7e/22cd5f] process > get_software_versions    [100%] 1 of 1, failed: 1 ✘
[0e/dd7d05] process > makeBowtie2Index         [100%] 1 of 1, failed: 1 ✘
[25/d4dc0e] process > makeChromSize            [100%] 1 of 1, failed: 1 ✘
[9e/6d280e] process > getRestrictionFragments  [100%] 1 of 1, failed: 1 ✘
[-        ] process > bowtie2_end_to_end       -
[-        ] process > trim_reads               -
[-        ] process > bowtie2_on_trimmed_reads -
[-        ] process > merge_mapping_steps      -
[-        ] process > combine_mapped_files     -
[-        ] process > get_valid_interaction    -
[-        ] process > remove_duplicates        -
[-        ] process > merge_sample             -
[-        ] process > build_contact_maps       -
[-        ] process > run_ice                  -
[-        ] process > generate_cool            -
[-        ] process > multiqc                  -
[5a/3c0e9f] process > output_documentation     [100%] 1 of 1, failed: 1 ✘
[nf-core/hic] Pipeline completed with errors
WARN: To render the execution DAG in the required format it is required to install Graphviz -- See http://www.graphviz.org for more info.
Error executing process > 'output_documentation (1)'

Caused by:
  Process `output_documentation (1)` terminated with an error exit status (127)

Command executed:

  markdown_to_html.r output.md results_description.html

Command exit status:
  127

Command output:
  (empty)

Command error:
  .command.sh: line 2: markdown_to_html.r: command not found

Work dir:
  /gpfs/gsfs11/users/RDCO/hicPipeTest/work/5a/3c0e9f1f4a20a72f888d08d375d414

Tip: you can replicate the issue by changing to the process work dir and entering the command `bash .command.run`

@ewels
Member

ewels commented Nov 7, 2020

Hi @kevbrick,

What is the pipeline /data/RDCO/code/nf-core/nf-core-hic-1.1.0/workflow/? It does not seem to be the exact nf-core/hic pipeline - the header says nf-core/atacseq v1.1.0.

Can you try running nextflow run nf-core/hic -r 1.2.2 instead?

The markdown_to_html.r script is deprecated, has been replaced by markdown_to_html.py, and is no longer referenced in the nf-core/hic v1.2.2 pipeline, so this error should be impossible when running the correct pipeline code.

These scripts are not included in the singularity containers, they are packaged within the pipeline code and mounted into the container. So it's not a container issue.
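In sketch form (a toy demo, not Nextflow's actual implementation; the temp paths and the fake script are made up), the lookup works because Nextflow prepends the pipeline's `bin/` directory to PATH for each task:

```shell
# Toy demo: Nextflow prepends <projectDir>/bin to PATH for each task,
# so scripts shipped with the pipeline resolve without being baked
# into the container image.
tmp=$(mktemp -d)
mkdir -p "$tmp/pipeline/bin"
printf '#!/bin/sh\necho converted\n' > "$tmp/pipeline/bin/markdown_to_html.py"
chmod +x "$tmp/pipeline/bin/markdown_to_html.py"
PATH="$tmp/pipeline/bin:$PATH" markdown_to_html.py   # prints "converted"
```

This is why a "command not found" for a pipeline helper script usually points at the pipeline checkout (or a PATH/mount problem), not at the container image itself.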

Thanks,

Phil

@ewels
Member

ewels commented Nov 7, 2020

OK, the nf-core/atacseq header thing was a red herring - it was a typo in v1.1.0 of the pipeline:

hic/main.nf

Line 990 in 481964d

${c_purple} nf-core/atacseq v${workflow.manifest.version}${c_reset}

Still, given that the pipeline code has come a long way since this issue, I will close it. If it recurs with the latest release, we can reopen.

@ewels ewels closed this as completed Nov 7, 2020
@kevbrick

kevbrick commented Nov 7, 2020

Hi @ewels,

Sounds good, and thanks for the reply. I'll try the newer version at some stage ...
