Documentation vs reality: workflow.container #3394

Closed
keiranmraine opened this issue Nov 16, 2022 · 5 comments · Fixed by #4190

keiranmraine commented Nov 16, 2022

Bug report

Expected behavior and actual behavior

The metadata documentation indicates that workflow.container returns a map of all containers:

Docker image used to run workflow tasks. When more than one image is used it returns a map object containing [process name, image name] pair entries.

A map is not returned when multiple containers are defined; only the top-level value (process.container) is returned, when defined.

If the top-level process.container has not been defined, we get [:].

nextflow.config

params {
    // output folder options
    outdir          = 'results'
    tracedir        = "${params.outdir}/pipeline_info"
}

docker.enabled = true

process {
    container     = 'ubuntu:18.04' // gives this value as the result; comment out to get `[:]`

    withName: bash_step {
        executor = 'local'
        container = null
    }

    withName: job_a {
        container = 'quay.io/wtsicgp/expansion_hunter:5.0.0'
    }

    withName: job_b {
        container = 'quay.io/wtsicgp/pcap-core:5.7.0'
    }
}

main.nf

#!/usr/bin/env nextflow
nextflow.enable.dsl=2

process bash_step {
    input:
        val(example)
    output:
        path '*.out', emit: wfl
    script:
        """
        echo ${workflow.container} > ${example}.out
        """
}

process job_a {
    input:
        path(wfl)
    output:
        path '*.out.gz', emit: gzip
    script:
        """
        gzip -c ${wfl} > ${wfl}.gz
        """
}

process job_b {
    input:
        path(wfl)
        path(gzip)
    
    output:
        path 'compress.comp'

    publishDir "${params.tracedir}", mode: "copy"

    shell:
        '''
        echo native: $(wc -c !{wfl}) > compress.comp
        echo compressed: $(wc -c !{gzip}) >> compress.comp
        echo "## content follows ##" >> compress.comp
        cat !{wfl} >> compress.comp
        echo "## EOC ##" >> compress.comp
        '''
}

workflow {
    main:
        bash_step('bob')
        job_a(bash_step.out.wfl)
        job_b(bash_step.out.wfl, job_a.out.gzip)
}

Run with:

nextflow run -stub-run main.nf
# cat results/pipeline_info/compress.comp

Gives:

native: 13 bob.out
compressed: 41 bob.out.gz
## content follows ##
ubuntu:18.04
## EOC ##

Would expect:

[bash_step : null, job_a : 'quay.io/wtsicgp/expansion_hunter:5.0.0', job_b: 'quay.io/wtsicgp/pcap-core:5.7.0']

(possibly no entry for bash_step)

Steps to reproduce the problem

As above

Program output

As above

Environment

  • Nextflow version: 22.10.2.5832
  • Java version: openjdk version "17.0.5" 2022-10-18
  • Operating system: Linux (GitHub CodeSpaces)
  • Bash version: GNU bash, version 5.1.4(1)-release (x86_64-pc-linux-gnu)

Additional context

Appears related to #1758; if this is not a suitable way to introspect this value, please update the documentation.
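
For reference, the documented behaviour would allow consuming the value like this in the main script (a sketch only, assuming workflow.container actually returned the documented map, which is what this issue reports as broken):

workflow.onComplete {
    // per the docs: a map of [process name, image name] entries when more
    // than one image is used, otherwise a single image name
    if( workflow.container instanceof Map )
        workflow.container.each { proc, img -> println "${proc} -> ${img}" }
    else
        println "single container: ${workflow.container}"
}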


Z-Zen commented Feb 6, 2023

Bumping this.

One could do something like this:

#!/usr/bin/env nextflow
nextflow.enable.dsl=2

// create an empty map to record which container each process used
mapContainers = [:]

process bash_step {
    input:
        val(example)
    output:
        path '*.out', emit: wfl
    script:
    if (!mapContainers.containsKey(task.container)) {
        mapContainers[task.container] = task.process
    }
        """
        echo ${workflow.container} > ${example}.out
        """
}

process job_a {
    input:
        path(wfl)
    output:
        path '*.out.gz', emit: gzip
    script:
    if (!mapContainers.containsKey(task.container)) {
        mapContainers[task.container] = task.process
    }
        """
        gzip -c ${wfl} > ${wfl}.gz
        """
}

process job_b {
    input:
        path(wfl)
        path(gzip)
    
    output:
        path 'compress.comp'

    publishDir "${params.tracedir}", mode: "copy"

    shell:
    if (!mapContainers.containsKey(task.container)) {
        mapContainers[task.container] = task.process
    }
        '''
        echo native: $(wc -c !{wfl}) > compress.comp
        echo compressed: $(wc -c !{gzip}) >> compress.comp
        echo "## content follows ##" >> compress.comp
        cat !{wfl} >> compress.comp
        echo "## EOC ##" >> compress.comp
        '''
}

workflow {
    main:
        bash_step('bob')
        job_a(bash_step.out.wfl)
        job_b(bash_step.out.wfl, job_a.out.gzip)
}


// print the mapContainers here
workflow.onComplete {
  def status = "NA"
  if(workflow.success) {

    status = "SUCCESS"

    println("""
    Workflow container    : ${mapContainers}
    """
    )
  }
}

The problem is that this approach won't work when using modules, since the shared mapContainers variable defined in the main script isn't visible inside module files.
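
A minimal, module-friendly sketch (hypothetical process names, not taken from this thread): since task.container is resolved per task, each process can write its own [process name, image] pair to a small file and the workflow can merge the pairs with collectFile(), with no shared map required:

#!/usr/bin/env nextflow
nextflow.enable.dsl=2

// Each process records the container it actually resolved to;
// this also works when the processes live in separate module files.
process job_a {
    input:
        val(sample)
    output:
        path 'container.txt', emit: meta
    script:
        """
        echo '${task.process}: ${task.container}' > container.txt
        """
}

process job_b {
    input:
        val(sample)
    output:
        path 'container.txt', emit: meta
    script:
        """
        echo '${task.process}: ${task.container}' > container.txt
        """
}

workflow {
    job_a('bob')
    job_b('bob')
    // concatenate the per-task records into a single file and print it
    job_a.out.meta
        .mix(job_b.out.meta)
        .collectFile(name: 'containers.txt')
        .view { it.text }
}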


stale bot commented Aug 12, 2023

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@bentsherman

The relevant code is here:

config.process.each { String name, value ->
    if( name.startsWith('$') && value instanceof Map && value.container ) {
        result[name] = resolveClosure(value.container)
    }
}

However when I inspect the process config at this point, I get the following:

process.container = ubuntu:18.04
process.withName:bash_step = [executor:local, container:null]
process.withName:job_a = [container:quay.io/wtsicgp/expansion_hunter:5.0.0]
process.withName:job_b = [container:quay.io/wtsicgp/pcap-core:5.7.0]

@pditommaso can you explain the logic here? I don't know what the $ means, but it doesn't seem to be valid anymore.
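
For comparison, a minimal sketch of what the check might look like against the current withName: selector syntax (an illustration only, not the actual change from #4190):

config.process.each { String name, value ->
    // config keys now look like 'withName:job_a' rather than '$job_a'
    if( name.startsWith('withName:') && value instanceof Map && value.container ) {
        result[name.substring('withName:'.length())] = resolveClosure(value.container)
    }
}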

@pditommaso

I have to say I have no memory of that. Though there's even a test for it:

def 'should fetch containers definition' () {
    String text

    when:
    text = '''
        process.container = 'beta'
        '''
    then:
    new Session(cfg(text)).fetchContainers() == 'beta'

    when:
    text = '''
        process {
            $proc1 { container = 'alpha' }
            $proc2 { container ='beta' }
        }
        '''
    then:
    new Session(cfg(text)).fetchContainers() == ['$proc1': 'alpha', '$proc2': 'beta']

    when:
    text = '''
        process {
            $proc1 { container = 'alpha' }
            $proc2 { container ='beta' }
        }
        process.container = 'gamma'
        '''
    then:
    new Session(cfg(text)).fetchContainers() == ['$proc1': 'alpha', '$proc2': 'beta', default: 'gamma']

    when:
    text = '''
        process.container = { "ngi/rnaseq:${workflow.getRevision() ?: 'latest'}" }
        '''
    def meta = Mock(WorkflowMetadata); meta.getRevision() >> '1.2'
    def session = new Session(cfg(text))
    session.binding.setVariable('workflow', meta)
    then:
    session.fetchContainers() == 'ngi/rnaseq:1.2'
}

@bentsherman

The original container code was added long ago at a time when process names had to have a $ prefix in the config (docs).

I will draft a fix.
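
For context, the legacy config selector and its current equivalent (the latter matching the config at the top of this issue):

// legacy syntax, which the fetchContainers() check above still expects
process {
    $job_a { container = 'quay.io/wtsicgp/expansion_hunter:5.0.0' }
}

// current withName: equivalent
process {
    withName: job_a {
        container = 'quay.io/wtsicgp/expansion_hunter:5.0.0'
    }
}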
