783 changes: 783 additions & 0 deletions docs/hello_nextflow/03_hello-nf-test-part-1.md

Large diffs are not rendered by default.

@@ -35,7 +35,7 @@ If you're continuing on directly from Part 6, you'll need to move up one directory
cd hello-nf-test
```

The `hello-nf-test` directory has the same content and structure that you're expected to end up with in `hello-modules` on completion of Part 6.
The `hello-nf-test-part2` directory has the same content and structure that you're expected to end up with in `hello-modules` on completion of Part 6.

```console title="Directory contents"
hello-nf-test/
@@ -179,9 +179,9 @@ In plain English, the logic of the test reads as follows:

The expected results are formulated as `assert` statements.

- `assert process.success` states that we expect the process to run successfully and complete without any failures.
- `snapshot(process.out).match()` states that we expect the result of the run to be identical to the result obtained in a previous run (if applicable).
We discuss this in more detail later.
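As a sketch, a minimal nf-test file combining these pieces might look like the following (the process name and script path are illustrative, not taken from this repository):

```groovy
nextflow_process {

    name "Test SAMTOOLS_INDEX"                        // illustrative name
    script "modules/local/samtools/index/main.nf"     // hypothetical path
    process "SAMTOOLS_INDEX"

    test("Should run without failures") {

        when {
            process {
                """
                // inputs would be declared here
                """
            }
        }

        then {
            // the process completed successfully
            assert process.success
            // the output matches the previously recorded snapshot
            assert snapshot(process.out).match()
        }
    }
}
```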

For most real-world modules (which usually require some kind of input), this is not yet a functional test.
We need to add the inputs that will be fed to the process, and any parameters if applicable.
@@ -293,9 +293,9 @@ params {

Finally, it's time to run our test! Let's break down the syntax.

- The basic command is `nf-test test`.
- To that, we add `--profile docker_on` to specify that we want Nextflow to run the test with Docker enabled.
- Then the test file that we want to run.
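Assembled, the full command looks roughly like this (the test file path is illustrative):

```console
nf-test test --profile docker_on tests/modules/local/samtools/index/main.nf.test
```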

!!!note

@@ -345,8 +345,8 @@ If we re-run the test, the program will check that the new output matches the output

If, in the course of future development, something in the code changes that causes the output to be different, the test will fail and we will have to determine whether the change is expected or not.

- If it turns out that something in the code broke, we will have to fix it, with the expectation that the fixed code will pass the test.
- If it is an expected change (e.g., the tool has been improved and the results are better) then we will need to update the snapshot to accept the new output as the reference to match, using the parameter `--update-snapshot` when we run the test command.
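For example, accepting an intentional change as the new reference might look like this (the test file path is illustrative):

```console
nf-test test --profile docker_on --update-snapshot tests/modules/local/samtools/index/main.nf.test
```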

### 1.7. Add more tests to `SAMTOOLS_INDEX`

@@ -460,8 +460,8 @@ Now that we know how to handle the simplest case, we're going to kick things up
As the second step in our pipeline, its input depends on the output of another process.
We can deal with this in two ways:

- Manually generate some static test data that is suitable as intermediate input to the process;
- Use a special [setup method](https://www.nf-test.com/docs/testcases/setup/) to handle it dynamically for us.

**Spoiler:** We're going to use the setup method.
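As a rough sketch of what that looks like, the setup method runs the upstream process first and lets the test consume its output (module names, paths, and the test file are all illustrative assumptions here):

```groovy
test("Should run on the output of SAMTOOLS_INDEX") {

    setup {
        // run the upstream process first to generate the intermediate input
        run("SAMTOOLS_INDEX") {
            script "modules/local/samtools/index/main.nf"   // hypothetical path
            process {
                """
                input[0] = file("\${projectDir}/data/sample.bam")   // hypothetical test file
                """
            }
        }
    }

    when {
        process {
            """
            // consume the output channel of the setup run
            input[0] = SAMTOOLS_INDEX.out[0]
            """
        }
    }

    then {
        assert process.success
    }
}
```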

56 changes: 56 additions & 0 deletions hello-nextflow/hello-nf-test-part1/main.nf
@@ -0,0 +1,56 @@
#!/usr/bin/env nextflow

/*
 * Pipeline parameters
 */
params.input_file = "data/greetings.csv"

/*
 * Use echo to write a greeting to an output file
 */
process sayHello {

    publishDir 'results', mode: 'copy'

    input:
    val greeting

    output:
    path "${greeting}-output.txt"

    script:
    """
    echo '$greeting' > '${greeting}-output.txt'
    """
}

/*
 * Use a text replace utility to convert the greeting to uppercase
 */
process convertToUpper {

    publishDir 'results', mode: 'copy'

    input:
    path input_file

    output:
    path "UPPER-${input_file}"

    script:
    """
    cat '$input_file' | tr '[a-z]' '[A-Z]' > UPPER-${input_file}
    """
}

workflow {

    // create a channel for inputs from a CSV file
    greeting_ch = Channel.fromPath(params.input_file).splitCsv().flatten()

    // emit a greeting
    sayHello(greeting_ch)

    // convert the greeting to uppercase
    convertToUpper(sayHello.out)
}
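Outside Nextflow, the shell logic of the two processes can be tried directly; the filenames below follow the naming pattern the processes use:

```shell
# sayHello: write the greeting to a file named after it
echo 'Hello' > 'Hello-output.txt'

# convertToUpper: uppercase the file contents into a new file
cat 'Hello-output.txt' | tr '[a-z]' '[A-Z]' > 'UPPER-Hello-output.txt'

# prints HELLO
cat 'UPPER-Hello-output.txt'
```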
8 changes: 8 additions & 0 deletions hello-nextflow/hello-nf-test-part1/nf-test.config
@@ -0,0 +1,8 @@
config {

    testsDir "tests"
    workDir ".nf-test"
    configFile "tests/nextflow.config"
    profile ""

}
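A file like this is typically generated as boilerplate by running nf-test's init command in the project root, then edited to taste:

```console
nf-test init
```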
Expand Up @@ -27,7 +27,7 @@ workflow {
ref_dict_file,
intervals_file
)

// Collect variant calling outputs across samples
all_gvcfs_ch = GATK_HAPLOTYPECALLER.out.vcf.collect()
all_idxs_ch = GATK_HAPLOTYPECALLER.out.idx.collect()
Expand Down
@@ -7,7 +7,7 @@ docker.fixOwnership = true
params {
// Primary input (file of input files, one per line)
reads_bam = null

// Output directory
outdir = "results_genomics"
