
Suggestions from nf-core #29

Closed · nvnieuwk opened this issue Oct 11, 2022 · 14 comments

@nvnieuwk

Hi again,

After a discussion in the nf-core community, we came up with a couple more suggestions that could make incorporating nf-test into nf-core/modules a bit easier.

  1. Would it be possible to supply a .yaml file with the assertions (a bit similar to how it is done in pytest => e.g. this file)?
  2. Would it also be possible to add a tag system (similar to pytest --tag)? This would make it easier to run the needed tests on GitHub Actions instead of having to supply the whole file path.

Cheers,
Nicolas

@nvnieuwk changed the title from "Supply assertions as a yaml" to "Suggestions from nf-core" on Oct 11, 2022
@lukfor
Collaborator

lukfor commented Oct 12, 2022

Hi,

  1. Would it be possible to supply a .yaml file with the assertions (a bit similar to how it is done in pytest => e.g. this file)?

Thanks for the suggestion and the thoughts. I understand that this would be quite useful for nf-core, but @seppinho and I are not sure if it also makes sense in the context of nf-test. nf-test is highly expressive and benefits from the power assertions provided by Groovy and from Nextflow-specific extensions. However, we plan to integrate some kind of snapshot testing for output channels (similar to https://jestjs.io/docs/snapshot-testing). I think this could also be very helpful for you.

  2. Would it also be possible to add a tag system (similar to pytest --tag)? This would make it easier to run the needed tests on GitHub Actions instead of having to supply the whole file path.

Tags are a very good idea and I see a lot of useful use cases! 👌 We will definitely implement them in one of the next releases!
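
As a rough sketch of how this might look (the syntax is not final; the tag directive and the --tag CLI flag shown here are assumptions about the eventual design, and the module is a hypothetical example):

nextflow_process {

    name "Test of the FASTQC module"
    script "modules/fastqc/main.nf"
    process "FASTQC"   // hypothetical module, for illustration only

    // assumption: tags mark this suite so it can be selected from the CLI
    tag "fastqc"
    tag "modules"

    test("Should run without failures") {
        when {
            process {
                """
                input[0] = [ [ id: 'test' ], file('test_1.fastq.gz') ]
                """
            }
        }
        then {
            assert process.success
        }
    }
}

On GitHub Actions, the relevant tests could then be selected with something like nf-test test --tag fastqc instead of passing explicit file paths.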

Best, lukas.

@nvnieuwk
Author

Thanks for the suggestion and the thoughts. I understand that this would be quite useful for nf-core, but @seppinho and I are not sure if it also makes sense in the context of nf-test. nf-test is highly expressive and benefits from the power assertions provided by Groovy and from Nextflow-specific extensions. However, we plan to integrate some kind of snapshot testing for output channels (similar to https://jestjs.io/docs/snapshot-testing). I think this could also be very helpful for you.

I like this idea! Could it also be possible to apply snapshot testing to the files inside the output directory? We mainly test those files rather than the output channels. Thanks for the feedback!

Tags are a very good idea and I see a lot of useful use cases! 👌 We will definitely implement them in one of the next releases!

Awesome! Thanks for all the good work!

Cheers,
Nicolas

@lukfor
Collaborator

lukfor commented Oct 25, 2022

Hi @nvnieuwk!

We just released nf-test 0.7.0. This release now includes snapshot testing, which could be helpful for nf-core:
https://code.askimed.com/nf-test/assertions/snapshots/
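
A minimal example of a snapshot assertion (a sketch; the process name and input here are hypothetical placeholders, see the linked docs for details):

nextflow_process {

    name "Test MY_TOOL against a stored snapshot"
    script "main.nf"
    process "MY_TOOL"   // hypothetical process name, for illustration only

    test("Should match the stored snapshot") {
        when {
            process {
                """
                input[0] = file('test.txt')
                """
            }
        }
        then {
            // snapshot() serializes the output channels; match() compares them
            // against the *.nf.test.snap file written on the first run
            assert snapshot(process.out).match()
        }
    }
}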

Happy to hear your feedback!

Best, lukas.

@edmundmiller
Contributor

Awesome stuff! I'm really excited to get this into nf-core/modules! Do you have anything you can share about the roadmap for tags? I know we can use pytest-workflow to run nf-test just like we run Nextflow right now, but I'd love to cut out the extra dependency and migrate to nf-test in one move.

So I guess I'm just asking: is it a few weeks off, before 2023, or before or after March 2023 (our next hackathon, which would be a good time to migrate)?

@nvnieuwk
Author

Thanks @lukfor! I'll have a look when I get some free time and will surely let you know if we encounter any problems. Thanks for the nice addition 🥳

@nvnieuwk
Author

Hi, I've been trying out the new snapshot feature for a little while and found one small issue: nf-test won't run the snapshot test in the following example, and I get no warning or error telling me what I did wrong.

nextflow_pipeline {

    name "Tests of the pipeline with all optional parameters on default"
    script "main.nf"

    test("Success") {

        expect {
            assert workflow.success
            assert snapshot(
                workflow,
                path("${outputDir}/families/Proband_12345/reports/Proband_12345.TsTv.count"),
                path("${outputDir}/families/Proband_12345/reports/Proband_12345.bcftools_stats.txt"),
                file("${outputDir}/families/Proband_12345/reports/Proband_12345.TsTv.qual").exists(),
                path("${outputDir}/families/Proband_12345/reports/Proband_12345.FILTER.summary"),
                path("${outputDir}/families/Proband_12345/Proband_12345.vcf.gz").linesGzip[68],
                file("${outputDir}/families/Proband_12345/Proband_12345.vcf.gz.tbi").exists(),
                file("${outputDir}/multiqc_reports/multiqc_report.html").exists(),
                path("${outputDir}/individuals/NA24385/NA24385.g.vcf.gz").linesGzip[53],
                file("${outputDir}/individuals/NA24385/NA24385.g.vcf.gz.tbi").exists(),
                path("${outputDir}/individuals/NA12878/NA12878.g.vcf.gz").linesGzip[53],
                file("${outputDir}/individuals/NA12878/NA12878.g.vcf.gz.tbi").exists()
            )
        }
    }
}

This test checks for:

  • the md5sum of files that stay consistent across multiple runs
  • the existence of files that differ between runs and cannot be read using linesGzip
  • the presence of a specific line in a file when the file is variable but contains some constant lines

Would it be possible to implement a bit of error handling for when invalid snapshot creation is attempted? I'll be happy to move this to a separate issue if you prefer.

Cheers,
Nicolas

@lukfor
Collaborator

lukfor commented Oct 29, 2022

You created the snapshot, but you forgot to call the match() method to compare it with the saved snapshot.

However, I agree with you that we need better error reporting for that case.
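
For example, a shortened sketch of your snippet with the missing call added:

expect {
    assert workflow.success
    assert snapshot(
        workflow,
        path("${outputDir}/families/Proband_12345/reports/Proband_12345.TsTv.count"),
        file("${outputDir}/multiqc_reports/multiqc_report.html").exists()
    ).match()   // match() triggers the comparison with the saved snapshot
}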

@lukfor
Collaborator

lukfor commented Oct 29, 2022

Awesome stuff! I'm really excited to get this into nf-core/modules! Do you have anything you can share about the roadmap for tags? I know we can use pytest-workflow to run nf-test just like we run Nextflow right now, but I'd love to cut out the extra dependency and migrate to nf-test in one move.

So I guess I'm just asking: is it a few weeks off, before 2023, or before or after March 2023 (our next hackathon, which would be a good time to migrate)?

Thanks. We plan to release this feature in the next version in about 2-3 weeks.

@jfy133
Contributor

jfy133 commented Nov 1, 2022

I'm not sure if we should keep a thread of all the nf-core requests here rather than opening separate issues, but just in case, x-ref: #43

@edmundmiller
Contributor

Maybe a tag for nf-core? I think new issues should be made, rather than just trying to keep track of what we want in one long thread.

@nvnieuwk
Author

nvnieuwk commented Nov 2, 2022

You created the snapshot, but you forgot to call the match() method to compare it with the saved snapshot.

However, I agree with you that we need better error reporting for that case.

Thanks! 🥳

@lukfor
Collaborator

lukfor commented Nov 2, 2022

Maybe a tag for nf-core? I think new issues should be made, rather than just trying to keep track of what we want in one long thread.

Thanks for your suggestion. We created an nf-core label, and it would help us if you could create separate issues.

@lukfor added the nf-core label on Nov 2, 2022
@edmundmiller
Contributor

@lukfor Can we change the color of the label to #24B064?

@lukfor
Collaborator

lukfor commented Nov 3, 2022

Tags moved to #49

@lukfor closed this as completed on Nov 3, 2022