AFNI Automated testing framework #1

Open
Shotgunosine opened this Issue Sep 14, 2018 · 2 comments

Comments

Projects
None yet
3 participants
@Shotgunosine
Collaborator

Shotgunosine commented Sep 14, 2018

Project Title

AFNI Automated testing framework

Contributors

Name                 Position                     Institute
Dylan Nielson        Neuroimaging Data Scientist  NIMH-DSST
Rick Reynolds        Computer Scientist           NIMH-SSCC
Bob Cox              Director                     NIMH-SSCC
Jakub Kaczmarzyk     Technical Assistant          MIT
Dorota Jarecka       Research Scientist           MIT
Yaroslav Halchenko   Assistant Professor          Dartmouth

Project Summary

We are implementing automated testing of AFNI programs. The goal is that whenever a change to AFNI's code is made or proposed on GitHub, tests are automatically run on CircleCI, the percentage of the codebase covered by tests is evaluated, and the results are automatically posted to Codecov and on the GitHub repository itself.
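As a sketch of what a test in such a suite might look like (this is an illustrative assumption, not the project's actual test code; `echo` stands in for a real AFNI binary so the example is runnable anywhere):

```python
import subprocess


def run_and_check(cmd, expected_stdout):
    """Run a command-line program and compare its stdout to an expected value.

    Tests in this style can wrap compiled command-line programs; here
    `echo` serves as a placeholder for an AFNI program.
    """
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout.strip() == expected_stdout


def test_program_output():
    # In a real suite the command would be an AFNI program (e.g. a
    # dataset-info query); `echo` is a stand-in for this sketch.
    assert run_and_check(["echo", "hello"], "hello")
```

Running such tests under a coverage tool on every push is what produces the per-commit coverage percentage that gets reported to Codecov.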

Project Significance

This will provide ongoing assurance that the software is working as expected. Additionally, we are using a public testing framework so that users can determine what tests are being run, propose new tests, and monitor changes in the percentage of the codebase covered by tests. Finally, this framework will facilitate faster development and improved collaboration because it will make it easier to determine if changes to the code have broken previously implemented functionality. We hope that AFNI's adoption of an open continuous testing framework will encourage other major neuroimaging analysis packages to adopt similar practices, improving the quality of neuroimaging analysis software for the community as a whole.

Progress Report

We established the automated building and testing framework on CircleCI, with coverage reports posted to Codecov. Proof-of-concept tests have been added for 4 AFNI programs, covering 2.5% of AFNI's codebase. Jakub's expertise with Docker containers and CircleCI was instrumental in getting testing set up this week, and having him on site to work with us saved us several weeks of debugging on our own.

Project URLs

https://github.com/afni/afni
https://circleci.com/gh/afni/afni
https://codecov.io/gh/afni/afni
https://hub.docker.com/u/afni/

djarecka commented Sep 14, 2018
Project Summary

We are working on a testing framework for regression testing of scientific pipelines in different environments, and adding existing AFNI pipelines to it.
In the testing framework, Docker images are automatically created based on a user specification (Neurodocker is used to create all the Dockerfiles). Scientific pipelines are run in all of the Docker containers, the output of each pipeline is passed to the user-specified tests, and the results of the tests are presented on a dashboard.

Project Significance

The goal of the testing framework is to provide a tool that helps create regression tests from existing pipelines used in publications, etc., and to compare the results across multiple versions of the various software used in a pipeline. In the future, the framework will also allow users to run the tests locally.

It will help test AFNI pipelines across various versions of packages and external software (including pipelines that are too long to run for every PR).
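The cross-version comparison step can be sketched as follows (a minimal, hypothetical illustration; the function name, tolerance value, and sample numbers are assumptions, not part of the framework's API):

```python
def outputs_agree(reference, candidate, rel_tol=1e-5):
    """Compare two numeric pipeline outputs element-wise.

    `reference` holds values produced under one software version and
    `candidate` the values produced under another; the outputs agree
    if every pair of values is within the relative tolerance.
    """
    if len(reference) != len(candidate):
        return False
    return all(
        abs(r - c) <= rel_tol * max(abs(r), abs(c), 1.0)
        for r, c in zip(reference, candidate)
    )


# E.g., a summary statistic computed by the same pipeline under two
# environments (the numbers here are made up for illustration):
v1 = [0.121, 0.340, 0.518]
v2 = [0.121, 0.340, 0.518001]
```

A tolerance-based check like this, rather than exact equality, is what makes regression tests robust to harmless floating-point differences between software versions while still catching real changes in results.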

Progress Report

  • improving the specification for the parameters.yaml file provided by the user
  • adding an example of an existing AFNI pipeline
  • improving (slowly...) the JavaScript part of the dashboard (parallel plots, tables); thanks, Anisha!

Project URLs


yarikoptic commented Sep 15, 2018
Maybe it is just my limited search capability while on the phone, but I couldn't determine where the tests actually are and how they are run to produce the coverage reporting that is already posted on Codecov. Would someone be so kind as to point me to the file where the tests are run with coverage enabled?
Thanks in advance!

