
[REVIEW]: NiTransforms: A Python tool to read, represent, manipulate, and apply n-dimensional spatial transforms #3459

whedon opened this issue Jul 8, 2021 · 56 comments
Labels: accepted, published, Python, recommend-accept, review, TeX



whedon commented Jul 8, 2021

Submitting author: @mgxd (Mathias Goncalves)
Repository: https://github.com/poldracklab/nitransforms
Version: 21.0.0
Editor: @osorensen
Reviewer: @robbisg, @PeerHerholz
Archive: 10.5281/zenodo.5499694

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/10870dbd43c36af2f836e7d3b98d2336"><img src="https://joss.theoj.org/papers/10870dbd43c36af2f836e7d3b98d2336/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/10870dbd43c36af2f836e7d3b98d2336/status.svg)](https://joss.theoj.org/papers/10870dbd43c36af2f836e7d3b98d2336)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@robbisg & @PeerHerholz, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @osorensen know.

Please start your review when you are able, and be sure to complete it within the next six weeks at the very latest.

Review checklist for @robbisg

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@mgxd) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @PeerHerholz

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@mgxd) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

whedon commented Jul 8, 2021

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @robbisg, @PeerHerholz it looks like you're currently assigned to review this paper 🎉.


⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means, with GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications


For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf


whedon commented Jul 8, 2021

Failed to discover a Statement of need section in paper


whedon commented Jul 8, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.5281/zenodo.591597 is OK
- 10.1016/j.neuroimage.2012.01.021 is OK
- 10.1016/j.neuroimage.2011.09.015 is OK
- 10.1016/j.media.2007.06.004 is OK

MISSING DOIs

- None

INVALID DOIs

- 10.1002/(SICI)1099-1492(199706/08)10:4/5\textless171::AID-NBM453\textgreater3.0.CO;2-L is INVALID


whedon commented Jul 8, 2021

Software report (experimental):

github.com/AlDanial/cloc v 1.88  T=0.16 s (285.5 files/s, 85904.8 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          25            772            911           2479
SVG                              4              1              1           1368
YAML                             5             31              7            389
Jupyter Notebook                 3              0           7278            238
make                             1             30              6            148
Dockerfile                       1             18             15            125
Markdown                         2             17              0            100
reStructuredText                 4             16             28             77
TeX                              1              5              0             70
TOML                             1              1              0              9
-------------------------------------------------------------------------------
SUM:                            47            891           8246           5003
-------------------------------------------------------------------------------


Statistical information for the repository '3744697bb45cd68c5f464001' was
gathered on 2021/07/08.
The following historical commit information, by author, was found:

Author                     Commits    Insertions      Deletions    % of changes
Chris Markiewicz                 2             6              2            0.08
Mathias Goncalves                1             1              1            0.02
Oscar Esteban                   16           337             90            4.29
Stefano Moia                     1             1              0            0.01
mathiasg                        43          1873            181           20.66
oesteban                        66          4791           2591           74.24
smoia                            4            42             27            0.69

Below are the number of rows from each author that have survived and are still
intact in the current revision:

Author                     Rows      Stability          Age       % in comments
Chris Markiewicz              4           66.7         15.4                0.00
Oscar Esteban              3486         1034.4         16.2               12.65
mathiasg                    657           35.1         10.7               11.42
smoia                        15           35.7         19.5                6.67


whedon commented Jul 8, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@PeerHerholz

To the authors,

I sincerely hope you and your loved ones are still doing ok during these ongoing difficult times.

First, I would like to thank all authors for their very interesting submission and the hard, dedicated work they put into it. The topic and aim of this resource are both fascinating and important. In the following, I provide additional information regarding my review decision, divided into the respective issues (also linked above) opened within the tool's repository:

In case some of my comments are misleading, difficult to understand, or simply wrong due to an insufficient understanding on my side, I apologize and kindly ask the authors to point out such concerns. The same holds for the nature of my comments: I aim to provide only helpful and constructive comments and criticism. If my wording appears too harsh or unhelpful, please let me know and I will do my best to address it.

The presented tool, NiTransforms, allows researchers to work with an extensive variety of transform formats. This includes basic file handling (read/write/load/save), conversion between transform formats, and application of transforms to images. Given the current state of the art and the increasing number of workflows/pipelines that combine multiple software packages, the presented tool fills a prominent gap in the Python neuroimaging ecosystem and will tremendously help boost the reproducibility of neuroimaging results, while also providing a foundation for various other applications to build upon. Thus, it's a clear "accept" from my side once the points outlined above have been addressed/discussed.
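(For readers unfamiliar with what "applying a spatial transform" means here, the following is a minimal NumPy sketch of the underlying operation that affine transforms encode. It is illustrative only and is not NiTransforms' actual API; the matrix values and the helper function are assumptions made for the example.)

```python
import numpy as np

# A 4x4 affine encoding a 90° rotation about the z axis plus a
# 2-unit translation along x, in homogeneous coordinates.
affine = np.array([
    [0.0, -1.0, 0.0, 2.0],
    [1.0,  0.0, 0.0, 0.0],
    [0.0,  0.0, 1.0, 0.0],
    [0.0,  0.0, 0.0, 1.0],
])

def apply_affine(matrix, points):
    """Map an (N, 3) array of coordinates through a 4x4 affine."""
    points = np.asarray(points, dtype=float)
    # Append a column of ones to move into homogeneous coordinates.
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    # Multiply and drop the homogeneous coordinate again.
    return (homogeneous @ matrix.T)[:, :3]

moved = apply_affine(affine, [[1.0, 0.0, 0.0]])
print(moved)  # [[2. 1. 0.]]
```

NiTransforms wraps this kind of mapping (and its nonlinear generalizations) together with the format-specific conventions of each neuroimaging package, which is where the real difficulty lies.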

Thanks again for this great contribution and please let me know if there are questions!

Cheers, Peer

oesteban added a commit to nipy/nitransforms that referenced this issue Jul 14, 2021
Updates the JOSS submission to incorporate the suggestions by @PeerHerholz.

References: openjournals/joss-reviews#3459
Resolves: #119.

robbisg commented Jul 14, 2021

The proposed tool, nitransforms, makes it very easy to convert and manage transformation files.

I think this tool is very useful, and making it available will greatly help the Python neuroimaging community. I agree with @PeerHerholz that it eases the building of complex, automated pipelines and helps the reproducibility of neuroimaging studies.

I opened an issue in the package repository (nipy/nitransforms#123) with my suggestions. They concern the documentation: I encouraged the authors to add examples or tutorials, since, from my perspective, these are the entry point for new users and a way to understand the usefulness of the software. Moreover, a brief overview of other tools used for these purposes should be included in the paper; if no other tools are available, a clear statement that this is the only tool with these goals would underscore its importance.

Once these suggestions are addressed, I will be happy to recommend the publication for acceptance in JOSS.

If you have any other questions, let me know. I apologize if any of my comments are misleading.

Cheers,
Roberto


osorensen commented Jul 16, 2021

Thanks for the reviews, @robbisg and @PeerHerholz, and for opening issues in the source repository. @mgxd, please take your time to resolve the issues, and feel free to reach out to me if there is anything I can help with.


whedon commented Jul 22, 2021

👋 @robbisg, please update us on how your review is going (this is an automated reminder).


whedon commented Jul 22, 2021

👋 @PeerHerholz, please update us on how your review is going (this is an automated reminder).

@osorensen

@mgxd, could you please update us on how it's going with the issues opened by the reviewers in the source repository?


mgxd commented Aug 9, 2021

Hi all,

Thank you @PeerHerholz and @robbisg for the reviews! We have spent the past few weeks addressing concerns, and feel that we're at a good place - when you can, please give nitransforms another look.

Cheers,
Mathias

@PeerHerholz

@whedon generate pdf


whedon commented Aug 21, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@PeerHerholz

Ahoi hoi folks,

thx so much for all the work on this @mgxd & @oesteban, I think the changes look great!

Two small things: since the contribution guidelines were added, I was wondering about the CoC; and regarding the examples, I wanted to ask whether a few sentences/explanations could be added to the Preparing Images and AFNI Deoblique notebooks, as well as functionality to download the examples as either .ipynb or .py files.

Thanks again, cheers, Peer

@osorensen

Thanks for all your work with this review @PeerHerholz! If you think that some of the issues on the review checklist have been fixed, please remember to check off the boxes.

@oesteban

@whedon generate pdf


whedon commented Sep 10, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@osorensen

@oesteban before we proceed, could you please do the following?

  • Make a tagged release of the software.
  • Archive the reviewed software in Zenodo or a similar service (e.g., figshare, an institutional repository).
  • Check that the archival deposit (e.g., in Zenodo) has the correct metadata. This includes the title (should match the paper title) and author list (make sure the list is correct and people who only made a small fix are not on it). You may also add the authors' ORCIDs.
  • Please list the DOI of the archived version here.

@oesteban

Sure thing. I just realized a self-citation was missed (fMRIPrep was the only piece of software mentioned without a due reference). I'll check on the Zenodo record - I believe we set it up, but I might be wrong. Anyway, that was on the to-do list down the line, so I'm happy to get it right before finishing the review stage.

@osorensen

Good! I see there is a Zenodo archive for the paper (https://osf.io/8aq7b/), but please create one for the final version of the software too, so that we can get a DOI link for the version that gets published.

@oesteban

DOI of the archived version: 10.5281/zenodo.5499694

@osorensen

@whedon generate pdf


whedon commented Sep 10, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@osorensen

@whedon check references


whedon commented Sep 10, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1002/(SICI)1099-1492(199706/08)10:4/5<171::AID-NBM453>3.0.CO;2-L is OK
- 10.5281/zenodo.591597 is OK
- 10.1016/j.neuroimage.2012.01.021 is OK
- 10.1016/j.neuroimage.2011.09.015 is OK
- 10.1016/j.media.2007.06.004 is OK
- 10.1038/s41592-018-0235-4 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@osorensen

@whedon set 10.5281/zenodo.5499694 as archive


whedon commented Sep 10, 2021

OK. 10.5281/zenodo.5499694 is the archive.

@osorensen

@whedon set 21.0.0 as version


whedon commented Sep 10, 2021

OK. 21.0.0 is the version.

@osorensen

@whedon recommend-accept


whedon commented Sep 10, 2021

Attempting dry run of processing paper acceptance...

whedon added the recommend-accept label Sep 10, 2021

whedon commented Sep 10, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1002/(SICI)1099-1492(199706/08)10:4/5<171::AID-NBM453>3.0.CO;2-L is OK
- 10.5281/zenodo.591597 is OK
- 10.1016/j.neuroimage.2012.01.021 is OK
- 10.1016/j.neuroimage.2011.09.015 is OK
- 10.1016/j.media.2007.06.004 is OK
- 10.1038/s41592-018-0235-4 is OK

MISSING DOIs

- None

INVALID DOIs

- None


whedon commented Sep 10, 2021

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2582

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2582, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true


arfon commented Sep 10, 2021

@whedon accept deposit=true


whedon commented Sep 10, 2021

Doing it live! Attempting automated processing of paper acceptance...

whedon added the accepted and published labels Sep 10, 2021

whedon commented Sep 10, 2021

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦


whedon commented Sep 10, 2021

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.03459 joss-papers#2583
  2. Wait a couple of minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.03459
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...


arfon commented Sep 10, 2021

@robbisg, @PeerHerholz – many thanks for your reviews here and to @osorensen for editing this submission! JOSS relies upon the volunteer effort of people like you and we simply wouldn't be able to do this without you ✨

@mgxd – your paper is now accepted and published in JOSS ⚡🚀💥

arfon closed this as completed Sep 10, 2021

whedon commented Sep 10, 2021

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.03459/status.svg)](https://doi.org/10.21105/joss.03459)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.03459">
  <img src="https://joss.theoj.org/papers/10.21105/joss.03459/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.03459/status.svg
   :target: https://doi.org/10.21105/joss.03459

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
