
[REVIEW]: MIRP: A Python package for standardised radiomics #6413

Open
editorialbot opened this issue Feb 26, 2024 · 31 comments
Labels
Python R review TeX Track: 2 (BCM) Biomedical Engineering, Biosciences, Chemistry, and Materials

@editorialbot
Collaborator

editorialbot commented Feb 26, 2024

Submitting author: @alexzwanenburg (Alex Zwanenburg)
Repository: https://github.com/oncoray/mirp
Branch with paper.md (empty if default branch): paper
Version: v2.1.0
Editor: @emdupre
Reviewers: @surajpaib, @Matthew-Jennings, @drcandacemakedamoore, @theanega
Archive: Pending

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/165c85b1ecad891550a21b12c8b2e577"><img src="https://joss.theoj.org/papers/165c85b1ecad891550a21b12c8b2e577/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/165c85b1ecad891550a21b12c8b2e577/status.svg)](https://joss.theoj.org/papers/165c85b1ecad891550a21b12c8b2e577)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@surajpaib & @Matthew-Jennings & @drcandacemakedamoore & @theanega, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all, you need to run this command in a separate comment to create your checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @emdupre know.

Please start on your review when you are able, and be sure to complete your review within the next six weeks at the very latest.

Checklists

📝 Checklist for @surajpaib

📝 Checklist for @Matthew-Jennings

📝 Checklist for @drcandacemakedamoore

📝 Checklist for @theanega

@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1038/s41571-022-00707-0 is OK
- 10.1038/nrclinonc.2016.162 is OK
- 10.1148/radiol.2020191145 is OK
- 10.1038/s41598-017-13448-3 is OK
- 10.1038/s41598-018-36938-4 is OK
- 10.1038/s41598-022-13967-8 is OK
- 10.1148/radiol.211604 is OK
- 10.1148/radiol.231319 is OK
- 10.1038/nrclinonc.2017.141 is OK
- 10.1038/s41467-023-44591-3 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.88  T=1.70 s (105.6 files/s, 31142.2 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                         104           5692           4648          19852
HTML                            25           1334             75           7703
Markdown                        12           1193              0           5535
SVG                              2              1              1           2996
JavaScript                      12            131            221            880
CSS                              4            190             35            779
reStructuredText                 7            169            159            351
XML                              4              0            336            256
TeX                              1             19              0            236
R                                1             25              8             77
YAML                             3              6              4             57
TOML                             1              5              0             47
DOS Batch                        2              8              1             28
make                             1              4              7              9
Bourne Shell                     1              0              0              1
-------------------------------------------------------------------------------
SUM:                           180           8777           5495          38807
-------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

@editorialbot
Collaborator Author

Wordcount for paper.md is 1025

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@emdupre
Member

emdupre commented Feb 26, 2024

👋 Hi @surajpaib, @Matthew-Jennings, @drcandacemakedamoore, @theanega, and thank you again for agreeing to review this submission for MIRP !

The review will take place in this issue, and you can generate your individual reviewer checklists by asking editorialbot directly with @editorialbot generate my checklist.

In working through the checklist, you're likely to have specific feedback on MIRP. Whenever possible, please open relevant issues on the software repository (and cross-link them with this issue) rather than discussing them here. This helps to make sure that feedback is translated into actionable items to improve the software !

If you aren't sure how to get started, please see the Reviewing for JOSS guide -- and, of course, feel free to ping me with any questions !

@surajpaib

surajpaib commented Feb 26, 2024

Review checklist for @surajpaib

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/oncoray/mirp?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI-approved software license?
  • Contribution and authorship: Has the submitting author (@alexzwanenburg) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@drcandacemakedamoore

drcandacemakedamoore commented Feb 27, 2024

Review checklist for @drcandacemakedamoore

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/oncoray/mirp?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI-approved software license?
  • Contribution and authorship: Has the submitting author (@alexzwanenburg) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@Matthew-Jennings

Matthew-Jennings commented Feb 27, 2024

Review checklist for @Matthew-Jennings

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/oncoray/mirp?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI-approved software license?
  • Contribution and authorship: Has the submitting author (@alexzwanenburg) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@theanega

theanega commented Feb 28, 2024

Review checklist for @theanega

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/oncoray/mirp?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI-approved software license?
  • Contribution and authorship: Has the submitting author (@alexzwanenburg) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@alexzwanenburg

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@drcandacemakedamoore

drcandacemakedamoore commented Mar 15, 2024

@alexzwanenburg Can you please clarify whether Sebastian Starke and Steffan Lock are the same person? I do not see any code contributions from a Steffan Lock. This may be fine if he helped make the package, but please clarify. Or is the Steffan Leger mentioned in the thank-you note at the bottom the same person as Steffan Lock?

@alexzwanenburg

alexzwanenburg commented Mar 15, 2024

@drcandacemakedamoore Steffen Löck is my professor and advised on the paper and the package. Stefan Leger contributed to early in-house versions of MIRP, prior to the move to GitHub. Sebastian Starke also made minor contributions to an earlier version of MIRP.

@drcandacemakedamoore

drcandacemakedamoore commented Mar 17, 2024

@alexzwanenburg Preferably, module names should be all lower case (ideally a single word, or with underscores if there is no other choice). I notice you have modules that are camel-case. Different file systems have different case conventions; the real point is that it could then be possible to import these modules in two different ways and cause problems on file systems where case handling differs. I will get to more substantial issues soon, but this already jumped out at me before I even started the real review. And since I'm on superficial issues right now: some badges would not hurt (it's nice to have the PyPI version badge, and also Anaconda if you released it there; I can't tell at first glance here).
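
One common way to move to lower-case module names without immediately breaking existing imports is to leave a thin shim at the old name. A minimal sketch, with hypothetical module names (this is not MIRP's actual layout):

```python
# imageClass.py -- hypothetical old camel-case module, kept only as a shim that
# re-exports the renamed module and warns users to update their imports.
import warnings

from .image_class import *  # noqa: F401,F403  (hypothetical new lower-case module)

warnings.warn(
    "The 'imageClass' module has been renamed to 'image_class'; "
    "please update your imports.",
    DeprecationWarning,
    stacklevel=2,
)
```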

@drcandacemakedamoore

drcandacemakedamoore commented Mar 17, 2024

@alexzwanenburg On a less superficial issue, I note there is no developer documentation. Many people may want to tinker with what you have done, and hopefully even contribute to the package. I am looking for documentation somewhere that tells people how to run the tests, so they can check new code before submitting it. I am also looking for this because it is not clear whether you have any automated testing that runs in CI (did I miss it?), so instead of figuring out how to run it from there, I would need instructions to run your tests properly. Update: I see from my Windows machine it is python -m pytest, but I have no idea what it will be from a Mac.
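
For reference, python -m pytest is not Windows-specific; the same invocation behaves identically on macOS and Linux. A minimal programmatic equivalent, assuming pytest is installed and that the tests sit in a test/ directory at the repository root (the exact path is a guess):

```python
# run_tests.py -- platform-independent test runner sketch.
import sys

import pytest

# Equivalent to running "python -m pytest test/" from the repository root.
exit_code = pytest.main(["test/"])
sys.exit(exit_code)
```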

@emdupre
Member

emdupre commented Mar 18, 2024

Thank you, @drcandacemakedamoore !

If you could please open subsequent review comments as issues on the MIRP repository, this will help to make sure there is sufficient space for follow-up discussion and that action items are trackable across reviewers. I know that @alexzwanenburg has started to respond in-thread, but we'll generally ask to keep only high-level discussions in the general review thread and re-direct all other comments to the project issue tracker.

If you have any other questions, of course, please don't hesitate to ask.

@emdupre
Member

emdupre commented Mar 25, 2024

👋 Hi everyone, happy Monday !

I just wanted to check in on the status of this review and make sure that there weren't any current blockers in working through the reviewer checklists.

I did notice that @drcandacemakedamoore has opened oncoray/mirp#66, oncoray/mirp#67, and oncoray/mirp#68 -- thank you ! I'm cross-linking them here, so they're easier for myself (and other reviewers) to track.

@theanega

theanega commented Mar 26, 2024

Hello! Thank you very much for the invitation to review this paper. I will finalize the review later this week (only the "functionality" section is missing). For now, I've left my comments on issue #69.
@emdupre Feel free to let me know whether my comments are appropriate; this is my first time reviewing for JOSS, so I'm learning. Thanks!

@emdupre
Member

emdupre commented Apr 2, 2024

👋 Hi everyone ! Thank you for your comments on MIRP to date !

I just wanted to note that we have now passed the four-week review window. If you could please work on finalizing your initial reviews as soon as possible, I would appreciate it.

Once you have finalized your initial reviews, you can let me know by responding directly in this thread. Of course, if you have any questions or blockers, please don't hesitate to let me know !

🙏 @surajpaib @Matthew-Jennings @drcandacemakedamoore @theanega

@emdupre
Member

emdupre commented Apr 4, 2024

👋 Hello again,

I just wanted to follow up on the previous message as I know it created some concern and confusion :

  • I should have noted that the "initial review" refers to each reviewer working through their checklist and opening up associated issues. I did not mean to imply that those issues should have also been resolved !
  • The "four-week review window" was in reference to our original request to finish reviews within four weeks if possible, or six weeks at the latest. We have just passed five weeks on this review ; however, if you require additional time beyond the six-week window, please let me know. I will try to accommodate specific timelines where possible.

Apologies for being unclear on these points in my previous message. If there's anything else I can clarify, please let me know. And thank you again for your work in reviewing MIRP !

@Matthew-Jennings

Thanks, @emdupre! No worries.

@drcandacemakedamoore

@alexzwanenburg can you confirm the most up to date branch we should be looking at is dev2.2.1 for everything?

@alexzwanenburg

@drcandacemakedamoore I can confirm that the most up-to-date branch is dev2.2.1. I have been working on this branch to address your comments and suggestions. This does not include the paper itself, which lives in the paper branch.

@surajpaib

Hi @emdupre! I will cross off the remainder of the checklist by the end of this week. I hope that timeline works.

@emdupre
Member

emdupre commented Apr 18, 2024

👋 Hi everyone, thanks for the updates !

I just wanted to summarize status:

@theanega : I see that you've completed your checklist, but you've also created issue oncoray/mirp#69, which is still open. Could you please confirm the status of your review ?

@Matthew-Jennings : I see that you've not yet completed your checklist and that you've created the following open issues: oncoray/mirp#72, oncoray/mirp#73, and oncoray/mirp#75. I assume you've finished the initial review and are now waiting on the resolution of these issues, but if you're instead still working on the initial review, please let me know.

@drcandacemakedamoore and @surajpaib : I know that you were not expecting to finish the initial review until this week. Please let us know when you are able to do so !

I'll also cross-link all of the other associated issues to date, in case this helps in finalizing initial reviews:

@Matthew-Jennings

@emdupre: Yep, that's correct!

@alexzwanenburg

I released version 2.2.1. This also includes updates to the documentation and a new tutorial: https://oncoray.github.io/mirp/tutorial_compute_radiomics_features_mr.html

I am looking forward to your feedback on these updates.
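
For reviewers trying the new tutorial, the basic call pattern looks roughly like the sketch below. It assumes the extract_features entry point from the MIRP documentation; the paths are placeholders, and a real extraction needs the additional settings (e.g. discretisation parameters) shown in the tutorial itself:

```python
# Sketch of basic feature extraction with MIRP; paths are placeholders and
# should point to an image and its segmentation mask (e.g. DICOM or NIfTI).
from mirp import extract_features

features = extract_features(
    image="path/to/image",  # placeholder: image file or directory
    mask="path/to/mask",    # placeholder: mask delineating the region of interest
)
print(features)
```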

@surajpaib

Hi @emdupre, I've opened an issue on the MIRP repo with more of my observations, and with that I'm done with my initial review.

Many thanks for your patience!

@emdupre
Member

emdupre commented May 6, 2024

👋 Hi everyone, and happy Monday !

I wanted to update with my understanding of the current status of this review for MIRP.

Currently, the following JOSS-associated issues are open:

At this point, it appears that @alexzwanenburg is actively addressing these comments.

Please let me know, though, if there are any review-related discussions I am not capturing here and should be aware of. And thank you all again for your work in reviewing MIRP to date ! 💐

@alexzwanenburg

I have released version 2.2.2, which resolves the linked issues. Thanks for all the feedback!
