
[REVIEW]: interflow: A Python package to organize, calculate, and visualize sectoral interdependency flow data #4336

Closed
editorialbot opened this issue Apr 22, 2022 · 53 comments
Labels: accepted, Jupyter Notebook, published, Python, recommend-accept, review, TeX

Comments

@editorialbot
Collaborator

editorialbot commented Apr 22, 2022

Submitting author: @kmongird (Kendall Mongird)
Repository: https://github.com/pnnl/interflow
Branch with paper.md (empty if default branch):
Version: v1.0.3
Editor: @fraukewiese
Reviewers: @wiljnich, @j3r3m1
Archive: 10.5281/zenodo.6620928

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/ae736aa6e75758498cf79ab8ec3fa886"><img src="https://joss.theoj.org/papers/ae736aa6e75758498cf79ab8ec3fa886/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/ae736aa6e75758498cf79ab8ec3fa886/status.svg)](https://joss.theoj.org/papers/ae736aa6e75758498cf79ab8ec3fa886)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@wiljnich & @j3r3m1, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions or concerns, please let @fraukewiese know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Checklists

📝 Checklist for @wiljnich

📝 Checklist for @j3r3m1

@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.88  T=0.69 s (128.2 files/s, 60783.2 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
JavaScript                      10           2406           2464           9227
HTML                            24           1059              0           8548
reStructuredText                23           3703             53           4440
Python                          16           1858           1717           4236
CSS                              5            338             52           1306
Markdown                         4             49              0            116
TeX                              1             22              0            104
YAML                             2             11              4             54
DOS Batch                        1              8              1             26
Jupyter Notebook                 1              0            347             15
make                             1              4              7              9
JSON                             1              0              0              1
-------------------------------------------------------------------------------
SUM:                            89           9458           4645          28082
-------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

@editorialbot
Collaborator Author

Wordcount for paper.md is 901

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1021/acs.est.8b00139 is OK
- 10.1007/s11269-013-0331-2 is OK
- 10.1021/acs.est.6b01065 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.25080/Majora-92bf1922-00a is OK

MISSING DOIs

- None

INVALID DOIs

- None

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@wiljnich

wiljnich commented Apr 22, 2022

Review checklist for @wiljnich

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/pnnl/interflow?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@kmongird) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) contribute to the software, 2) report issues or problems with the software, and 3) seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@wiljnich

I am very pleased with this package and happily recommend interflow for publication in JOSS. I have one recommendation, which is not acceptance blocking. To conform with the review criteria (Documentation #5), I believe that a short explainer of the test cases and instructions for their use should be added to the documentation site.

This is a great package that addresses a real pain point - I have built Sankey energy flows in Python before, and this is a vast improvement. Excellent work by @kmongird and team.
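
For context, building even a small Sankey by hand means wiring node indices, link endpoints, and flow values together manually. A minimal sketch with plotly is below; the sector names and values are purely illustrative, not interflow output:

import plotly.graph_objects as go

# Hypothetical sectors and flows, hard-coded here only to illustrate the manual
# bookkeeping that a package like interflow is meant to take care of.
labels = ["Water Supply", "Electricity Generation", "Agriculture", "Discharge"]
fig = go.Figure(go.Sankey(
    node=dict(label=labels, pad=15, thickness=20),
    link=dict(
        source=[0, 0, 1],        # flow origins, as indices into labels
        target=[1, 2, 3],        # flow destinations, as indices into labels
        value=[5.0, 3.0, 2.0],   # flow magnitudes (illustrative units)
    ),
))
fig.show()

Keeping those index lists consistent by hand quickly becomes error-prone as the number of sectors and flows grows, which is exactly the bookkeeping this package removes.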

@kmongird

@wiljnich thank you very much for your review! We greatly appreciate your time and comments.

Just to get some clarification on the recommendation you’ve described and to make sure we’re on the same page, when you state “short explainer of the test cases and instructions for their use should be added” are you referring to the test suite for the package or the usage of the sample data in the quickstarter?

@wiljnich

@kmongird I am referring to the test suite.

@kmongird

Great, thanks very much.

@fraukewiese

@wiljnich Thank you very much for your review!

@fraukewiese

@kmongird : Please let us know when you have updated the explainer and instructions for the test suite.

@kmongird

Thanks @fraukewiese, will do

@kmongird

@fraukewiese and @wiljnich, the documentation has been updated to include an explainer of the test suite, which can be found here: https://pnnl.github.io/interflow/api_docs.html#test-validation-suite
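
For anyone wanting to exercise the suite locally, a rough sketch using Python's built-in unittest discovery is below; the interflow/tests start directory is an assumption, so please check the repository layout or the linked documentation for the exact instructions:

# Discover and run the package's tests with unittest (verbosity=2 prints one line per test).
import unittest

suite = unittest.defaultTestLoader.discover(start_dir="interflow/tests")
unittest.TextTestRunner(verbosity=2).run(suite)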

@fraukewiese

Thanks @kmongird
@wiljnich : Do you think that explainer is adequate?

@wiljnich

@kmongird thank you for making this update! @fraukewiese, I am pleased. All elements of my review have been satisfied.

@j3r3m1

j3r3m1 commented Apr 29, 2022

Review checklist for @j3r3m1

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/pnnl/interflow?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@kmongird) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) contribute to the software, 2) report issues or problems with the software, and 3) seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@j3r3m1

j3r3m1 commented May 11, 2022

Hi @kmongird, thank you for your contribution. I have started the review today and will add new issues in your repo soon.

@j3r3m1

j3r3m1 commented May 11, 2022

I have finished my review. Once my comments are addressed, this will be fine for me to accept the paper. Nice job @kmongird and coauthors.

@kmongird

@j3r3m1 Thank you for your review! I will begin addressing your comments and comment again here when I'm done.

@fraukewiese

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@j3r3m1

j3r3m1 commented May 30, 2022

Thank you @kmongird and coauthors for responding to my remarks. I am happy for the manuscript to be published as it is. And thank you for your contribution to the community.

@kmongird

kmongird commented Jun 1, 2022

@fraukewiese at your earliest convenience, please let me know what steps I should take next. Thank you again to both reviewers!

@fraukewiese

@kmongird, some minor points regarding the article, please check:

  • Line 21: I think there is something missing in the sentence, maybe "to" before pull?
  • Line 22: Should it be "help to investigate"? Here I am absolutely not sure since I am not a native speaker, so please check; if you think it is correct without "to", then it is fine.
  • Reference List: Greenberg et al 2017: Link not working
  • Reference List: Webber 2017: Please include the DOI. Then the title does not need to be linked

@fraukewiese

At this point could you:

  • Make a tagged release of your software, and list the version tag of the archived version here.
  • Archive the reviewed software in Zenodo or a similar service (e.g., figshare, an institutional repository)
  • Check the archival deposit (e.g., in Zenodo) has the correct metadata. This includes the title (should match the paper title) and author list (make sure the list is correct and people who only made a small fix are not on it). You may also add the authors' ORCID.
  • Please list the DOI of the archived version here.

I can then move forward with accepting the submission.

@kmongird

kmongird commented Jun 7, 2022

@fraukewiese I have updated the link for the Greenberg et al. 2017 reference and provided the DOI for the Webber 2017 reference. Regarding the sentences in lines 21 and 22, both have been double-checked with grammar-checking software and confirmed to be grammatically correct as is, so no changes have been made to the text.

@kmongird

kmongird commented Jun 7, 2022

version tag for tagged release:
v1.0.3

Zenodo DOI:
10.5281/zenodo.6620928

@fraukewiese

@editorialbot set 10.5281/zenodo.6620928 as archive

@editorialbot
Collaborator Author

Done! Archive is now 10.5281/zenodo.6620928

@fraukewiese

@editorialbot set v1.0.3 as version

@editorialbot
Collaborator Author

Done! version is now v1.0.3

@fraukewiese

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@fraukewiese

@kmongird In the latest article proof I cannot see the changes made (Greenberg et al. 2017 link and Webber 2017 DOI) - could you please check where you made the changes? Thank you.

@kmongird

kmongird commented Jun 8, 2022

Hi @fraukewiese,

On lines 103-104, the broken Greenberg et al. 2017 link has been replaced with this working link: https://flowcharts.llnl.gov/report

For the Webber 2017 reference, I removed the title link and included the DOI, which is now shown on line 129: https://doi.org/10.1016/B978-0-12-803237-4.00012-4

@fraukewiese

@editorialbot recommend-accept

@editorialbot
Collaborator Author

Attempting dry run of processing paper acceptance...

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1016/B978-0-12-803237-4.00012-4 is OK
- 10.1021/acs.est.8b00139 is OK
- 10.1007/s11269-013-0331-2 is OK
- 10.1021/acs.est.6b01065 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.25080/Majora-92bf1922-00a is OK

MISSING DOIs

- None

INVALID DOIs

- None

@editorialbot
Collaborator Author

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#3274

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#3274, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

editorialbot added the recommend-accept label on Jun 14, 2022
@arfon
Member

arfon commented Jun 14, 2022

@editorialbot accept

@editorialbot
Collaborator Author

Doing it live! Attempting automated processing of paper acceptance...

@editorialbot
Collaborator Author

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@editorialbot
Collaborator Author

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.04336 joss-papers#3280
  2. Wait a couple of minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.04336
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

editorialbot added the accepted and published labels on Jun 14, 2022
@arfon
Member

arfon commented Jun 14, 2022

@wiljnich, @j3r3m1 – many thanks for your reviews here and to @fraukewiese for editing this submission! JOSS relies upon the volunteer effort of people like you and we simply wouldn't be able to do this without you ✨

@kmongird – your paper is now accepted and published in JOSS ⚡🚀💥

arfon closed this as completed on Jun 14, 2022
@editorialbot
Collaborator Author

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.04336/status.svg)](https://doi.org/10.21105/joss.04336)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.04336">
  <img src="https://joss.theoj.org/papers/10.21105/joss.04336/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.04336/status.svg
   :target: https://doi.org/10.21105/joss.04336

This is how it will look in your documentation:

[DOI status badge]

We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
