
[REVIEW]: SMBLtoODEpy: A software program for converting SBML models into ODE models in Python #1643

Closed
36 tasks done
whedon opened this issue Aug 12, 2019 · 56 comments
Labels
accepted · published (Papers published in JOSS) · recommend-accept (Papers recommended for acceptance in JOSS) · review

Comments

@whedon

whedon commented Aug 12, 2019

Submitting author: @ashleefv (Ashlee N. Ford Versypt)
Repository: https://github.com/SMRuggiero/sbmltoodepy
Version: v1.0.3
Editor: @mgymrek
Reviewer: @SirSharpest, @marouenbg
Archive: 10.5281/zenodo.3441677

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/7180dfddc52a8b3bbf13b58eaa0e6d45"><img src="https://joss.theoj.org/papers/7180dfddc52a8b3bbf13b58eaa0e6d45/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/7180dfddc52a8b3bbf13b58eaa0e6d45/status.svg)](https://joss.theoj.org/papers/7180dfddc52a8b3bbf13b58eaa0e6d45)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@SirSharpest & @marouenbg, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @mgymrek know.

Please try and complete your review in the next two weeks

Review checklist for @SirSharpest

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Version: v1.0.3
  • Authorship: Has the submitting author (@ashleefv) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

Review checklist for @marouenbg

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Version: v1.0.3
  • Authorship: Has the submitting author (@ashleefv) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?
@whedon

whedon commented Aug 12, 2019

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @SirSharpest, @marouenbg it looks like you're currently assigned to review this paper 🎉.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, with GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:

[screenshot: repository watch settings]

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

[screenshot: GitHub notification settings]

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@whedon

whedon commented Aug 12, 2019

Attempting PDF compilation. Reticulating splines etc...

@whedon

whedon commented Aug 12, 2019

@AoifeHughes

@ashleefv I have made my comments as a new issue on the relevant GitHub page: AnabelSMRuggiero/sbmltoodepy#1

Most are minor, and possibly non-issues. However, I feel that the clarifications I mentioned for the manuscript would be important to readers, as would being explicit about which Python versions are supported. Finally, I feel strongly about unit testing being done to software-engineering standards, but I understand that this has at least been addressed using the authors' own method.

@mgymrek if there is anything else, prior to the author response, please let me know and I'll promptly proceed with it.

@mgymrek

mgymrek commented Sep 1, 2019

Thanks @SirSharpest and @marouenbg. These comments look reasonable.
@ashleefv please review the comments from the reviewers and post responses to their comments here.

@ashleefv

ashleefv commented Sep 1, 2019 via email

@ashleefv

ashleefv commented Sep 8, 2019

Responses to Reviewer 1's comments (Issue 1):

General Comments

We now specify that the supported versions of Python are 3.5, 3.6, and 3.7.

Added a comment to README.md about installing from source.

Testing

PyTest could be a good fit for our project, but we expect to include it if we have a major update in the future.
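For reference, such a test could be a minimal sketch like the following; the module name (mymodel), class name (MyGeneratedModel), RunSimulation method, species dictionary, and reference value are all illustrative placeholders rather than the package's confirmed interface:

import pytest

from mymodel import MyGeneratedModel  # hypothetical module emitted by SBMLtoODEpy

def test_species_concentration_after_one_time_unit():
    model = MyGeneratedModel()
    model.RunSimulation(1.0)  # advance the model by one time unit (placeholder method)
    # Compare against a reference value computed independently (e.g., in COPASI).
    assert model.s["S1"].concentration == pytest.approx(0.042, rel=1e-6)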

Software checked on:

We now specify that the supported versions of Python are 3.5, 3.6, and 3.7.

Software style

Removed all of the "import *" statements. While the code on this GitHub repo has been updated, we plan on updating the package on PyPI once all reviewer comments have been addressed.

Running the provided Tutorial.md

We addressed the comments by

  • clarifying the case in which cloning the repo and pip install make sense together (use on Google Colab)
  • fixing the pip install command
  • hosting a copy of SBMLtoODEpy.ipynb in the repo

Community guidelines

We do not see the scope of the project as including pull requests or feature suggestions from others.

Paper Clarifications

In our paper, we compare our work to two other pieces of software.

COPASI: The comparison was intended simply to validate the results of the models generated by SBMLtoODEpy. COPASI uses ODEPACK for its time-course calculations, which is the same library that SciPy's odeint function uses. If the calculations agree, that suggests SBMLtoODEpy has generated the correct equations. Furthermore, the targeted niche for SBMLtoODEpy is not the same as COPASI's.

Systems Biology Format Converter (SBFC): SBFC is a better piece of software to compare SBMLtoODEpy to. SBFC converts SBML models into various programming and modeling languages, but notably excludes Python. This gap in SBFC's functionality is specifically the target niche of SBMLtoODEpy. Additionally, SBMLtoODEpy is designed to generate code that can be incorporated into other Python projects. The Octave/MATLAB scripts generated by SBFC implement the model as scripts that neither accept arguments nor return any values. SBMLtoODEpy instead generates code that implements the model as a class, with most methods and members intended to be accessible to other code (a minimal sketch of this usage appears at the end of this comment).

The paper has been updated to better convey the scope and intent of the comparisons.
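To make the intended reuse concrete, here is a minimal sketch of driving such a generated class with SciPy, assuming purely illustrative names (mymodel, MyGeneratedModel, initial_concentrations, rates) that are not the package's verified API:

import numpy as np
from scipy.integrate import odeint

from mymodel import MyGeneratedModel  # hypothetical module emitted by SBMLtoODEpy

model = MyGeneratedModel()
y0 = model.initial_concentrations()  # placeholder accessor for the initial state vector
t = np.linspace(0.0, 10.0, 101)      # time points spanning 10 time units

# odeint expects a callable f(y, t) returning dy/dt; here we assume the generated
# class exposes its right-hand side as a bound method.
trajectory = odeint(model.rates, y0, t)

print(trajectory[-1])  # state of the system at the final time point

Because the model lives in an ordinary Python class, the same object can also be passed to parameter-estimation or plotting code in the user's own scripts.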

@ashleefv

ashleefv commented Sep 8, 2019

Responses to Reviewer 2's comments

Comment 1 (Issue 2)

Since we expect users to take and modify or use the code generated by SBMLtoODEpy in ways we can't predict, the burden of applying SED-ML or KiSAO will have to be on the user. Additionally, most SBML components are stored as entries in dictionaries that are members of the class implementing the model. Currently, rules and initial assignments are not implemented this way, and annotations stored in the SBML model are not retained. We plan to address these in a future update.
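As a rough illustration of that layout (every identifier below, mymodel, MyGeneratedModel, model.s, model.p, "S1", "k1", .concentration, .value, is an assumption made for illustration, not verified API), user code might read or override individual components like this:

from mymodel import MyGeneratedModel  # hypothetical module emitted by SBMLtoODEpy

model = MyGeneratedModel()

# Species and parameters kept in dictionaries keyed by their SBML ids can be
# inspected or overwritten before running a simulation.
print(model.s["S1"].concentration)  # current concentration of species S1
model.p["k1"].value = 0.5           # overwrite a rate constant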

We updated the paper to expand on the comparison to COPASI. My comment to reviewer #1 addresses the comparison to COPASI and Systems Biology Format Converter in more depth.

Comment 2 (Issue 3)

We looked into the suggestion and are currently not interested in using Travis-CI. Travis-CI does not play well with PyPI accounts that have non-alphanumeric characters in their passwords. Additionally, any updates to SBMLtoODEpy will change the code generated by it and may require different interfacing with user code. Because of this, we only want a new version of our code pushed when we are absolutely certain it's ready for public use. Building and pushing a new version of our code to PyPI is already a simple process with setuptools and twine.

We did set up a Code Ocean capsule and tested our code on Ubuntu 16.04 and 18.04. We're not planning on publishing the capsule, as we do not believe Code Ocean to be a good fit for this project. However, we are glad the reviewer provided this suggestion because it may be a good fit for some of our other projects that involve simulation experiments.

Comment 3 (Issue 4)

We have added both the README and tutorial to our documentation page. While we see the value in a troubleshooting section of the documentation, we think we need user feedback before creating it. If future edits are made while finishing addressing the reviewer feedback, we plan on adding a changelog section to the documentation.

@ashleefv

ashleefv commented Sep 8, 2019

@whedon generate pdf

@whedon

whedon commented Sep 8, 2019

Attempting PDF compilation. Reticulating splines etc...

@whedon

whedon commented Sep 8, 2019

@ashleefv

ashleefv commented Sep 8, 2019

@whedon generate pdf

@whedon

whedon commented Sep 8, 2019

Attempting PDF compilation. Reticulating splines etc...

@whedon

whedon commented Sep 8, 2019

@ashleefv

ashleefv commented Sep 8, 2019

Approved the new article proof. We have concluded our responses to the reviewers' comments.

@mgymrek

mgymrek commented Sep 8, 2019

Thanks @ashleefv for addressing these comments.
@SirSharpest, @marouenbg, could you review these edits and update the checklists above accordingly?

Also, I noticed you mentioned for community guidelines that you do not envision contributions from others to the project. However, this is an important aspect of open-source projects. Even if you do not envision new features, it's possible a user could find a bug and want to contribute a fix, for example. To address this point you could simply add a blurb to the README suggesting users contact you or submit a pull request if they would like to contribute a change.

@ashleefv

ashleefv commented Sep 8, 2019 via email

@marouenbg

I just checked all the boxes on my side. The authors addressed the comments either through new commits or by providing reasonable explanations for sticking to their initial choices. Great job!

@mgymrek

mgymrek commented Sep 10, 2019

Perfect, thanks @marouenbg and @SirSharpest for the reviews!

@mgymrek

mgymrek commented Sep 10, 2019

@whedon generate pdf

@whedon

whedon commented Sep 10, 2019

Attempting PDF compilation. Reticulating splines etc...

@whedon

whedon commented Sep 18, 2019

Attempting PDF compilation. Reticulating splines etc...

@whedon

whedon commented Sep 18, 2019

@ashleefv

@whedon check references

@whedon

whedon commented Sep 18, 2019

Attempting to check references...

@whedon

whedon commented Sep 18, 2019


OK DOIs

- 10.1093/bioinformatics/btg015 is OK
- 10.1093/nar/gku1181 is OK
- 10.1042/bst0311472 is OK
- 10.1093/nar/gkx1023 is OK
- 10.1093/bioinformatics/btn051 is OK
- 10.1109/MCSE.2011.37 is OK
- 10.1109/MCSE.2007.58 is OK
- 10.1186/s12859-016-1000-2 is OK
- 10.1038/msb.2009.19 is OK
- 10.1146/annurev.ph.34.030172.000305 is OK
- 10.1371/journal.pcbi.1003371 is OK
- 10.1111/iep.12062 is OK
- 10.1007/s11538-005-9022-3 is OK
- 10.1038/msb.2011.22 is OK
- 10.1093/bioinformatics/btl485 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@ashleefv

@mgymrek We have updated the Hindmarsh URL, fixed the noted typo, finalized the revised PyPI release, and created a new Zenodo DOI for this updated release: https://zenodo.org/record/3441677#.XYLMoyhKjZs

@mgymrek

mgymrek commented Sep 20, 2019

@whedon set v1.0.3 as version

@whedon

whedon commented Sep 20, 2019

OK. v1.0.3 is the version.

@mgymrek

mgymrek commented Sep 20, 2019

@whedon set 10.5281/zenodo.3441677 as archive

@whedon

whedon commented Sep 20, 2019

OK. 10.5281/zenodo.3441677 is the archive.

@mgymrek

mgymrek commented Sep 20, 2019

@openjournals/joss-eics We are ready to accept this submission!

@kthyng

kthyng commented Sep 20, 2019

@whedon accept

@whedon

whedon commented Sep 20, 2019

Attempting dry run of processing paper acceptance...

@whedon

whedon commented Sep 20, 2019

Check final proof 👉 openjournals/joss-papers#974

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#974, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@kthyng

kthyng commented Sep 20, 2019

@whedon accept deposit=true

@whedon

whedon commented Sep 20, 2019

Doing it live! Attempting automated processing of paper acceptance...

@whedon

whedon commented Sep 20, 2019

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon

whedon commented Sep 20, 2019

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.01643 joss-papers#975
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.01643
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@kthyng

kthyng commented Sep 20, 2019

Woohoo!! Congrats @ashleefv on your publication and thank you to reviewers @SirSharpest and @marouenbg!!

kthyng closed this as completed Sep 20, 2019
@whedon

whedon commented Sep 20, 2019

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.01643/status.svg)](https://doi.org/10.21105/joss.01643)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.01643">
  <img src="https://joss.theoj.org/papers/10.21105/joss.01643/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.01643/status.svg
   :target: https://doi.org/10.21105/joss.01643

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider doing either one (or both) of the following:

whedon added the published (Papers published in JOSS) and recommend-accept (Papers recommended for acceptance in JOSS) labels on Mar 2, 2020
@ashleefv

ashleefv commented Sep 3, 2020

I recently got the feedback that our acronym in the first word of the title reversed the B & M. It should be SBMLtoODEpy. Additionally, this accidental reversal of the letters appeared one other time in the document. Can this be corrected?

@ashleefv

ashleefv commented Sep 3, 2020

@mgymrek We aren't sure if our title issue and internal typo can be updated. We accidentally switched the letters B & M in the title of our paper and in one other place (I'm fine if that internal typo is not corrected, but the title would be best to fix). I never caught the error because it blended together for me visually. Someone just commented on our GitHub repository pointing out the issue.

@arfon

arfon commented Sep 4, 2020

@ashleefv - please make the necessary changes to your paper.md file and let me know when you've done this. I can then update the paper.

@ashleefv

ashleefv commented Sep 4, 2020

@arfon I have updated paper.md in the repository: https://github.com/SMRuggiero/sbmltoodepy

@arfon

arfon commented Sep 6, 2020

@ashleefv - this was fixed in openjournals/joss-papers@5f08ae3, although the paper might take a few hours to show as updated on the JOSS website as there's caching in place.
