
[REVIEW]: MorphoPy: A python package for feature extraction of neural morphologies. #2339

Closed
38 tasks done
whedon opened this issue Jun 15, 2020 · 76 comments
Labels: accepted · published · recommend-accept · review

Comments

@whedon

whedon commented Jun 15, 2020

Submitting author: @Visdoom (Sophie Laturnus)
Repository: https://github.com/berenslab/MorphoPy
Version: v0.71
Editor: @oliviaguest
Reviewer: @emptymalei, @mstimberg
Archive: 10.5281/zenodo.3956644

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/c2c0b27b7eb01516d3808e8dd309543f"><img src="https://joss.theoj.org/papers/c2c0b27b7eb01516d3808e8dd309543f/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/c2c0b27b7eb01516d3808e8dd309543f/status.svg)](https://joss.theoj.org/papers/c2c0b27b7eb01516d3808e8dd309543f)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@emptymalei & @mstimberg, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @oliviaguest know.

Please try and complete your review in the next six weeks

Review checklist for @emptymalei

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@Visdoom) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @mstimberg

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@Visdoom) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@whedon
Author

whedon commented Jun 15, 2020

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @emptymalei, @mstimberg it looks like you're currently assigned to review this paper 🎉.

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, with GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@whedon
Author

whedon commented Jun 15, 2020

Reference check summary:

OK DOIs

- 10.1007/s12021-010-9093-7 is OK
- 10.1371/journal.pone.0182184 is OK
- 10.1007/s12021-017-9341-1 is OK
- 10.1038/s41467-019-12058-z is OK
- 10.1101/2020.02.03.929158 is OK
- 10.1038/nprot.2008.51 is OK
- 10.1016/j.cell.2007.01.040 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon
Author

whedon commented Jun 15, 2020

@oliviaguest
Member

👋 👋 👋

Hi everybody!

@Visdoom, @emptymalei, @mstimberg — welcome to where the review itself is going to be.

Any questions, please ask me. For very code-related issues, please open an issue in the code repository and link to it from here. For all other issues, feedback, or questions for me or the author, please use this issue. Anything you need, check above first as there is a lot of info up there, but also please feel free to ask me. 😸

Thank you all again and I hope this is a nice experience for us all! 🌼 🌸 🌷

@oliviaguest
Member

Can @emptymalei and @mstimberg give me a rough ETA on their reviews please? ☺️

@oliviaguest
Member

Oh, @emptymalei can you link to issues you open in the code repo here, please? Thank you! 😊

So far we have:

👍

@mstimberg

@oliviaguest I hope to do it before the end of the week ⏰

@emptymalei

@oliviaguest
I have tested most of the functions. Will finish this by the end of the week, too.

@emptymalei

It's done on my side. 😸

@oliviaguest
Member

@emptymalei thank you! Have you opened an issue in the code repo for this?

Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?

@emptymalei

emptymalei commented Jun 25, 2020

@oliviaguest

I have a question here.
In fact, the code repo has a nicely illustrated tutorial Jupyter notebook. It is neither a unit test nor automated, but it can serve as a very good manual test notebook.
I didn't check the box only because it is not automated. But based on the following section of the guidelines

OK: Documented manual steps that can be followed to objectively check the expected functionality of the software (e.g., a sample input file to assert behavior)

It is OK.

In this case, shall I create an issue or only provide this as feedback here?

@oliviaguest
Member

If you have a very specific code-related suggestion, then yes, otherwise just here is fine, @emptymalei. 😊

@mstimberg

mstimberg commented Jun 25, 2020

Ok, here's my review, adding to @emptymalei's remarks

I opened a few specific issues in the MorphoPy repository (berenslab/MorphoPy#106, berenslab/MorphoPy#107, berenslab/MorphoPy#108, berenslab/MorphoPy#109)

I agree with @emptymalei that the package structure is surprising and support changing it to a more standard organization.

I also wonder whether the API documentation could be made more accessible somehow? One option would be sphinx+readthedocs, but I see that the authors already have generated HTML documentation in html_docs. Maybe this could be put online in some way (Github pages, for example)?
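
For illustration only, a minimal Sphinx configuration along those lines might look like the sketch below (the morphopy import path and the NumPy-style docstrings are assumptions on my side, not necessarily how the package is actually laid out):

# docs/conf.py -- minimal Sphinx configuration sketch, not the project's actual setup
import os
import sys

# Make the package importable for autodoc; assumes docs/ sits next to the package.
sys.path.insert(0, os.path.abspath(".."))

project = "MorphoPy"
extensions = [
    "sphinx.ext.autodoc",    # pull API documentation from docstrings
    "sphinx.ext.napoleon",   # parse NumPy/Google-style docstrings
    "sphinx.ext.viewcode",   # link rendered pages to highlighted source
]
html_theme = "sphinx_rtd_theme"  # needs the sphinx-rtd-theme package installed

Running sphinx-apidoc -o docs/ morphopy/ once and then make -C docs html (or hooking the repository up to readthedocs) would regenerate the HTML automatically instead of committing generated files to html_docs.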

Regarding the test discussion:

  • Given that this tool computes statistics and can be used to gather these statistics automatically for large amounts of data, I think some kind of automated test suite would be really important. I'm sure that the authors checked the correctness of the statistical measurements extensively, but codifying this into tests would increase the trust of potential users. This would not have to be very complicated: I could imagine an artificially created SWC file where the correct answers to all measurements are known and can therefore be compared directly to the results from the tool. Orthogonal to that would be some kind of regression test (to make sure that future changes to the software do not change the results). One possible "brute-force" solution would be to run the tool on all the files in the data directory and store the CSV result files in the repository as well; one could then easily run a script to check whether the results are still the same (see the sketch after this list).
  • Testing that the plots are correct and do not change with changes in the software would of course be more complicated, and I do not think automating this would be worth the effort. Having a tutorial notebook is a great solution for this part and also acts as useful documentation.
  • The same goes for the comparison to the TMD tool – I'm happy with a manual comparison in a notebook (but see berenslab/MorphoPy#109, "Error when running the persistence_test notebook").
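
Purely as an illustration of the "brute-force" regression idea above, a sketch along the following lines could work (compute_morphometrics, the data/ and expected/ folder names, and the CSV layout are hypothetical placeholders here, not MorphoPy's actual API):

# test_regression.py -- sketch of a regression test for the morphometric statistics
# NOTE: compute_morphometrics and the folder names are placeholders; substitute
# MorphoPy's real entry point and the repository's sample data locations.
import glob
import os

import pandas as pd
import pytest

from morphopy import compute_morphometrics  # hypothetical import

SWC_FILES = sorted(glob.glob(os.path.join("data", "*.swc")))

@pytest.mark.parametrize("swc_path", SWC_FILES)
def test_morphometrics_unchanged(swc_path):
    """Recompute the statistics for each sample SWC and compare to stored results."""
    expected_path = os.path.join(
        "expected", os.path.basename(swc_path).replace(".swc", ".csv")
    )
    expected = pd.read_csv(expected_path, index_col=0)
    result = compute_morphometrics(swc_path)  # assumed to return a pandas DataFrame
    pd.testing.assert_frame_equal(result, expected, check_exact=False)

The artificial SWC file with hand-computed answers would fit the same pattern, just with a hand-written expected CSV instead of one stored from a previous run.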

The software paper itself looks very good to me (apart from the umlaut issue in the author affiliations, but that seems to have been corrected already) – I wonder whether the code examples are a bit too technical, though? Maybe move them to the documentation (if they are not already mentioned there)?

All in all, great work, I think I'm going to use MorphoPy in the future myself!

@Visdoom

Visdoom commented Jun 26, 2020

Hi @oliviaguest , @emptymalei and @mstimberg ,

thank you for your reviewing efforts so far. Your feedback has been very helpful, and your intention to use it in the future, @mstimberg, is a great encouragement, thanks!
The remaining issues you have created via GitHub will be addressed in the coming week. The biggest "issue" is to update the PyPI installation to the new folder structure, but we are on it.

Regarding the automated tests, we hope to provide both the artificial test_swc morphometrics and the "brute-force" solution for checking the validity of future versions.
How is this handled though? Will I just let you know here when we have implemented that?

Best wishes, Sophie

@oliviaguest
Member

oliviaguest commented Jun 26, 2020

@Visdoom Hey Sophie, yes, go off and make any and all changes you fancy and report back here. Ideally, link from here (or highlight the changes however you want) so the reviewers can easily spot them. Does that make sense? ☺️

@Visdoom

Visdoom commented Jun 26, 2020

@oliviaguest yes, that makes sense. Thank you.

@oliviaguest
Member

Oh, and if you need advice from the reviewers, @Visdoom, of course also ask them here! 🌷

@emptymalei

I think I'm going to use MorphoPy in the future myself!

Also talked to my friend who's been doing such statistics by hand. She said this covers most of the statistics she's been looking at and will try it out on the next project.

@philippberens

Let us know if there is anything to add!

@oliviaguest
Member

@whedon generate pdf

@oliviaguest
Member

@openjournals/joss-eics please check this one and I'll email @Visdoom right now to make sure they check this thread carefully.

@oliviaguest
Member

Ah! I just saw the ❤️ react, @Visdoom! Fantastic! 👏

@Visdoom

Visdoom commented Aug 3, 2020

@oliviaguest I noticed a minor issue in the text where it says "Fig. Figure X". Can I still update it? Otherwise it is also not a tragedy.

@oliviaguest
Member

@Visdoom yes, update it and we can fix it. 😺

@Visdoom

Visdoom commented Aug 3, 2020

Great, it is fixed in the paper.md.

@oliviaguest
Member

@whedon generate pdf

@whedon
Author

whedon commented Aug 3, 2020

@oliviaguest
Member

@Visdoom if that looks good, perfect. If it needs more changes, make them and run the command: @whedon generate pdf and lemme know. Does it look OK now? 🌸

@Visdoom

Visdoom commented Aug 3, 2020

Yes! Now it looks perfect. Thank you!

@danielskatz

@whedon accept

@whedon
Author

whedon commented Aug 3, 2020

Attempting dry run of processing paper acceptance...

@whedon
Author

whedon commented Aug 3, 2020

Reference check summary:

OK DOIs

- 10.1007/s12021-010-9093-7 is OK
- 10.1371/journal.pone.0182184 is OK
- 10.1007/s12021-017-9341-1 is OK
- 10.1038/s41467-019-12058-z is OK
- 10.1101/2020.02.03.929158 is OK
- 10.1038/nprot.2008.51 is OK
- 10.1016/j.cell.2007.01.040 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon
Author

whedon commented Aug 3, 2020

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#1611

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#1611, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@danielskatz

@Visdoom - please merge berenslab/MorphoPy#114 before we proceed to acceptance

@Visdoom

Visdoom commented Aug 3, 2020

@danielskatz All done!

@danielskatz

@whedon accept

@whedon
Author

whedon commented Aug 3, 2020

Attempting dry run of processing paper acceptance...

@whedon
Author

whedon commented Aug 3, 2020

Reference check summary:

OK DOIs

- 10.1007/s12021-010-9093-7 is OK
- 10.1371/journal.pone.0182184 is OK
- 10.1007/s12021-017-9341-1 is OK
- 10.1038/s41467-019-12058-z is OK
- 10.1101/2020.02.03.929158 is OK
- 10.1038/nprot.2008.51 is OK
- 10.1016/j.cell.2007.01.040 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon
Author

whedon commented Aug 3, 2020

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#1612

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#1612, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@danielskatz

@whedon accept deposit=true

whedon added the accepted and published labels Aug 3, 2020
@whedon
Author

whedon commented Aug 3, 2020

Doing it live! Attempting automated processing of paper acceptance...

@whedon
Author

whedon commented Aug 3, 2020

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon
Author

whedon commented Aug 3, 2020

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 joss-papers#1613 ("Creating pull request for 10.21105.joss.02339")
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.02339
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@danielskatz

Thanks to @emptymalei & @mstimberg for reviewing, and @oliviaguest for editing!

Congratulations to @Visdoom (Sophie Laturnus) and co-authors!!

@whedon
Author

whedon commented Aug 3, 2020

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02339/status.svg)](https://doi.org/10.21105/joss.02339)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02339">
  <img src="https://joss.theoj.org/papers/10.21105/joss.02339/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02339/status.svg
   :target: https://doi.org/10.21105/joss.02339

This is how it will look in your documentation:

[DOI badge image]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider doing either one (or both) of the following:

@oliviaguest
Member

@emptymalei, @mstimberg thank you so much for all you have done here! 👏

@Visdoom congratulations! 🥳

@emptymalei

@Visdoom Congratulations! 🎉
