
[REVIEW]: cerebra: A tool for fast and accurate summarizing of variant calling format (VCF) files #2432

whedon opened this issue Jul 3, 2020 · 66 comments

@whedon

whedon commented Jul 3, 2020

Submitting author: @lincoln-harris (Lincoln Harris)
Repository: https://github.com/czbiohub/cerebra
Version: v1.2.0
Editor: @lpantano
Reviewers: @betteridiot, @afrubin
Archive: 10.5281/zenodo.4050557

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/44041ee73f22ed846090242bdcda616a"><img src="https://joss.theoj.org/papers/44041ee73f22ed846090242bdcda616a/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/44041ee73f22ed846090242bdcda616a/status.svg)](https://joss.theoj.org/papers/44041ee73f22ed846090242bdcda616a)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@betteridiot & @afrubin, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @lpantano know.

Please try to complete your review within the next six weeks.

Review checklist for @betteridiot

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@lincoln-harris) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @afrubin

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@lincoln-harris) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@whedon
Author

whedon commented Jul 3, 2020

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @betteridiot, @afrubin it looks like you're currently assigned to review this paper 🎉.

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, under GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:

[screenshot: repository watch settings]

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

[screenshot: notification settings]

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@whedon
Author

whedon commented Jul 3, 2020

Reference check summary:

OK DOIs

- None

MISSING DOIs

- https://doi.org/10.1093/bioinformatics/btl647 may be missing for title: Nested Containment List (NCList): a new algorithm for accelerating interval query of genome alignment and interval databases
- https://doi.org/10.1093/bioinformatics/btp324 may be missing for title: Fast and accurate short read alignment with Burrows-Wheeler Transform
- https://doi.org/10.1016/j.cell.2017.09.004 may be missing for title: Single-Cell Analysis of Human Pancreas Reveals Transcriptional Signatures of Aging and Somatic Mutation Patterns
- https://doi.org/10.1038/s41587-019-0074-6 may be missing for title: An open resource for accurately benchmarking small variant and reference calls
- https://doi.org/10.1038/nbt.2835 may be missing for title: Integrating human sequence data sets provides a resource of benchmark SNP and indel genotype calls
- https://doi.org/10.1101/201178 may be missing for title: Scaling accurate genetic variant discovery to tens of thousands of samples
- https://doi.org/10.1186/s13059-017-1248-5 may be missing for title: BRIE: transcriptome-wide splicing quantification in single cells

INVALID DOIs

- None


@lpantano

lpantano commented Jul 7, 2020

Hi @betteridiot, @afrubin! This is the review issue; please accept the invitation if you haven't already, so you can start the review here.

@lincoln-harris, can you work on adding DOIs to your references?

Thanks all!

@lincoln-harris

Sure! I can do that.

@afrubin

afrubin commented Jul 11, 2020

It looks like the references aren't formatting properly, possibly due to missing commas in the author names in the .bib file. @lincoln-harris would you please have a look?
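For reference, BibTeX tends to parse author lists most reliably when each name is written in "Last, First" form and names are joined with "and"; a purely illustrative fragment (the names below are placeholders, not the actual author list):

author = {Doe, Jane and Smith, John A. and van der Berg, Chris}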

@afrubin

afrubin commented Jul 12, 2020

Review

This certainly looks like a useful and performant tool for transforming VCF files into a format that is easier to work with and enables researchers to make biological inferences. I've opened some issues (referenced above) but thought it better to put high-level feedback and comments on the paper in this thread.

I have not run the software yet, but plan to follow the updated installation instructions and run some test data after czbiohub-sf/cerebra#83 is addressed.

Tests

Test coverage is low (81%) and excludes quite a few cases that seem like they might come up when used on diverse data. Error messages/logging may also need to be improved. For example, this section is not covered by tests and includes output to stdout with limited context:
https://github.com/czbiohub/cerebra/blob/3553f707290e57440ca9c2ab617667ab7cc4323a/cerebra/find_peptide_variants.py#L65-L69

Several failure conditions (e.g. https://github.com/czbiohub/cerebra/blob/3553f707290e57440ca9c2ab617667ab7cc4323a/cerebra/germline_filter.py#L114-L115) are not covered by tests, although adding test cases for these ought to be trivial. The multiprocessor use case for germline-filter is also not tested.
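As a purely illustrative sketch (the helper below is a toy stand-in, not cerebra's actual API), covering a failure branch such as a missing input file usually takes only a few lines of pytest:

import pytest

def load_vcf_header(path):
    """Toy stand-in for a file-loading helper; raises FileNotFoundError on missing input."""
    with open(path) as handle:
        return handle.readline()

def test_load_vcf_header_missing_file(tmp_path):
    # Exercise the failure branch directly; no real VCF data is needed.
    with pytest.raises(FileNotFoundError):
        load_vcf_header(tmp_path / "does_not_exist.vcf")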

Documentation

Most usable documentation is in the README. This would be easier to navigate if it were split into multiple files, but another option is to focus on improving the README's content and structure and to drop the docs/ directory altogether.

Many parts of the documentation lack updates and consistency. For example, the instructions in docs/installation.rst are incomplete and differ from what's in the README. docs/authors.rst references a top-level AUTHORS.rst, which doesn't exist (although there is an AUTHORS.md that doesn't include all the contributors). There are similar issues affecting other files.

The repository is in need of a "polish pass" to clean up these kinds of issues and remove unnecessary files (such as .editorconfig in the top level).

Paper

The motivation section should make it clear that an intended use case is single-cell RNA-seq data. This isn't introduced until the find-peptide-variants section, despite being the research example cited by the authors. The motivation section would also benefit from being written in a more scholarly way. The likely audience of this package is bioinformaticians working with VCFs and single-cell data, so the "typo" metaphor is unnecessary.

The authors also use the term "functional predictions" to mean predicted amino-acid changes. This package does not attempt to predict the "functional consequences of the variant", which is a much more ambitious problem. The authors should avoid using the term "functional" throughout the text and focus on what the tool is actually doing - inferring amino acid consequences given variant calls and gene annotations.

There are also numerous missing citations, across packages used (ncls is not cited despite having citation information in its GitHub README), important genomic formats (VCF and HGVS are not cited), and databases (COSMIC is not cited). I should emphasize that this is a partial list of missing citations.

Regarding the use of databases, the authors make the nonsensical claim that "variants that are not found in the COSMIC database [are] less likely to be pathogenic". While there are good reasons to filter for known variants (e.g. focusing on clinical actionability), this isn't one. COSMIC includes a tiny fraction of pathogenic variants, most of which are rare.

For variant discovery, users may want to filter for variants that are not found in COSMIC, and the authors should consider implementing this as an option. This would be especially useful if support for variant databases like gnomAD is added in the future.
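As a rough sketch of what such an option could look like (the 'hgvs' column and cosmic_ids set are hypothetical names, not part of cerebra's current interface):

import pandas as pd

def split_by_cosmic(variants: pd.DataFrame, cosmic_ids: set) -> tuple:
    """Split a variant table into COSMIC-known and novel subsets.

    Assumes an 'hgvs' column in `variants` and a set of HGVS strings taken
    from a COSMIC export; both names are illustrative.
    """
    known = variants["hgvs"].isin(cosmic_ids)
    return variants[known], variants[~known]

Variant-discovery users could then work exclusively with the second table.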

Also, the paper mentions dbSNP filtering but it doesn't look like this is implemented based on the documentation.

One small nomenclature issue is that the transcripts in Fig 2 are referred to as "spliceforms" in the text but when alternative start sites are included (like t3) the more general term "isoforms" should be used.

Summary

This is a useful and interesting tool that would be appropriate for JOSS but additional work is needed across the project before it fulfills the criteria.

@lincoln-harris

Thanks for the comments; these are currently being addressed in czbiohub-sf/cerebra#84.

@betteridiot

@lpantano For some reason, GitHub is not allowing me to check the boxes (like in previous reviews). I may need another invitation.

@lpantano

@whedon re-invite @betteridiot as reviewer

@whedon
Author

whedon commented Jul 16, 2020

OK, the reviewer has been re-invited.

@betteridiot please accept the invite by clicking this link: https://github.com/openjournals/joss-reviews/invitations

@betteridiot

Thank you. Apparently it was just a browser extension that was causing the problem.

@betteridiot

For transparency's sake, I am still actively working on this review. However, I have a thesis committee meeting this week and will not be able to get back to the review process until after that. Sorry for the delay.

@lincoln-harris

The majority of reviewer comments have been addressed with the latest release.
Looking forward to additional feedback!

@afrubin

afrubin commented Jul 31, 2020

@lincoln-harris I have a few additional comments on the writing in the germline-filter section of the paper and the README.

A more typical phrasing from cancer genomics would be "tumor vs. normal" rather than "tumor/pathogenic vs control."

The text says that germline-filter can be used "so as to not bias the results by including non-pathogenic variants." While I understand the motivation for this, you can't say that germline variants are non-pathogenic - they may well be strongly pathogenic for another disease!

Instead of "control and the experimental tissue" the text should stick with the word "sample" (and tumor/normal) for clarity.

@lpantano

Hi @lincoln-harris,

Any update on @afrubin's latest comment?

@betteridiot, any update for this review?

Thanks!

@whedon
Author

whedon commented Sep 29, 2020

OK. 10.5281/zenodo.4050557 is the archive.

@lpantano

@whedon accept

@whedon
Author

whedon commented Sep 29, 2020

Attempting dry run of processing paper acceptance...

@whedon whedon added the recommend-accept label Sep 29, 2020
@whedon
Author

whedon commented Sep 29, 2020

PDF failed to compile for issue #2432 with the following error:

/app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/bibtex_parser.rb:77:in `doi_citation': undefined method `encode' for nil:NilClass (NoMethodError)
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/bibtex_parser.rb:64:in `make_citation'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/bibtex_parser.rb:50:in `block in generate_citations'
  from /app/vendor/bundle/ruby/2.4.0/gems/bibtex-ruby-5.1.4/lib/bibtex/bibliography.rb:149:in `each'
  from /app/vendor/bundle/ruby/2.4.0/gems/bibtex-ruby-5.1.4/lib/bibtex/bibliography.rb:149:in `each'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/bibtex_parser.rb:43:in `generate_citations'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/compilers.rb:246:in `crossref_from_markdown'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/compilers.rb:21:in `generate_crossref'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/processor.rb:95:in `compile'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/bin/whedon:82:in `compile'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/command.rb:27:in `run'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/invocation.rb:126:in `invoke_command'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor.rb:387:in `dispatch'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/base.rb:466:in `start'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/bin/whedon:119:in `<top (required)>'
  from /app/vendor/bundle/ruby/2.4.0/bin/whedon:23:in `load'
  from /app/vendor/bundle/ruby/2.4.0/bin/whedon:23:in `<main>'

@lpantano

Oops, @lincoln-harris, do you think this is related to your paper? Did you change something since the last time it compiled here?

@lincoln-harris

lincoln-harris commented Sep 29, 2020

I don't think so? paper.md hasn't been changed since Aug 26, and it seemed to compile fine on Sept 15th

It seems to be compiling fine with the whedon paper preview tool

@Kevin-Mattheus-Moerman
Member

@whedon accept

@whedon
Author

whedon commented Oct 1, 2020

Attempting dry run of processing paper acceptance...

@whedon
Author

whedon commented Oct 1, 2020

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1093/bioinformatics/btl647 is OK
- 10.1093/bioinformatics/btp324 is OK
- 10.1093/nar/gky1015 is OK
- 10.1016/j.cell.2017.09.004 is OK
- 10.1093/nar/gkz966 is OK
- 10.1038/s41587-019-0074-6 is OK
- 10.1038/nbt.2835 is OK
- 10.1101/201178 is OK
- 10.1002/humu.22981 is OK
- 10.1186/s13059-017-1248-5 is OK
- 10.1016/j.cell.2020.07.017 is OK
- 10.7287/peerj.preprints.970v1 is OK
- 10.1093/bioinformatics/bts635 is OK
- 10.21105/joss.00085 is OK

MISSING DOIs

- 10.25080/majora-ebaa42b7-00d may be a valid DOI for title: Building a framework for predictive science

INVALID DOIs

- arXiv:1207.3907v2 is INVALID

@whedon
Author

whedon commented Oct 1, 2020

PDF failed to compile for issue #2432 with the following error:

/app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/bibtex_parser.rb:77:in `doi_citation': undefined method `encode' for nil:NilClass (NoMethodError)
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/bibtex_parser.rb:64:in `make_citation'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/bibtex_parser.rb:50:in `block in generate_citations'
  from /app/vendor/bundle/ruby/2.4.0/gems/bibtex-ruby-5.1.4/lib/bibtex/bibliography.rb:149:in `each'
  from /app/vendor/bundle/ruby/2.4.0/gems/bibtex-ruby-5.1.4/lib/bibtex/bibliography.rb:149:in `each'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/bibtex_parser.rb:43:in `generate_citations'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/compilers.rb:246:in `crossref_from_markdown'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/compilers.rb:21:in `generate_crossref'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/lib/whedon/processor.rb:95:in `compile'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/bin/whedon:82:in `compile'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/command.rb:27:in `run'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/invocation.rb:126:in `invoke_command'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor.rb:387:in `dispatch'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/base.rb:466:in `start'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-364ded062842/bin/whedon:119:in `<top (required)>'
  from /app/vendor/bundle/ruby/2.4.0/bin/whedon:23:in `load'
  from /app/vendor/bundle/ruby/2.4.0/bin/whedon:23:in `<main>'

@Kevin-Mattheus-Moerman
Member

Kevin-Mattheus-Moerman commented Oct 1, 2020

@lincoln-harris I am helping to process the acceptance of this work in JOSS. Can you work on the points below?

  • You have one invalid DOI ☝️; this is because the entry points to an arXiv preprint rather than a valid DOI. Can you change the doi entry in the bib file to be of the url type instead? I believe this involves replacing (an illustrative full entry follows after this list):
 doi={arXiv:1207.3907v2}

with

 url={https://arxiv.org/abs/1207.3907}
  • Please add a country to your affiliation(s) (please do not use acronyms for countries)

  • The hyperlink to ENSEMBL in "Protein variants are converted to ENSEMBL protein IDs, ..." appears not to work; please check and update.
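For illustration, the corrected entry could end up looking roughly like the following (the entry key, title, and author fields are placeholders; the point is simply the url field):

@misc{arxiv_preprint,
  title  = {...},
  author = {...},
  year   = {2012},
  url    = {https://arxiv.org/abs/1207.3907}
}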

@Kevin-Mattheus-Moerman
Member

@openjournals/dev I identified the invalid DOI but am unsure if this is the reason the paper does not compile for @whedon accept. Can you confirm what the issue is here? Thanks!

@lincoln-harris

@Kevin-Mattheus-Moerman done, hopefully that takes care of it.

@Kevin-Mattheus-Moerman
Member

@whedon accept

@whedon
Author

whedon commented Oct 1, 2020

Attempting dry run of processing paper acceptance...

@whedon
Author

whedon commented Oct 1, 2020

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#1770

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#1770, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@Kevin-Mattheus-Moerman
Member

@whedon accept deposit=true

@whedon
Author

whedon commented Oct 1, 2020

Doing it live! Attempting automated processing of paper acceptance...

@whedon whedon added the accepted and published labels Oct 1, 2020
@whedon
Author

whedon commented Oct 1, 2020

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon
Author

whedon commented Oct 1, 2020

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.02432 joss-papers#1771
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.02432
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@whedon
Author

whedon commented Oct 1, 2020

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02432/status.svg)](https://doi.org/10.21105/joss.02432)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02432">
  <img src="https://joss.theoj.org/papers/10.21105/joss.02432/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02432/status.svg
   :target: https://doi.org/10.21105/joss.02432

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:

@Kevin-Mattheus-Moerman
Member

Congratulations @lincoln-harris!

Thanks for your review efforts @betteridiot, @afrubin, also thank you @lpantano for editing!

@lincoln-harris

Awesome!! Thanks @betteridiot, @afrubin and @lpantano!! I learned a ton through the review process
