
[REVIEW]: PySensors: A Python Package for Sparse Sensor Placement #2828

Closed · 40 tasks done
whedon opened this issue Nov 9, 2020 · 77 comments

Labels: accepted · published (Papers published in JOSS) · Python · recommend-accept (Papers recommended for acceptance in JOSS) · review

whedon commented Nov 9, 2020

Submitting author: @briandesilva (Brian de Silva)
Repository: https://github.com/dynamicslab/pysensors/
Version: v0.3.3
Editor: @pdebuyl
Reviewer: @jordanperr, @tuelwer
Archive: 10.5281/zenodo.4542530

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/f49d5ede5060291a29f76274adde9a65"><img src="https://joss.theoj.org/papers/f49d5ede5060291a29f76274adde9a65/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/f49d5ede5060291a29f76274adde9a65/status.svg)](https://joss.theoj.org/papers/f49d5ede5060291a29f76274adde9a65)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@jordanperr & @tuelwer, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @pdebuyl know.

Please start on your review when you are able, and be sure to complete it within the next six weeks, at the very latest.

Review checklist for @jordanperr

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@briandesilva) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @tuelwer

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@briandesilva) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

whedon commented Nov 9, 2020

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @jordanperr, @tuelwer it looks like you're currently assigned to review this paper 🎉.


⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews
  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf


whedon commented Nov 9, 2020

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈


whedon commented Nov 9, 2020

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1109/JSEN.2018.2887044 is OK
- 10.1109/mcs.2018.2810460 is OK
- 10.1137/15M1036713 is OK
- 10.1017/jfm.2011.195 is OK
- 10.2514/6.2004-2415 is OK
- 10.1073/pnas.1517384113 is OK
- 10.1109/access.2018.2886528 is OK
- 10.1109/access.2020.3023625 is OK
- 10.1126/science.1165893 is OK
- 10.1103/physrevmaterials.2.083802 is OK
- 10.1111/j.2517-6161.1996.tb02080.x is OK
- 10.5281/zenodo.1173754 is OK
- 10.1364/oe.24.030433 is OK
- 10.1063/1.5066099 is OK
- 10.1063/1.4977057 is OK
- 10.1016/j.ymssp.2018.08.033 is OK
- 10.1126/sciadv.1602614 is OK
- 10.1098/rspa.2016.0446 is OK
- 10.1137/16m1086637 is OK
- 10.1137/18m116798x is OK
- 10.1017/jfm.2017.823 is OK
- 10.1063/1.5018409 is OK
- 10.1016/j.ifacol.2016.10.249 is OK
- 10.1103/physreve.96.023302 is OK
- 10.1016/j.jcp.2018.10.045 is OK
- 10.1098/rspa.2018.0335 is OK
- 10.1016/j.jcp.2019.07.049 is OK
- 10.1007/s00162-020-00536-w is OK
- 10.1103/physreve.101.010203 is OK
- 10.1115/1.4043148 is OK
- 10.1016/j.ocemod.2009.01.001 is OK
- 10.1109/cdc.2014.7040017 is OK
- 10.1017/jfm.2017.137 is OK
- 10.1073/pnas.1808909115 is OK
- 10.1016/j.jmsy.2018.01.011 is OK
- 10.1017/jfm.2018.147 is OK
- 10.1017/9781108380690 is OK
- 10.1007/s00162-020-00520-4 is OK
- 10.1002/cpa.20124 is OK
- 10.1016/0167-7152(84)90020-8 is OK
- 10.1016/j.crma.2004.08.006 is OK
- 10.1162/0899766053723032 is OK
- 10.1109/TIT.2006.871582 is OK
- 10.1109/tit.2006.885507 is OK
- 10.1109/tit.2005.862083 is OK
- 10.1109/MSP.2007.4286571 is OK
- 10.1109/tit.2009.2034811 is OK
- 10.1016/j.acha.2010.10.002 is OK
- 10.1145/1879141.1879192 is OK
- 10.1137/090766498 is OK
- 10.1111/j.1467-9868.2011.00783.x is OK
- 10.1137/110822724 is OK
- 10.1137/15M1019271 is OK
- 10.1109/tsipn.2016.2614903 is OK
- 10.1109/sam.2016.7569707 is OK
- 10.1137/16m1081270 is OK

MISSING DOIs

- 10.2172/1405271 may be a valid DOI for title: Sensor placement optimization using Chama

INVALID DOIs

- 10.5555/1953048.2078195 is INVALID


pdebuyl commented Nov 9, 2020

@jordanperr @tuelwer, make sure to accept the invitation to the reviewers group and to have a look at the reviewer guidelines linked at the top of this review page.

The review process will happen in this issue page, so questions to the author or to me can be added as comments here.


tuelwer commented Nov 11, 2020

@pdebuyl @briandesilva The authors propose the Python library PySensors, which enables efficient data-driven sensor placement. The toolbox provides methods for reconstruction and classification based on sensor data. The main contribution of the paper is the implementation of the sparse sensor placement optimization algorithms for reconstruction (SSPOR) and classification (SSPOC). Besides this, the toolbox provides useful bindings to existing methods. The API is designed in the style of scikit-learn, which makes the toolbox very easy to use!
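
To illustrate that scikit-learn-style workflow, here is a minimal sketch (the constructor keyword and accessor below are assumptions for illustration, not quoted from the package documentation):

Python:

    import numpy as np
    from pysensors.reconstruction import SSPOR

    # Hypothetical training data: each row is one flattened snapshot of
    # the field to be reconstructed from a few point measurements.
    X = np.random.randn(100, 64)

    model = SSPOR(n_sensors=10)  # assumed keyword: number of sensors to select
    model.fit(X)                 # scikit-learn-style fit on snapshot data

    sensors = model.get_selected_sensors()  # assumed accessor for the chosen indices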

The paper is well-written and gives a good overview of related work. However, I would have liked a more formal definition of the problem setting that is solved by the PySensors toolbox. The references seem to be complete. The scikit-learn JMLR paper indeed has no DOI stated on the journal's website. There is a missing DOI for Chama, which whedon found.

The toolbox itself is well-documented. I especially like the example notebooks, which are nicely written and give a good overview of the features of the toolbox. I was able to reproduce the results of the examples locally on my notebook as well as on Binder. However, by default, Binder is missing matplotlib, seaborn and pandas. Can this be configured? Installation of the toolbox on macOS, Debian and Ubuntu through pip install . and pip install python-sensors worked without problems. All tests passed with some warnings. There were some minor problems with the reconstruction example in the README.rst, which are described in this issue.

Regarding the community guidelines: I cannot see a clear statement on how third parties could seek support.

I congratulate the authors on their great toolbox! Overall, I recommend accepting this submission into JOSS.

Minor remarks:

  • The last reference in the README.rst is printed in bold. Is this done on purpose?
  • In the example notebook pysensors_overview.ipynb two different UserWarnings are thrown.
  • Some example notebooks require additional packages like seaborn or netCDF4. It would be nice to have a requirements-examples.txt to install all those packages in a single run (see the sketch after this list).
  • The figure in the reconstruction example of the docs is missing.
  • "the number of possible placements grows combinatorially" (first paragraph in paper): "combinatorially" -> "exponentially"

@briandesilva

Thanks, @tuelwer! This is all very useful feedback. We'll work to update the paper and package accordingly.


pdebuyl commented Nov 12, 2020

Thank you @tuelwer for this efficient review!

@briandesilva

@tuelwer, could you clarify which figure you're referring to here?

Figure in the reconstruction example of the docs is missing.

I believe I've addressed the other issues you raised, apart from the Binder issue and the "more formal definition of the problem setting that is solved by the PySensors toolbox." I experimented with different solutions for specifying Binder dependencies, but didn't come up with anything that worked. I'll have to look into both items further.


tuelwer commented Nov 13, 2020

@briandesilva What I meant was this example. I was indeed somewhat imprecise, sorry about that!


@briandesilva

Okay, I've fixed the Binder issue and the missing image. @tuelwer, with regard to your comment

I would have liked a more formal definition of the problem setting that is solved by the PySensors toolbox

there are different objective functions that are approximately optimized by PySensors classes, depending on the problem type. We hesitated to include them in our JOSS submission based on our understanding that the paper is meant to be aimed at a general audience. What do you think about the idea of adding the objective functions to their respective Jupyter notebooks?


tuelwer commented Nov 19, 2020

@briandesilva, thanks! Except for the sea surface temperature example (FTP connection problem), all notebooks now run smoothly on Binder!

Regarding the summary: I agree that the paper should be aimed at a general audience (as is also required by the JOSS submission guidelines), and I agree that stating the loss functions in the paper is probably too much. However, maybe you could consider elaborating on the data-driven aspect of your toolbox, which you briefly describe in the second paragraph of your summary. For example, I found Figure 1 of [1] and the gray box on page 2 of [2] very helpful for understanding the problem setting.

In my opinion, adding more details about the problem setting would greatly improve the quality of the paper, since it would make it easier for a user to assess whether PySensors is suitable for their problem.

References
[1] Brunton, Bingni W., Steven L. Brunton, Joshua L. Proctor, and J Nathan Kutz. "Sparse sensor placement optimization for classification." SIAM Journal on Applied Mathematics 76.5 (2016): 2099-2122.
[2] Manohar, Krithika, Bingni W. Brunton, J. Nathan Kutz, and Steven L. Brunton. "Data-driven sparse sensor placement for reconstruction: Demonstrating the benefits of exploiting known patterns." IEEE Control Systems Magazine 38, no. 3 (2018): 63-86.

@briandesilva

@kmanohar, do you want to take a stab at adding a brief description of the problem setting to the paper?

@jordanperr

Review for PySensors: A Python Package for Sparse Sensor Placement

PySensors is a Python implementation of the SSPOC (pysensors.classification.SSPOC) and SSPOR (pysensors.reconstruction.SSPOR) methods for optimizing sensor placement. These methods are published in (Brunton, 2016), which is cited in both the documentation and the paper. The repository under review contains documentation (hosted on Read the Docs and in README files) and worked-out examples in the form of Jupyter notebooks. The code is thoughtfully organized, with two optimizers (CCQR and QR) that extend scikit-learn's BaseEstimator, and two utility functions that wrap MultiTaskLasso and OrthogonalMatchingPursuit from scikit-learn. I was able to install the software, run the tests, and execute some of the example notebooks.

The paper is well written and fits within the scope of the journal. I believe the code represents a significant contribution as a user-friendly implementation of the SSPOR and SSPOC algorithms. I would have liked to see citations of academic work that uses the PySensors package. Such citations would provide evidence of the impact of this software in an academic field.

Overall, I recommend acceptance for publication after these minor comments are reviewed and implemented at the authors' discretion:

Paper:

  • A citation would be useful for the claim "In general, choosing the globally optimal placement… is an intractable computation".
  • The notion of "sensors" and "optimization of sensor placement" could be considered domain-specific jargon for a truly general audience. The authors may consider defining these terms so the general reader has a more concrete picture of the problem being solved here.
  • A note on the difference between "detection" and "reconstruction and classification" tasks would be helpful.
  • In "statement of need," it may be construed from the text that Chama is the only other software to solve the sparse sensor placement problem. Quickly searching "sensor placement" on GitHub yields 77 repositories, including the SSPOR_pub repo from K. Manohar (an author of this paper), which appears to implement the SSPOR algorithm in MATLAB. Some other repos include "sustainability-lab/polire" and "Chandrayee/sensor-placement". How do these codes fit into the story?
    • The authors could mention how optimal sensor placement is currently solved by individual researchers with custom scripts on top of more general statistical learning and optimization packages, such as scikit-learn or scipy. A brief overview of the more general techniques, such as SVD, POD, or DMD, would be beneficial and help explain why choosing PySensors is superior to a custom script.
  • Citations of academic work that use PySensors could help prove the impact of the software. I do believe that researchers in my peer group would use this software (and I will certainly send it around to them!)

Documentation:

  • At the time of review, binaries for the scikit-learn dependency for Python 3.9.0 on Mac OS X are not available. The installation thus fails trying to compile scikit-learn from source. I specified Python 3.8.5 and the installation worked as expected.
  • The README assumes familiarity with Python and pip. This assumption is reasonable, but may not be valid for all academic users.
    • Consider including a high-level description of the dependencies in the README, such as "Python 3.6-3.8 with Mac or Linux and pip". It is not immediately clear whether this software supports Windows.
    • It may not be clear to all users what the next step is after "pip install .". A simple sentence with a link to the examples section would help.
  • There is something weird going on with the TOC on the Read the Docs page. I see "pysensors.version module" and "Module contents" as subheadings under "API", which doesn't seem right. Otherwise, the RTD page looks very good.

Examples:

  • The example notebooks have a lot of warnings. To pick on one: classification.ipynb (as checked in to GitHub) has the following warning after cell 19:
/home/brian/Documents/Dropbox (uwamath)/Brian/Research/PySensors/pySensors/venv/lib/python3.6/site-packages/sklearn/linear_model/_coordinate_descent.py:1914: UserWarning: Coordinate descent with l1_reg=0 may lead to unexpected results and is discouraged.
  check_random_state(self.random_state), random)

Code:

  • Tests pass with 36 warnings. They all look like the aforementioned "UserWarning" caused by improper parameters / small data, which is to be expected with tests. These warnings could be caught by the tests and ignored (see the sketch after this list).
  • Performance is not mentioned in the documentation. It would be useful to know the time and space complexity of these algorithms.
  • Excellent code style and software architecture. I believe the implementation is exemplary, although I am not familiar with the pattern of naming each module with an underscore prefix and then renaming them in the __init__ scripts.
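
One way to do that, assuming the suite runs under pytest (a sketch; the test name and body are hypothetical stand-ins):

Python:

    import pytest

    @pytest.mark.filterwarnings("ignore::UserWarning")
    def test_fit_on_small_data():
        # Stand-in for a real PySensors test; the marker keeps the
        # expected sklearn UserWarnings out of the test report.
        ...

The same filter can also be applied suite-wide with a filterwarnings = ignore::UserWarning line in the pytest configuration.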


pdebuyl commented Nov 24, 2020

Thank you for the review @jordanperr !


pdebuyl commented Nov 30, 2020

Hi @briandesilva, make sure to update us here on the progress so far.

@briandesilva

First, I'd like to thank @jordanperr for your comprehensive review of our package. I have just pushed changes that I believe address your comments under Documentation, Examples, and Code. A couple of follow-up items:

At the time of review, binaries for the scikit-learn dependency for Python 3.9.0 on Mac OS X are not available. The installation thus fails trying to compile scikit-learn from source. I specified Python 3.8.5 and the installation worked as expected.

Do you have any suggestions for how we should address this? One option could be to narrow the versions of Python the package is compatible with to 3.6-3.8 until scikit-learn releases Python 3.9 binaries.
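
For reference, that kind of pin could be expressed through setuptools (a sketch of the relevant setup.py fragment, not necessarily what the package adopted):

Python:

    from setuptools import setup, find_packages

    setup(
        name="python-sensors",  # the PyPI name used by pip install python-sensors
        packages=find_packages(),
        # Restrict to Python versions for which scikit-learn ships
        # binary wheels at the time of review.
        python_requires=">=3.6, <3.9",
    )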

I believe the implementation is exemplary, although I am not familiar with the pattern of naming each module with an underscore prefix and then renaming them in the __init__ scripts.

This is the style used in scikit-learn. For example, see the linear_model module.
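
Concretely, the convention looks roughly like this (an illustrative sketch; the file names are hypothetical, not the actual pysensors listing):

Python:

    # pysensors/reconstruction/_sspor.py -- the underscore marks the module private
    class SSPOR:
        """Sparse Sensor Placement Optimization for Reconstruction."""


    # pysensors/reconstruction/__init__.py -- re-export the public class so
    # users never import the underscore-prefixed module directly
    from ._sspor import SSPOR

    __all__ = ["SSPOR"]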


pdebuyl commented Dec 7, 2020

Hi @briandesilva, thanks for the update. Did you address all concerns at this point so that the reviewers can take a look at the project again?


tuelwer commented Dec 7, 2020

@whedon generate pdf


whedon commented Dec 7, 2020

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈


tuelwer commented Dec 7, 2020

@briandesilva, @kmanohar thank you for adding the description of the problem setting. Some minor remarks: the index i is not defined. Perhaps you want to add "for i = 1, ..., n" or something similar. Also, it would be nice to state the shapes of y, C and x, and to make clear that y usually has fewer entries than x.
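
For concreteness, the measurement model under discussion is presumably the standard sparse-sensing setup (a sketch in common notation, not quoted from the paper):

LaTeX:

    % y: measurements at the p chosen sensors; C: row-selection matrix;
    % x: full state to be reconstructed, with far fewer sensors than states.
    \[
      y = C x, \qquad
      y \in \mathbb{R}^{p}, \quad
      C \in \mathbb{R}^{p \times n}, \quad
      x \in \mathbb{R}^{n}, \quad p \ll n .
    \]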

@briandesilva

@pdebuyl

Did you address all concerns at this point so that the reviewers can take a look at the project again?

We've yet to address Jordan's comments regarding the paper. I'll post here once the associated changes have been made.


pdebuyl commented Feb 16, 2021

Thanks @briandesilva. Can you edit the metadata of the Zenodo archive to match the paper, please? The title should be "PySensors: A Python Package for Sparse Sensor Placement" and the authors should be as here, with ORCIDs properly set.


pdebuyl commented Feb 16, 2021

@whedon set 10.5281/zenodo.4542530 as archive


whedon commented Feb 16, 2021

OK. 10.5281/zenodo.4542530 is the archive.


pdebuyl commented Feb 16, 2021

@whedon set v0.3.3 as version


whedon commented Feb 16, 2021

OK. v0.3.3 is the version.

@briandesilva

Thanks @briandesilva. Can you edit the metadata of the Zenodo archive to match the paper, please? The title should be "PySensors: A Python Package for Sparse Sensor Placement" and the authors should be as here, with ORCIDs properly set.

@pdebuyl, I've updated the archive as you asked.


pdebuyl commented Feb 17, 2021

Thanks @briandesilva


pdebuyl commented Feb 17, 2021

@whedon accept


whedon commented Feb 17, 2021

Attempting dry run of processing paper acceptance...

@whedon whedon added the recommend-accept Papers recommended for acceptance in JOSS. label Feb 17, 2021

whedon commented Feb 17, 2021

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2089

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2089, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true


whedon commented Feb 17, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1287/opre.43.4.684 is OK
- 10.1145/3384419.3430407 is OK
- 10.1137/17M1162366 is OK
- 10.1109/JSEN.2018.2887044 is OK
- 10.1109/mcs.2018.2810460 is OK
- 10.1137/15M1036713 is OK
- 10.1017/jfm.2011.195 is OK
- 10.2514/6.2004-2415 is OK
- 10.1073/pnas.1517384113 is OK
- 10.1109/access.2018.2886528 is OK
- 10.1109/access.2020.3023625 is OK
- 10.1126/science.1165893 is OK
- 10.1103/physrevmaterials.2.083802 is OK
- 10.1111/j.2517-6161.1996.tb02080.x is OK
- 10.5281/zenodo.1173754 is OK
- 10.1364/oe.24.030433 is OK
- 10.1063/1.5066099 is OK
- 10.1063/1.4977057 is OK
- 10.1016/j.ymssp.2018.08.033 is OK
- 10.1126/sciadv.1602614 is OK
- 10.1098/rspa.2016.0446 is OK
- 10.1137/16m1086637 is OK
- 10.1137/18m116798x is OK
- 10.1017/jfm.2017.823 is OK
- 10.1063/1.5018409 is OK
- 10.1016/j.ifacol.2016.10.249 is OK
- 10.1103/physreve.96.023302 is OK
- 10.1016/j.jcp.2018.10.045 is OK
- 10.1098/rspa.2018.0335 is OK
- 10.1016/j.jcp.2019.07.049 is OK
- 10.1007/s00162-020-00536-w is OK
- 10.1103/physreve.101.010203 is OK
- 10.1115/1.4043148 is OK
- 10.1016/j.ocemod.2009.01.001 is OK
- 10.1109/cdc.2014.7040017 is OK
- 10.1017/jfm.2017.137 is OK
- 10.1073/pnas.1808909115 is OK
- 10.1016/j.jmsy.2018.01.011 is OK
- 10.1017/jfm.2018.147 is OK
- 10.1017/9781108380690 is OK
- 10.1007/s00162-020-00520-4 is OK
- 10.1002/cpa.20124 is OK
- 10.1016/0167-7152(84)90020-8 is OK
- 10.1016/j.crma.2004.08.006 is OK
- 10.1162/0899766053723032 is OK
- 10.1109/TIT.2006.871582 is OK
- 10.1109/tit.2006.885507 is OK
- 10.1109/tit.2005.862083 is OK
- 10.1109/MSP.2007.4286571 is OK
- 10.1109/tit.2009.2034811 is OK
- 10.1016/j.acha.2010.10.002 is OK
- 10.1145/1879141.1879192 is OK
- 10.1137/090766498 is OK
- 10.1111/j.1467-9868.2011.00783.x is OK
- 10.1137/110822724 is OK
- 10.1137/15M1019271 is OK
- 10.1109/tsipn.2016.2614903 is OK
- 10.1109/sam.2016.7569707 is OK
- 10.1137/16m1081270 is OK
- 10.2172/1405271 is OK

MISSING DOIs

- None

INVALID DOIs

- None


pdebuyl commented Feb 17, 2021

For the EiC: only ORCID iDs for the authors seem to be missing from the metadata.

@briandesilva

I just added ORCID iDs for the authors to the paper. Sorry, I hadn't thought to add them when I added them to the DOI archive.


pdebuyl commented Feb 18, 2021

Thanks @briandesilva. I don't think we require them; I mentioned it because I believe it brings value to the authors, to the readers, and to the journal :-)

The submission is in the hands of the Editors-in-chief in rotation.


arfon commented Feb 21, 2021

@whedon accept


whedon commented Feb 21, 2021

Attempting dry run of processing paper acceptance...


whedon commented Feb 21, 2021

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2097

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2097, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true


whedon commented Feb 21, 2021

Reference check summary: identical to the previous check. All DOIs are OK; there are no MISSING and no INVALID DOIs.


arfon commented Feb 21, 2021

@whedon accept deposit=true


whedon commented Feb 21, 2021

Doing it live! Attempting automated processing of paper acceptance...

@whedon whedon added accepted published Papers published in JOSS labels Feb 21, 2021

whedon commented Feb 21, 2021

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦


whedon commented Feb 21, 2021

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.02828 joss-papers#2098
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.02828
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...


arfon commented Feb 21, 2021

@jordanperr, @tuelwer - many thanks for your reviews here, and to @pdebuyl for editing this submission. JOSS relies upon the volunteer efforts of folks like yourselves, and we simply wouldn't be able to do this without you! ✨

@briandesilva - your paper is now accepted and published in JOSS ⚡🚀💥

@arfon arfon closed this as completed Feb 21, 2021

whedon commented Feb 21, 2021

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README, use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02828/status.svg)](https://doi.org/10.21105/joss.02828)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02828">
  <img src="https://joss.theoj.org/papers/10.21105/joss.02828/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02828/status.svg
   :target: https://doi.org/10.21105/joss.02828

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider doing either one (or both) of the following:

@briandesilva

@arfon, I'm really sorry, but it was just pointed out to me that two of the author names were written in the wrong order in the paper. Is there any way to re-generate the PDF with the corrected order? I have corrected and pushed paper.md.


arfon commented Feb 27, 2021

Sure thing. Updated in openjournals/joss-papers@8f2fc95
