
[REVIEW]: autumn: A Python library for dynamic modelling of captured CO₂ cost potential curves #3203

Closed · 40 tasks done
whedon opened this issue Apr 22, 2021 · 78 comments
Labels: accepted, Jupyter Notebook, published, Python, recommend-accept, review, TeX

@whedon commented Apr 22, 2021

Submitting author: @Eugenio2192 (Eugenio Salvador Arellano Ruiz)
Repository: https://gitlab.com/dlr-ve/autumn/
Version: v0.1.0
Editor: @timtroendle
Reviewers: @igarizio, @milicag
Archive: 10.5281/zenodo.5163098

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/d7bab860686df296e35cc437f0630ff2"><img src="https://joss.theoj.org/papers/d7bab860686df296e35cc437f0630ff2/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/d7bab860686df296e35cc437f0630ff2/status.svg)](https://joss.theoj.org/papers/d7bab860686df296e35cc437f0630ff2)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@potterzot & @igarizio, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @timtroendle know.

Please start on your review when you are able, and be sure to complete your review within the next six weeks at the very latest.

Review checklist for @milicag

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@Eugenio2192) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @igarizio

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@Eugenio2192) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@whedon commented Apr 22, 2021

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @potterzot, @igarizio, it looks like you're currently assigned to review this paper 🎉.

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this, do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews.
  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@whedon commented Apr 22, 2021

Software report (experimental):

github.com/AlDanial/cloc v 1.88  T=0.19 s (318.1 files/s, 42970.8 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          21            794           1502           2859
YAML                             7             22              9           1015
JSON                            10             10              0            359
reStructuredText                14            161            147            215
Markdown                         2             57              0            170
Jupyter Notebook                 2              0            569            150
TeX                              1             11              0            109
DOS Batch                        1              8              1             26
HTML                             1              1              0             24
make                             1              4              7              9
XML                              1              0              0              1
-------------------------------------------------------------------------------
SUM:                            61           1068           2235           4937
-------------------------------------------------------------------------------


Statistical information for the repository 'd46ee64810c51de729dc06f3' was
gathered on 2021/04/22.
The following historical commit information, by author, was found:

Author                     Commits    Insertions      Deletions    % of changes
Arellano Ruiz                    1             1              1            0.02
Arellano Ruiz, Eugen            22          4152           2818           62.52
Eugenio                          1          3497              0           31.37
Eugenio Arellano                11           340            187            4.73
Eugenio Salvador Are             4            62             37            0.89
ve-py@dlr.de                     2            41             13            0.48

Below are the number of rows from each author that have survived and are still
intact in the current revision:

Author                     Rows      Stability          Age       % in comments
Arellano Ruiz                 1          100.0          0.3                0.00
Arellano Ruiz, Eugen       3736           90.0          1.3                7.90
DLR                          40          100.0          0.2                5.00
Eugenio Arellano              3            0.9          0.7                0.00
Eugenio2192                1375          100.0          0.1                6.40

@whedon commented Apr 22, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1039/C7EE02342A is OK
- 10.1007/s11356-016-6810-2 is OK
- 10.1016/j.esr.2018.11.004 is OK
- 10.1038/s41467-020-20015-4 is OK
- 10.1016/j.energy.2018.10.114 is OK
- 10.1016/j.energy.2018.06.222 is OK
- 10.18419/opus-2015 is OK
- 10.1016/j.epsr.2020.106690 is OK
- 10.1021/acs.est.5b03474 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon commented Apr 22, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@timtroendle commented:

@potterzot, @igarizio, as mentioned over in the pre-review, we are still operating in reduced service mode, in which we ask reviewers to finish their review within six weeks at the latest. I'll add a reminder for both of you when half of that time has passed.

It's great if you can finish your review earlier than that, of course.

@timtroendle commented:

@whedon remind @potterzot in three weeks

@whedon commented Apr 22, 2021

Reminder set for @potterzot in three weeks

@timtroendle commented:

@whedon remind @igarizio in three weeks

@whedon commented Apr 22, 2021

Reminder set for @igarizio in three weeks

@whedon commented May 6, 2021

👋 @potterzot, please update us on how your review is going (this is an automated reminder).

@whedon commented May 6, 2021

👋 @igarizio, please update us on how your review is going (this is an automated reminder).

@potterzot commented:

I had some trouble installing cartopy that had to do with deprecated proj_api.h files, but I have everything running and should have a review done by the end of next week.

@timtroendle I cannot edit the checklist and it seems my invitation has expired; can you issue another one? Thanks!

@timtroendle commented:

@whedon re-invite @potterzot as reviewer

@whedon commented May 7, 2021

OK, the reviewer has been re-invited.

@potterzot please accept the invite by clicking this link: https://github.com/openjournals/joss-reviews/invitations

@whedon commented May 13, 2021

👋 @potterzot, please update us on how your review is going (this is an automated reminder).

@whedon commented May 13, 2021

👋 @igarizio, please update us on how your review is going (this is an automated reminder).

@igarizio commented:

👋 @Eugenio2192 Thank you for the submission. This is a super interesting topic!
I have been reviewing your paper, code, and documentation and I have a few comments:

General checks:

  • Contribution and authorship: I see that you have made a lot of contributions to the software and I just wanted to check with you that the other listed authors had also made substantial contributions.

Functionality

  • Installation:
    • It seems like the software is not yet listed on PyPI (you also mention this on the README). This is not a problem at all, but I think it would be best not to put the pip install autumncpc instruction in your README/documentation until the package has actually been published. At the end of the review, your repo will be archived, and someone could later take the name autumncpc on PyPI, so your archived repo would forever be instructing readers to install someone else's code. [@timtroendle, please correct me if I am wrong or if you have a different opinion; I am not 100% sure about this.]

    • When following the other installation instructions, my console got flooded with this warning:

      Warning : you have pip-installed dependencies in your environment file, but you do not list pip itself as one of your conda dependencies. Conda may not use the correct pip to install your packages, and they may end up in the wrong place. Please add an explicit pip dependency. I'm adding one for you, but still nagging you.

      I waited 40 minutes and canceled the installation. I then added pip to the .env file and ran the installation again. With this, the warning disappeared and the installation finished in a couple of minutes. I am not certain this was the cause (maybe I just got unlucky the first time), but it is something you might want to consider.

    • There seem to be some slight differences between the installation instructions listed on the README and on the documentation. It would be great if you could make them the same so that your future users do not get confused.

    • Small detail: I think there is an extra .py in from autumn.scripts.create_data_tree.py import main_build_file_tree (here)
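
      For reference, a minimal sketch of the corrected line (module path copied from the snippet above, with the trailing .py dropped):

          # Hypothetical corrected import: Python imports take module paths,
          # not file names, so the ".py" suffix has to go.
          from autumn.scripts.create_data_tree import main_build_file_tree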

Documentation:

  • A statement of need: I was not able to find this in your README or documentation.
  • Installation instructions: Same comment from the functionality section.
  • Example usage:
    • I was not able to run this code from the documentation (I created an issue for this)
    • I was not able to run this line from the README:
      scenarios = scenario_development_one_hot_encoded(["basic", "iron", "cement"])
      I did not create an issue because it seems like scenario_development_one_hot_encoded is just missing an argument.
    • Small detail: On the README you define Distribution = cost_captured_distribution("basic"), but then use distribution.data (Distribution vs distribution). This also causes an error (a consistent version is sketched after this list).
  • Functionality documentation: Good documentation in general, but there seems to be a missing section here (I see you have the docstrings for that part, so maybe it is just an issue with Sphinx or something related?).
  • Automated tests:
    • I see that you have a folder with a couple of tests, but most of them (test_common.py, test_core.py, test_data_operations.py, test_harmonization.py, test_powerplantmatchingcc.py) were empty, and test_harmonize.py did not really test any code.
    • It would be useful for users if you could also add some instructions on how to run the tests (the typical pytest ... is ok)
  • Community guidelines: I was not able to find this in your README or documentation.
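
  To illustrate the case-consistency fix from the example-usage point above, here is a minimal sketch (the import path and the function's exact signature are my assumptions; only the function name, the "basic" argument, and the .data attribute come from the README):

      # Hypothetical import path; adjust to wherever the README imports it from.
      from autumn import cost_captured_distribution

      # Bind the result to a lowercase name and use that same name afterwards:
      distribution = cost_captured_distribution("basic")
      print(distribution.data)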

Other:

  • Spelling: I am not great at this (English is not my first language), so please treat these with caution.
    • Asses vs Assess: line 50.
    • Visualizaion vs Visualization: line 72.
    • Small detail: CO2 vs CO_2: line 50. You consistently use CO_2 (with the subscript) throughout the paper, but here you used CO2.
    • There might be more, but these were the ones I was able to find.

Extra comments

These are not really required for the review. I include them just in case they are helpful to you.

  • Use os.path.join instead of "/" with strings (see the sketch after this list)
  • Use uppercase for constants (for example in create_data_tree.py)
  • There is an empty conftest.py (I was not sure if this was on purpose or not)
  • There are two README files with different information (I was not sure if this was on purpose or not)
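
As a small generic sketch of the two style points above (the file and directory names here are made up, not taken from autumn):

    import os

    # An uppercase module-level name signals a constant:
    DATA_DIR = "data"

    # Fragile: hard-codes "/" as the path separator.
    csv_path = DATA_DIR + "/" + "plants.csv"

    # Portable: os.path.join picks the correct separator for the OS.
    csv_path = os.path.join(DATA_DIR, "plants.csv")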

This is what I have so far; I will keep working on this. Please let me know if I made any mistakes in any of this.
Again, thank you for the submission. Have a great rest of the weekend!

@Eugenio2192 commented:

Thanks a lot for the feedback @igarizio; most of the issues have been addressed in the latest commits. Some of the changes will take more time, but the critical ones, like documentation consistency and tool functionality, are resolved.

Regarding the participation of the secondary authors: my role has mostly been translating their contributions into code. Dr. Fuchs provided technical oversight of the development of the tool; Mr. Wulff is behind the ideation of the application and reviewed the changes as they were implemented; both of them offered continuous consultancy and code reviews throughout the lifetime of the development. Mr. Wu is behind the theoretical background of the tool: an important part of his previous work was used as the basis for this development, and he also helped with reviewing.

Regarding the test framework, it is still a work in progress, but we are tracking the tool's health using coverage as an indicator. Hopefully we will get the chance to complete this in the coming months.

Have a great week!

Eugenio

@timtroendle commented:

Hello @potterzot, could you please let us know where you stand with your review?

@igarizio commented Jun 4, 2021

@whedon generate pdf

@whedon commented Jun 4, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@timtroendle commented:

@potterzot, we need to finish the first round of reviews. Can you please let us know where you stand with your review?

@whedon commented Aug 2, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1039/C7EE02342A is OK
- 10.1007/s11356-016-6810-2 is OK
- 10.1016/j.esr.2018.11.004 is OK
- 10.1038/s41467-020-20015-4 is OK
- 10.1016/j.energy.2018.10.114 is OK
- 10.1016/j.energy.2018.06.222 is OK
- 10.18419/opus-2015 is OK
- 10.1016/j.epsr.2020.106690 is OK
- 10.1021/acs.est.5b03474 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@timtroendle commented:

Everything looks good to me and I am happy to move forward. At this point, @Eugenio2192 could you please:

  • Make a tagged release of your software, and list the version tag of the archived version here.
  • Archive the reviewed software in Zenodo or a similar service (e.g., figshare, an institutional repository)
  • Check the archival deposit (e.g., in Zenodo) has the correct metadata. This includes the title (should match the paper title) and author list (make sure the list is correct and people who only made a small fix are not on it). You may also add the authors' ORCID.
  • Please list the DOI of the archived version here.

I can then move forward with accepting the submission.

@Eugenio2192 commented:

Hello @timtroendle
Here is the DOI of the archived version: https://doi.org/10.5281/zenodo.5153667
I released the archived version under the tag JOSS: https://gitlab.com/dlr-ve/autumn/-/tree/JOSS

@timtroendle commented:

Thanks, @Eugenio2192. I've seen that you are releasing this as version JOSS. You are free to do so, as JOSS is not limiting you in any way. Still, most authors choose to use the semantic versioning scheme for their software, resulting in version numbers like v1.4.0. Are you sure you want to use JOSS as your version tag?

I will then move on right away and recommend acceptance of your submission.

@timtroendle commented:

@whedon set 10.5281/zenodo.5153667 as archive

@whedon commented Aug 5, 2021

OK. 10.5281/zenodo.5153667 is the archive.

@timtroendle commented:

@whedon set JOSS as version

@whedon commented Aug 5, 2021

OK. JOSS is the version.

@Eugenio2192 commented:

I also made a v0.1.0 tag (it will move to v1.x once we do the PyPI release) with exactly the same content; my reasoning behind the JOSS tag was to let users quickly reference the version associated with the publication. I will keep both in the repo, but now that you mention semantic versioning, I would opt for the v0.1.0 tag for the paper, if that can still be changed.

@timtroendle commented:

We can still switch the version, no problem. Please ensure that the archive has the correct version. Right now it's still JOSS. Please let me know once you've updated that.

FYI: The paper will mention the version and also link to the archive. Readers of the paper should therefore have no problem finding the correct version.

@Eugenio2192 commented:

DOI with the correct version: https://doi.org/10.5281/zenodo.5163098

@timtroendle commented:

@whedon set 10.5281/zenodo.5163098 as archive

@whedon commented Aug 5, 2021

OK. 10.5281/zenodo.5163098 is the archive.

@timtroendle commented:

@whedon set v0.1.0 as version

@whedon commented Aug 5, 2021

OK. v0.1.0 is the version.

@timtroendle commented:

Thanks @Eugenio2192 for letting me know. And thanks a lot to @igarizio and @milicag for reviewing this submission. I will go ahead and recommend the acceptance of this submission.

@timtroendle commented:

@whedon recommend-accept

@whedon commented Aug 5, 2021

Attempting dry run of processing paper acceptance...

whedon added the recommend-accept label on Aug 5, 2021

@whedon commented Aug 5, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1039/C7EE02342A is OK
- 10.1007/s11356-016-6810-2 is OK
- 10.1016/j.esr.2018.11.004 is OK
- 10.1038/s41467-020-20015-4 is OK
- 10.1016/j.energy.2018.10.114 is OK
- 10.1016/j.energy.2018.06.222 is OK
- 10.18419/opus-2015 is OK
- 10.1016/j.epsr.2020.106690 is OK
- 10.1021/acs.est.5b03474 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon commented Aug 5, 2021

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2495

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2495, then you can now move forward with accepting the submission by compiling again with the flag deposit=true, e.g.:

@whedon accept deposit=true

@arfon commented Aug 6, 2021

@whedon accept deposit=true

@whedon commented Aug 6, 2021

Doing it live! Attempting automated processing of paper acceptance...

whedon added the accepted and published labels on Aug 6, 2021

@whedon commented Aug 6, 2021

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon commented Aug 6, 2021

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.03203 joss-papers#2498
  2. Wait a couple of minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.03203
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@arfon commented Aug 6, 2021

@igarizio, @milicag – many thanks for your reviews here and to @timtroendle for editing this submission! JOSS relies upon the volunteer effort of people like you and we simply wouldn't be able to do this without you ✨

@Eugenio2192 – your paper is now accepted and published in JOSS ⚡🚀💥

arfon closed this as completed on Aug 6, 2021

@whedon commented Aug 6, 2021

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.03203/status.svg)](https://doi.org/10.21105/joss.03203)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.03203">
  <img src="https://joss.theoj.org/papers/10.21105/joss.03203/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.03203/status.svg
   :target: https://doi.org/10.21105/joss.03203

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
