
[REVIEW]: FANGS - Fire Applications with Next-Generation Satellites #197

Open
whedon opened this issue Feb 23, 2023 · 69 comments
Labels
Jupyter Notebook Python recommend-accept Papers recommended for acceptance in JOSE. review TeX

Comments

@whedon

whedon commented Feb 23, 2023

Submitting author: @sabrinaszeto ()
Repository: https://gitlab.eumetsat.int/eumetlab/atmosphere/fire-monitoring
Branch with paper.md (empty if default branch):
Version: v0.2
Editor: @yabellini
Reviewers: @RomiNahir, @csaybar
Archive: 10.5281/zenodo.13907115
Paper kind: learning module

Status


Status badge code:

HTML: <a href="https://jose.theoj.org/papers/30a1735810dca44db249d61bbf397dec"><img src="https://jose.theoj.org/papers/30a1735810dca44db249d61bbf397dec/status.svg"></a>
Markdown: [![status](https://jose.theoj.org/papers/30a1735810dca44db249d61bbf397dec/status.svg)](https://jose.theoj.org/papers/30a1735810dca44db249d61bbf397dec)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@RomiNahir, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/jose-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @yabellini know.

Please start on your review when you are able, and be sure to complete it within the next six weeks, at the very latest.

Review checklist for @RomiNahir

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release (1.0)?
  • Authorship: Has the submitting author (@sabrinaszeto) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?
@whedon
Author

whedon commented Feb 23, 2023

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @RomiNahir it looks like you're currently assigned to review this paper 🎉.

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/jose-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, under GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/jose-reviews:

[screenshot: repository watching setting]

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

[screenshot: notification settings]

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@whedon
Author

whedon commented Feb 23, 2023

Wordcount for paper.md is 1161

@whedon
Author

whedon commented Feb 23, 2023

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@whedon
Author

whedon commented Feb 23, 2023

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1038/s41598-017-00116-9 is OK
- 10.5281/zenodo.7463073 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon
Author

whedon commented Feb 23, 2023

Software report (experimental):

github.com/AlDanial/cloc v 1.88  T=0.66 s (45.3 files/s, 86161.0 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Jupyter Notebook                25              0          54330           2258
Markdown                         2             73              0            197
Python                           1             18             49             80
YAML                             1              0              0             48
TeX                              1              3              0             39
-------------------------------------------------------------------------------
SUM:                            30             94          54379           2622
-------------------------------------------------------------------------------


Statistical information for the repository '22eda3c7100ed08b3870db2a' was
gathered on 2023/02/23.
The following historical commit information, by author, was found:

Author                     Commits    Insertions      Deletions    % of changes
Sabrina Szeto                   24         19681          19392           99.97
jwagemann                        1             9              4            0.03

Below are the number of rows from each author that have survived and are still
intact in the current revision:

Author                     Rows      Stability          Age       % in comments
Sabrina Szeto               294            1.5          0.0               14.29

@yabellini
Member

@whedon add @andrewmaclachlan as reviewer

@whedon
Author

whedon commented Feb 23, 2023

OK, @andrewmaclachlan is now a reviewer

@yabellini
Member

@andrewmaclachlan and @RomiNahir, you can start the review in this issue. Here are the review criteria https://openjournals.readthedocs.io/en/jose/review_criteria.html and the review checklist https://openjournals.readthedocs.io/en/jose/review_checklist.html

Please, let me know if you have any questions.

@yabellini
Member

Hi @andrewmaclachlan and @RomiNahir, I'm checking in on how everything is going with this review. Is there anything I can do to help?

@whedon
Author

whedon commented Mar 9, 2023

👋 @RomiNahir, please update us on how your review is going (this is an automated reminder).

@RomiNahir
Collaborator

This article presents a training course on Python-based satellite fire detection. The objectives and instructions are clear, and the step-by-step workflow is easy to reproduce. The content is appropriate for users with prior knowledge of Python and satellite imagery. It also reports a successful training delivered last year. I recommend this article because it is a well-documented example of satellite training material.

@yabellini
Member

Hi @andrewmaclachlan, since we have not heard from you in several weeks, we are now looking for a new reviewer. Thank you for your original willingness to contribute a review.

@yabellini
Member

@whedon remove @andrewmaclachlan as reviewer

@whedon
Author

whedon commented Apr 24, 2023

OK, @andrewmaclachlan is no longer a reviewer

@yabellini
Member

yabellini commented Jul 6, 2023

One person I contacted is interested in doing the review but can't do it during the (northern) summer. I will try to find another reviewer before then; if that fails, I will go back to this person.

@sabrinaszeto

sabrinaszeto commented Oct 7, 2024

Thank you @csaybar and @RomiNahir for your feedback and reviews. I appreciate it!

I've made the changes and they are ready to go. Thanks as well, @yabellini for your perseverance in this process!

@yabellini
Member

@editorialbot generate pdf

@editorialbot
Collaborator

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@yabellini
Member

@editorialbot check references

@editorialbot
Collaborator

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1038/s41598-017-00116-9 is OK
- 10.5281/zenodo.7463073 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: Meteosat Third Generation - The Case for Preparing...
- No DOI given, and none found for title: Climate Change and Land: an IPCC special report on...

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@yabellini
Member

@sabrinaszeto, that's awesome! As a next step, please create a Zenodo archive and report the DOI here. You probably need to do a new release of your material.

@sabrinaszeto

@yabellini, thank you! The Zenodo archive with the updated materials is ready.

@yabellini
Member

Thank you, @sabrinaszeto. I was checking your repo, and under the Author section only two authors are listed, instead of the four named in the work here and in the Zenodo deposit. Could you fix that?

I'm also asking about releases in the JOSE Slack because I didn't find the release number on GitLab. I looked under the Deploy option, and there are no releases.

The DOI you provided: https://zenodo.org/records/13907116

@yabellini
Member

@editorialbot set https://zenodo.org/records/13907116 as archive

@editorialbot
Collaborator

That doesn't look like a valid DOI value

@sabrinaszeto

sabrinaszeto commented Oct 10, 2024 via email

@yabellini
Member

yabellini commented Oct 10, 2024

I got an answer about the release:

They should make a tagged release on the version-controlled repository (then update the version with @editorialbot), and the Zenodo archive should reflect that release (i.e., be synced with it)

Please, after you update the README, create a release of your repo and then update the Zenodo deposit.
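For reference, here is a minimal sketch of how a tagged release could be made from the command line (the tag name v0.2 below is only illustrative; creating the release through GitLab's web interface under Deploy > Releases works just as well):

git tag -a v0.2 -m "Release for the JOSE submission"
git push origin v0.2

Once the tag is pushed, a release can be created for it on GitLab and the Zenodo deposit updated so that its version matches the tag.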

@yabellini
Member

Hi @sabrinaszeto, I am just checking with you about the status of the tagged release. Thanks!

@sabrinaszeto

sabrinaszeto commented Oct 28, 2024 via email

@sabrinaszeto

@yabellini, thanks for your patience! The tagged release is now complete. I have also updated the Zenodo archive.

@yabellini
Member

@editorialbot set v0.1 as version

@editorialbot
Collaborator

Done! version is now v0.1

@yabellini
Member

@editorialbot set 10.5281/zenodo.13907115 as archive

@editorialbot
Collaborator

Done! archive is now 10.5281/zenodo.13907115

@yabellini
Member

@sabrinaszeto I'm struggling to understand the versioning:

  • the tag version on GitLab is v0.1
  • there are two versions on Zenodo, v0.1 and v0.2

Which one corresponds to the work in this paper?

@labarba, should the version be the same number in both places? (GitLab and Zenodo) If not, which one should I indicate to the bot here?

Thanks for any guidance on this matter. It is the only step missing to accept this paper.

@sabrinaszeto

sabrinaszeto commented Nov 18, 2024 via email

@yabellini
Member

Please, @sabrinaszeto, can you make the version number match? Thank you!

@sabrinaszeto

Thank you @yabellini, I made a historical release on GitLab so the version numbers match.

@yabellini
Member

@editorialbot set v0.2 as version

@editorialbot
Collaborator

Done! version is now v0.2

@yabellini
Member

@editorialbot recommend-accept

@editorialbot
Collaborator

Attempting dry run of processing paper acceptance...

@editorialbot
Collaborator

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1038/s41598-017-00116-9 is OK
- 10.5281/zenodo.7463073 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: Meteosat Third Generation - The Case for Preparing...
- No DOI given, and none found for title: Climate Change and Land: an IPCC special report on...

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@editorialbot
Collaborator

👋 @openjournals/jose-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/jose-papers#162, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

editorialbot added the recommend-accept (Papers recommended for acceptance in JOSE) label on Dec 10, 2024