
[REVIEW]: Practical machine learning with PyTorch #239

Open
editorialbot opened this issue Mar 3, 2024 · 20 comments

@editorialbot
Collaborator

editorialbot commented Mar 3, 2024

Submitting author: @jatkinson1000 (Jack Atkinson)
Repository: https://github.com/Cambridge-ICCS/ml-training-material
Branch with paper.md (empty if default branch): JOSE
Version: v1.0
Editor: @nicoguaro
Reviewers: @mnarayan, @dortiz5
Archive: Pending
Paper kind: learning module

Status

status

Status badge code:

HTML: <a href="https://jose.theoj.org/papers/fa9320a02f05c17eafa19d0204a51592"><img src="https://jose.theoj.org/papers/fa9320a02f05c17eafa19d0204a51592/status.svg"></a>
Markdown: [![status](https://jose.theoj.org/papers/fa9320a02f05c17eafa19d0204a51592/status.svg)](https://jose.theoj.org/papers/fa9320a02f05c17eafa19d0204a51592)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@mnarayan & @manubastidas, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://openjournals.readthedocs.io/en/jose/reviewer_guidelines.html. If you have any questions or concerns, please let @nicoguaro know.

Please start on your review when you are able, and be sure to complete it within the next six weeks at the very latest.

Checklists

📝 Checklist for @mnarayan

📝 Checklist for @dortiz5

@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.88  T=0.12 s (277.4 files/s, 275475.6 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
SVG                              3              3              3          17506
CSS                              3           2487              9           5383
Jupyter Notebook                 8              0           4289           1282
Markdown                         5            137              0            512
HTML                             1             53              0            246
TeX                              2             23              0            224
Python                           3             65             98             94
Lua                              1              9              2             73
YAML                             4             13              9             70
JavaScript                       1             15             20             40
Sass                             1             13             11             37
TOML                             1              5              4             37
-------------------------------------------------------------------------------
SUM:                            33           2823           4445          25504
-------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

@editorialbot
Collaborator Author

Wordcount for paper.md is 2374

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1103/RevModPhys.91.045002 is OK
- 10.5281/zenodo.3960218 is OK
- 10.1098/rsta.2020.0093 is OK
- 10.1007/978-3-030-69128-8_12 is OK
- 10.5281/zenodo.5960048 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@nicoguaro

@manubastidas, @mnarayan, this is the space where the review process takes place. There is a checklist for each of you; tick the boxes when you see that a criterion is satisfied. You can generate the checklist with

@editorialbot generate my checklist

I will be here to answer the questions that you might have.

Let us use the first week of April as a tentative timeframe. Is that OK for you?

@mnarayan

mnarayan commented Mar 5, 2024

Review checklist for @mnarayan

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at https://github.com/Cambridge-ICCS/ml-training-material?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release?
  • Authorship: Has the submitting author (@jatkinson1000) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

@jatkinson1000

Hi All, and thanks for volunteering to edit/review.
Please do let me know if you have any questions or if I can help at all.

We have just delivered another workshop using this material this week.

@nicoguaro

Hello @manubastidas and @mnarayan, is there any progress on this review? If we can help you with anything, please let us know.

@nicoguaro

I think that @manubastidas is not able to continue the review with us for personal reasons. Thank you for your willingness to help, and I hope we can count on you in the future.

@dortiz5 has agreed to help us with the review.

@nicoguaro

@editorialbot remove @manubastidas from reviewers

@editorialbot
Collaborator Author

@manubastidas removed from the reviewers list!

@nicoguaro

@editorialbot add @dortiz5 to reviewers

@editorialbot
Collaborator Author

@dortiz5 added to the reviewers list!

@nicoguaro

@dortiz5, this is the space where the review process takes place. There is a checklist for each reviewer; tick the boxes when you see that a criterion is satisfied. You can generate the checklist with

@editorialbot generate my checklist

I will be here to answer the questions that you might have.

Let us use the second week of May as a tentative timeframe. Is that OK for you?

@dortiz5

dortiz5 commented May 6, 2024

Review checklist for @dortiz5

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at https://github.com/Cambridge-ICCS/ml-training-material?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release?
  • Authorship: Has the submitting author (@jatkinson1000) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

@jatkinson1000

Thanks for looking at this, @dortiz5!

I have just opened a Pull Request to add contribution guidelines here.

Once that has been reviewed I'll merge it into main, then rebase the paper branch onto main and let you know.
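
For reference, a rough sketch of that workflow, assuming the paper branch is the JOSE branch listed above and that the guidelines reach main via the pull request (exact branch names and commands may differ):

# after the contribution-guidelines PR has been merged into main
git checkout JOSE                        # the paper branch
git fetch origin
git rebase origin/main                   # replay the paper branch on top of the updated main
git push --force-with-lease origin JOSE  # update the remote paper branch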

@dortiz5

dortiz5 commented May 6, 2024

@jatkinson1000, thank you for the repository.

The material is nicely written, well ordered, and tells a clear story, with clear instructions on preparation and the prerequisites needed to run the exercises. The local installation was also easy: I followed the four simple steps you described, and it works correctly.
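
For anyone wanting to reproduce this, a rough sketch of a typical local setup, assuming a standard clone-and-venv workflow (the four steps in the repository README take precedence, and the exact install command may differ):

git clone https://github.com/Cambridge-ICCS/ml-training-material.git
cd ml-training-material
python -m venv venv && source venv/bin/activate
pip install .        # assumed packaging; installs PyTorch and the other dependencies
jupyter notebook     # open the exercise notebooks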

From the pedagogical standpoint, the learning objectives are explicitly written and aligned with the content in the slides and the exercises. I liked that you provided different options for running the exercises, along with their solutions. The information is concise and easy to follow. Finally, I found that the JOSE paper meets the checklist criteria.

@jatkinson1000

Hi @dortiz5, there should now be contribution guidelines on both the main branch and the paper branch.

I'll look at sorting out a version and let you know once that's done.

@mnarayan, @nicoguaro, please do let me know if there is anything I can do to help you.
Thanks all!

@nicoguaro

@dortiz5, thanks for the review. Would you recommend the work for publication?

@mnarayan, can you still provide a review for this submission? Please let us know if we can do something to help you move forward.
