[REVIEW]: A course on Geographic Data Science #42

Open
whedon opened this Issue Jan 28, 2019 · 6 comments


whedon commented Jan 28, 2019

Submitting author: @darribas (Daniel Arribas-Bel)
Repository: https://github.com/darribas/gds18
Version: v4.0
Editor: @labarba
Reviewer: @lheagy, @jsta
Archive: Pending

Status


Status badge code:

HTML: <a href="http://jose.theoj.org/papers/ab5b87ff724fbdb2fda35a7301eecce9"><img src="http://jose.theoj.org/papers/ab5b87ff724fbdb2fda35a7301eecce9/status.svg"></a>
Markdown: [![status](http://jose.theoj.org/papers/ab5b87ff724fbdb2fda35a7301eecce9/status.svg)](http://jose.theoj.org/papers/ab5b87ff724fbdb2fda35a7301eecce9)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@lheagy & @jsta, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/jose-reviews/invitations

The reviewer guidelines are available here: https://jose.theoj.org/about#reviewer_guidelines. If you have any questions or concerns, please let @labarba know.

Review checklist for @lheagy

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at the repository URL?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release (v4.0)?
  • Authorship: Has the submitting author (@darribas) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) contribute to the module, 2) report issues or problems with the module, and 3) seek support?

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

Review checklist for @jsta

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at the repository URL?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release (v4.0)?
  • Authorship: Has the submitting author (@darribas) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) contribute to the module, 2) report issues or problems with the module, and 3) seek support?

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

whedon commented Jan 28, 2019

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @lheagy, it looks like you're currently assigned as the reviewer for this paper 🎉.

⭐️ Important ⭐️

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/jose-reviews) repository. As a reviewer, you're probably watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/jose-reviews

  2. You may also like to change your default notification settings for watched repositories in your GitHub profile: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands

whedon commented Jan 28, 2019

Attempting PDF compilation. Reticulating splines etc...

labarba commented Jan 28, 2019

@lheagy, @jsta — Thank you for agreeing to review for JOSE! This is where the action happens: work your way through the review checklist, feel free to ask questions or post comments here, and also open issues in the submission repository as needed. Godspeed!


whedon commented Jan 28, 2019


lheagy commented Feb 10, 2019

Congrats on a really nice course, @darribas! The examples are interesting and well-explained. I have a few review comments; once these are addressed, I think it will be ready for publication.

review comments

  • darribas/gds18#1: the repo license (CC BY-NC-SA) differs from the license stated in the README and on the website (CC BY). (@labarba, does JOSE have any concerns with the 'NC' portion of the license, given that it is not considered a "Free Cultural Work"?)
  • darribas/gds18#10: there is no version information in the github repository
  • darribas/gds18#9: community guidelines. I did not see these anywhere in the repo
  • darribas/gds18#7: a suggestion on simplifying the process for downloading data in the labs. Right now, the learner has to follow at least 2 links to get to the data from the notebook in most cases; this overhead could be reduced so there is less startup time for the labs.

minor fixes (not publication blockers in my opinion)

  • darribas/gds18#6: installation instructions conda activate is preferred over source activate
  • darribas/gds18#8: it looks like the api of pysal changed, so a couple of the labs error on import
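On the conda activate point, a minimal sketch of the recommended flow (the file name environment.yml and the environment name gds are placeholders, not necessarily what the repo uses):

```shell
# Build the environment from the repository's environment file
# (file name and environment name are assumptions for illustration)
conda env create -f environment.yml

# Preferred since conda 4.4; works across shells and platforms
conda activate gds

# Legacy Unix-only form the labs currently show, now discouraged:
# source activate gds
```

conda activate is shell-agnostic (it also works in cmd.exe and PowerShell), which is why it is preferred in installation instructions aimed at a broad audience.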

feedback on the review process

  • the paper criterion Does the paper tell the "story" of how the authors came to develop it, or what their expertise is? is a bit vague to me. I quite like the paper heading "Experience of use" used by @darribas and think it captures the spirit of this criterion, so it might be worth adopting that language as a bold-faced title for this bullet point

jsta commented Feb 12, 2019

This paper and these course materials look quite good to me. I focused my review on the labs portion of the repository. In addition to the comments by @lheagy, I had several ideas for improvement:

review comments

  • Does the conda file specify all the dependencies, including ones required for the optional exercises? (darribas/gds18#2)

minor fixes

  • In the lab01 notebook, dropping columns is shown using both del foo and pandas drop syntax. Consider using only the pandas syntax to decrease cognitive load?

  • The paper states that the target audience is learners with little to no prior knowledge. Maybe text to this effect could be added to the Overview page?

  • There are multiple instances where an operation is claimed to be simple or easy. I would recommend not emphasizing this so much as it can de-motivate people who are struggling.

  • The default data path is highly variable among notebooks. Maybe these could be made consistent? (darribas/gds18#3)
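On the del vs drop point above, a drop-only version might look like this (a minimal sketch with invented column names, not the lab's actual dataset):

```python
import pandas as pd

# Toy stand-in for the lab data; column names are invented for illustration
df = pd.DataFrame({"name": ["a", "b"], "area": [1.0, 2.0], "notes": ["x", "y"]})

# Rather than mixing `del df["notes"]` with .drop(), use .drop() throughout
df = df.drop(columns=["notes"])

print(list(df.columns))  # ['name', 'area']
```

Sticking to .drop() also returns a new frame rather than mutating in place, so the same idiom works inside method chains.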
