This version of this document is deprecated. The canonical copy can be found in the astropy project-wide policies repo.
Reviewing affiliated packages
This document describes the set of review criteria for packages applying to become Astropy-affiliated, and is intended for reviewers.
If you are reading this because you have accepted to review an affiliated package submission, thank you for taking the time to do this!
Note that, unlike a paper review, where the referee's report is passed on to the authors, one of the coordinators will also review the package and produce a combined report, so the report you write may not be seen by the authors as-is. Please email your report privately to the coordinator who contacted you.
Reviewing a package involves assessing how well it does in several areas, which we outline below. As a reviewer you are expected to review against these criteria, and it is sufficient to use these criteria alone. However, feel free to bring up any other aspect you think is important. For each category below, let us know which of the 'traffic light' levels you think the package should be rated at, in addition to providing comments in cases where things aren't perfect.
In general we use the following color-coding, which also determines if a package is accepted after its first review:
The document also includes monospaced keywords for the categories and levels. These are the keywords and values to be used in the registry.json file that is the canonical source for affiliated package information.
The categories in which we assess the package are the following:
- Functionality ('functionality')
- Integration with Astropy ecosystem ('ecointegration')
- Documentation ('docs')
- Testing ('testing')
- Development status ('devstatus')
- Python 3 compatibility ('python3')
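Since the category keywords above end up as entries in registry.json, it can help to see them together. The snippet below is a rough sketch, not the canonical schema: the field names ("name", "review") and the level values ("red", "orange", "green") are assumptions based on the 'traffic light' wording in this document, and registry.json itself remains the authoritative source.

```python
import json

# Category keywords as listed above.
CATEGORIES = {
    "functionality",
    "ecointegration",
    "docs",
    "testing",
    "devstatus",
    "python3",
}

# Assumed 'traffic light' level values; only "red" is named
# explicitly elsewhere in this document.
LEVELS = {"red", "orange", "green"}


def validate_review(entry):
    """Check that a review dict rates every category with a known level."""
    review = entry.get("review", {})
    missing = CATEGORIES - review.keys()
    if missing:
        raise ValueError(f"missing categories: {sorted(missing)}")
    for category, level in review.items():
        if level not in LEVELS:
            raise ValueError(f"{category}: unknown level {level!r}")
    return True


# Hypothetical entry for an imaginary package, for illustration only.
entry = json.loads("""{
    "name": "examplepkg",
    "review": {
        "functionality": "green",
        "ecointegration": "orange",
        "docs": "green",
        "testing": "orange",
        "devstatus": "green",
        "python3": "green"
    }
}""")
validate_review(entry)
```

A check like this simply guards against typos in the keywords; the actual ratings are a judgment call made by the reviewers and coordinators.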
We first need to make sure that the scope of the package is relevant for the affiliated package system. The scopes are:
Note that general is not necessarily better than specific, it’s just a way to make sure we can present these separately.
Integration with Astropy ecosystem ('ecointegration')
Next up, we need to check how well the package fits into the existing Astropy ecosystem: does it make use of existing functionality, or does it duplicate it?
No code is complete without documentation! Take a look at the documentation (if it exists) and see how the package fares:
|Red|No documentation, or some documentation, but very bare-bones/minimal and incomplete or incorrect in a number of places.|
|Orange|Reasonable documentation (which could be a very well written README), installation instructions and at least one usage example, but some parts missing.|
|Green|Extensive documentation, including at least: motivation/scope of package, installation instructions, usage examples, API documentation. In terms of infrastructure, the documentation should be automatically built on readthedocs.org. If appropriate, one or more tutorials should be included in the Astropy tutorials at http://tutorials.astropy.org.|
In our terminology, “tests” refers to those that can be run in an automated way, and we do not consider examples that need to be run and/or checked manually to be acceptable as the primary way of satisfying “tests”.
Test coverage can be tricky to measure, so this will be carefully assessed for each package. The main idea is to determine whether it is low, medium or high compared to what one might realistically achieve.
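To make the low/medium/high judgment a little more concrete, here is a purely illustrative sketch. The percentage cut-offs are hypothetical and are not Astropy policy; as noted above, coverage is assessed for each package relative to what is realistically achievable.

```python
def coverage_level(percent):
    """Map a test-coverage percentage to a rough low/medium/high band.

    The 40%/75% thresholds are hypothetical examples only; in practice
    a reviewer weighs coverage against what the package could
    realistically achieve.
    """
    if not 0 <= percent <= 100:
        raise ValueError("coverage must be between 0 and 100")
    if percent < 40:
        return "low"
    if percent < 75:
        return "medium"
    return "high"
```

A reviewer would pair a number like this with a qualitative look at what the tests actually exercise, since a high percentage can still miss the scientifically important code paths.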
Development status ('devstatus')
Python 3 compatibility ('python3')
|Red|Nothing is currently 'unacceptable'. Starting 1 January 2020, 'Not compatible with Python 3' will be red.|
|Orange|Not compatible with Python 3|
|Green|Compatible with Python 3|