A cookiecutter template for a Python 3 package.
- Python 3 only (although it would be straightforward to add Python 2 support to your package by hand)
- The actual package code is in a `src` directory. See https://blog.ionelmc.ro/2014/05/25/python-packaging/#the-structure for the reasoning behind this.
- Support for isort import sorting
- Support for Black code style
- Support for pre-commit git hooks
- Travis CI and AppVeyor support.
- Sphinx/Read-the-docs support. This includes optional use of the better_apidoc tool for generating API documentation with templates.
- Support for Jupyter Notebooks in the Sphinx documentation (nbsphinx). This includes validation of notebooks as tests through the nbval plugin.
- Mandatory testing with pytest
- Environment management through conda
- Development tasks are organized in a Makefile. Run `make help` inside the generated project for details.
- Support for Coveralls
- Upload to PyPI
- GitHub templates for bug reports
- Support for git-flow branching model
- Interactive post-setup script for initializing git
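As an illustration of the src layout mentioned above, a generated project will look roughly like this (names and details depend on your answers to the prompts; this sketch is abridged):

```
my_project/
├── docs/
├── src/
│   └── my_project/
│       └── __init__.py
├── tests/
├── Makefile
└── setup.py
```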
First, make sure to have `cookiecutter` installed:

```
pip install cookiecutter
```
Then, create a new project with

```
cookiecutter gh:goerz/cookiecutter-pypackage
```
Follow the prompts. You can also pass values for variables on the command line, e.g.

```
cookiecutter gh:goerz/cookiecutter-pypackage --no-input interactive_postsetup=no project_name="My Project"
```
full_name: Full author name, will show up in README, in the module, and on PyPI
github_username: Username (or organization name) on GitHub
project_name: The name of the package on PyPI, also the name of the folder that will be generated
project_short_description: (Short) description to appear as the doc-string of the module, in the documentation of the console script, in the README, and on PyPI
version: Initial version of the package
create_author_file: Whether to create AUTHORS.rst
open_source_license: The license under which the code will be available (choice of MIT, GPL, or Public Domain)
environment_manager: The system for managing virtual environments. Currently, only conda is supported.
conda_packages: If using conda as an environment manager, which packages to install from the conda repository (i.e., not through pip). If your package extensively uses the Python scientific stack, and virtual environments are managed through conda, you might consider using the anaconda meta-package, and set this to `anaconda pytest-cov pytest-xdist coverage sphinx_rtd_theme flake8`.
use_isort: Whether to require that all imports are sorted according to isort
use_black: Whether the black code formatter should be used to enforce code style. This enables `make black-check`, and automatic checking of the code style on Travis.
use_pre_commit: Whether to use pre-commit to manage git pre-commit hooks. A local hook for checking for trailing whitespace and DEBUG lines is always included, as well as third-party hooks.
linelength: The allowed line length of code lines. PEP 8 requires 79 characters. This is not a hard limit; code may extend beyond the `linelength` if this increases readability.
allow_single_quote_strings: Whether strings are allowed to be enclosed in single quotes
on_pypi: Whether the package will be uploaded to the Python Package Index (PyPI)
travisci: Whether Travis will be used as a Continuous Integration testing service
appveyor: Whether AppVeyor will be used as a Continuous Integration testing service for Windows
coveralls: Whether to upload coverage data to http://coveralls.io. This only works if Travis CI is enabled.
sphinx_docs: Whether the package will use Sphinx to generate its documentation
use_notebooks: Whether Jupyter notebooks will be included in the Sphinx documentation, and validated through pytest (via the nbval plugin).
better_apidoc: Whether to use https://github.com/goerz/better-apidoc for generating the package API for Sphinx.
readthedocs: Whether the Sphinx-documentation will be hosted on https://readthedocs.org
support_py34: Does the package support Python 3.4?
support_py35: Does the package support Python 3.5?
support_py36: Does the package support Python 3.6?
support_py37: Does the package support Python 3.7? Remember to check any conda-forge packages that you might depend on to support Python 3.7 before enabling this.
use_git_flow: Whether the project uses the git-flow branching model
interactive_postsetup: Whether to run the interactive post-setup script, which will e.g. set up git for the project
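If you find yourself answering the same prompts for every new project, note that cookiecutter can read defaults from a user config file (`~/.cookiecutterrc`). A minimal sketch, with illustrative values:

```yaml
# ~/.cookiecutterrc -- user-level defaults for cookiecutter prompts
default_context:
  full_name: "Jane Doe"          # illustrative
  github_username: "janedoe"     # illustrative
```

Any variable not set here will still be prompted for interactively.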
After you generate a new project from the cookiecutter template, you should do the following:
Declare dependencies in `setup.py`, both for installation and development (testing). There are no additional pip requirements files (these are for app deployment, not for packages!).
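A minimal sketch of how the split between installation and development dependencies might look in `setup.py` (the package names below are placeholders, not the template's actual defaults):

```python
# Sketch: dependency declarations for setup.py.
# In the real setup.py, pass these to setuptools.setup() as
# install_requires=INSTALL_REQUIRES, extras_require=EXTRAS_REQUIRE.

#: Runtime dependencies, installed with ``pip install .``
INSTALL_REQUIRES = ["click"]  # placeholder runtime dependency

#: Development/testing dependencies, installed with ``pip install .[dev]``
EXTRAS_REQUIRE = {
    "dev": ["pytest", "pytest-cov", "coverage", "flake8"],
}
```

Keeping the development tools in an extra (rather than a requirements file) means `pip install .[dev]` sets up a complete development environment in one step.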
If you are using conda as the `environment_manager`, you may list any dependency that you know has a conda package in the `Makefile`. Also, if you are using Travis CI (`travisci`), you should add the same packages in `.travis.yml`. Any dependencies not installed as conda packages will still automatically be installed via pip, so the use of conda packages is optional. However, if you do use conda packages, you must manually ensure that the lists of conda packages in the various locations (`Makefile`, `.travis.yml`) stay in sync. Note that conda can be extremely slow, so it is recommended to only install the base Python via conda, and other dependencies via pip, if possible.
If you are using ReadTheDocs (RTD) with conda to host your documentation, you will have to specify the build environment for the documentation in `docs/rtd_environment.yml`. Packages that are available through conda can be listed directly; other packages must be listed in a `pip` section. For example:

```yaml
channels:
  - defaults
dependencies:
  - python=3.6
  - anaconda
  - pip:
    - better-apidoc
    - trajectorydata
    - git+https://github.com/mabuchilab/QNET.git@develop#egg=QNET-2.0.0-dev
```
In general, RTD needs only a minimal number of dependencies (not everything required to run the tests). Specifically, you must not include in `docs/rtd_environment.yml` any packages that the RTD build process installs automatically. Also note that RTD will not take into account dependencies listed in `setup.py`.
Review the custom RTD templates in the `docs` folder.
If you didn't do so during project creation, initialize git and push the project to GitHub.
If you are using the git-flow branching model, you must configure this on Github. Go to the "Settings" for the project, then "Branches", and switch the "Default branch" from "master", to "develop". You may consider protecting the master branch.
Activate Travis CI. The easiest way to do this is to click on the `build|unknown` badge in the README on GitHub.
Activate AppVeyor. The easiest way to do this is to click on the `appveyor|no id` badge in the README on GitHub. You must update the badge URL in the README with the correct AppVeyor project ID.
Activate ReadTheDocs. Log in to https://readthedocs.org/dashboard/, and click the "Import a Project" button. You shouldn't have to do any configuration, as everything is set up through the files in the repository.
Activate Coveralls. Log in to https://coveralls.io, and click on "Add Repo". Note that coverage data is only uploaded if all tests pass successfully!
Review the classifiers in `setup.py`. The full list of PyPI classifiers can be found at https://pypi.org/classifiers/.
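For illustration, a typical set of classifiers might look like the following (these values are examples, not the template's output; adjust them to your actual license and supported Python versions):

```python
# Example trove classifiers for setup.py; adjust to the license and
# Python versions your package actually supports.
CLASSIFIERS = [
    "Development Status :: 3 - Alpha",
    "Intended Audience :: Developers",
    "License :: OSI Approved :: MIT License",
    "Natural Language :: English",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.6",
    "Programming Language :: Python :: 3.7",
]
```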
If you are using pre-commit, review the `.pre-commit-config.yaml` file, especially for whether you will want to use more recent `rev`s for third-party hooks.
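For reference, a `.pre-commit-config.yaml` has roughly this shape (a sketch only; the generated file and the pinned `rev`s will differ, so check each hook's repository for current tags):

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v2.2.3          # illustrative; update to a current tag
    hooks:
      - id: trailing-whitespace
  - repo: https://github.com/psf/black
    rev: 19.3b0          # illustrative; update to a current tag
    hooks:
      - id: black
```

Updating a hook is just a matter of bumping its `rev` to a newer tag and re-running the hooks.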
If the package should be registered on PyPI, upload it. You can do this via the corresponding Makefile target (see `make help`).
Make sure to tag releases on GitHub, using a leading `v` in the tag name (e.g. `v0.1.0` for version 0.1.0).
Activate branch protection for the `master` branch (and the `develop` branch, if using the git-flow branching model), to prohibit history rewriting for these branches.