This template provides a boilerplate repository for developing Python libraries in Duckietown.
We have the following features:
- Unit-tests using Nose.
- Building/testing in Docker environment locally.
- Integration with CircleCI for automated testing.
- Integration with CodeCov for displaying coverage result.
- Integration with Sphinx to build code docs. (So far, only built locally.)
- Jupyter notebooks, which are also run in CircleCI as tests.
- Version bump using Bumpversion.
- Code formatting using Black.
- Command-line program for using the library.
This repository describes a library called `duckietown_pondcleaner`, and there is one command-line tool called `dt-pc-demo`.

Warning: Do not remove files/features just because you don't understand them. Ask instead. See: Chesterton's fence.
- `.gitignore`: Files ignored by Git.
- `.dtproject`: ...
- `.bumpversion.cfg`: Configuration for bumpversion.
- `Makefile`: ...
- `requirements.txt`: Contains the pinned versions of your requirements that are used to run the tests.
- `MANIFEST.in`: Excludes the tests from being included in the egg.
- `setup.py`: Contains the meta information, the definition of the scripts, and the dependencies information.
- `src/`: This is the path that you should set as "sources root" in your tool.
- `src/duckietown_pondcleaner`: Contains the code.
- `src/duckietown_pondcleaner/__init__.py`: Contains the `__version__` of the library (see the sketch below).
- `src/duckietown_pondcleaner_tests`: Contains the tests - not included in the egg.
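For illustration, such an `__init__.py` might look like the following (a sketch only; the file shipped with the template may contain more):

```python
# src/duckietown_pondcleaner/__init__.py -- illustrative sketch
# The version string is the single source of truth; `bumpversion` rewrites it.
__version__ = "1.0.0"
```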
These are files to build and run a testing container.
- `.dockerignore`: Describes what files go in the docker container.
- `Dockerfile`: ...
- `src/conf.py`: Sphinx settings.
- `src/index.rst`: Sphinx main file.
- `src/duckietown_pondcleaner/index.rst`: Documentation for the package.
- `.coveragerc`: Options for code coverage (see the sketch below).
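As an illustration, a minimal `.coveragerc` could look like this (the template ships its own settings, which may differ):

```ini
# hypothetical .coveragerc sketch; adapt the source path to your library
[run]
branch = True
source = src/duckietown_pondcleaner
```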
- `notebooks`: Notebooks that are also run as tests.
- `notebooks-extra`: Other notebooks (not run as tests).
- `notebooks/*.ipynb`: The notebooks themselves.
Use the fork button in the top-right corner of the GitHub page to fork this template repository.
Create a new repository on GitHub while specifying the newly forked template repository as a template for your new repository.
Build a library by following these steps:

- Clone the newly created repository;
- Place your Python packages inside `src/`;
- List the Python dependencies in the file `dependencies.txt`;
- Update the appropriate section in the file `setup.py` (see the sketch below).
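The section to update typically declares the package metadata, the console script, and the dependencies. A hypothetical sketch follows; the template's actual `setup.py` may organize this differently (for example, by reading `dependencies.txt` programmatically), and the entry-point target shown here is a guess:

```python
# hypothetical excerpt of setup.py -- adapt the names to your library
from setuptools import find_packages, setup

setup(
    name="duckietown-pondcleaner",
    version="1.0.0",
    package_dir={"": "src"},
    packages=find_packages("src"),
    install_requires=[
        # list your dependencies here (keep in sync with dependencies.txt)
    ],
    entry_points={
        "console_scripts": [
            # hypothetical entry point; point it at your actual main function
            "dt-pc-demo = duckietown_pondcleaner:main",
        ],
    },
)
```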
Make sure that there are no other leftover references to the template name:

```shell
grep -r pondcleaner .
```
Update the branch names in `README.md`.
The following are necessary steps for admins to do:

- Activate the project on CircleCI. Make one build successful.
- Activate the project on CodeCov. Get the `CODECOV_TOKEN` and put this token in the CircleCI environment.
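The CI job then uploads the coverage report to CodeCov using that token. For reference, a manual upload from a local machine might look like the following (a sketch, assuming the `codecov` Python uploader; normally CircleCI does this automatically):

```shell
# hypothetical manual coverage upload
pip install codecov
CODECOV_TOKEN=<your-token> codecov
```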
Test the code using Docker by:

```shell
make test-docker
```

This runs the tests using a Docker container built from scratch with the pinned dependencies in `requirements.txt`. This is equivalent to what is run on CircleCI.
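The Dockerfile that drives this is roughly of the following shape (a simplified sketch, not the actual file; see the `Dockerfile` in the repository):

```dockerfile
# simplified sketch of the testing Dockerfile
FROM python:3.8

WORKDIR /library

# install only the pinned dependencies
COPY requirements.txt .
RUN pip install -r requirements.txt

# install the library itself without pulling in extra dependencies
COPY . .
RUN python setup.py develop --no-deps

# run the test suite
CMD ["make", "test"]
```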
To run the tests natively, use:

```shell
make test
```

We assume you have already set up a Python virtual environment. Then we suggest you run:

```shell
python setup.py develop
```

This will install the library in an editable way (rather than copying the sources somewhere else).

If you don't want to install the deps, do:

```shell
python setup.py develop --no-deps
```

For example, this is done in the Dockerfile so that we know we are only using the dependencies listed in `requirements.txt`, with the exact pinned versions.
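As noted above, the native workflow assumes a working virtual environment; if you still need one, a common way to create it is (any virtualenv tool works):

```shell
# create and activate a virtual environment (one of several equivalent ways)
python3 -m venv venv
source venv/bin/activate
```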
To add more tests, add files with names of the form `test_*.py` in the package `duckietown_pondcleaner_tests`. The name is important.

Tip: make sure that the tests are actually run by looking at the coverage results.
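For example, a minimal test file could look like this (a hypothetical example; adapt the package name to your library):

```python
# src/duckietown_pondcleaner_tests/test_version.py -- hypothetical example
# Nose collects functions whose names start with `test_`.
from duckietown_pondcleaner import __version__


def test_version_is_a_string():
    assert isinstance(__version__, str)
```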
Always clean the notebooks before committing them:

```shell
make -C notebooks cleanup
```

If you don't think you can be diligent, then add the notebooks using Git LFS.
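With Git LFS installed, tracking the notebooks looks like this:

```shell
# track notebooks with Git LFS (assumes git-lfs is installed)
git lfs install
git lfs track "*.ipynb"
git add .gitattributes
```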
The first step is to change the version and tag the repo. DO NOT change the version manually; use the CLI tool `bumpversion` instead.

The tool can be called by:

```shell
make bump  # bump the version, tag the tree
```

If you need to include the version in a new file, list it inside the file `.bumpversion.cfg` using the syntax `[bumpversion:file:<FILE_PATH>]`.
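For example, a `.bumpversion.cfg` that tracks the version in the package's `__init__.py` might contain the following (an illustrative excerpt; the template already ships its own configuration):

```ini
# illustrative excerpt of .bumpversion.cfg
[bumpversion]
current_version = 1.0.0
commit = True
tag = True

[bumpversion:file:src/duckietown_pondcleaner/__init__.py]
```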
The next step is to upload the package to PyPI. We use `twine`. Invoke using:

```shell
make upload  # upload to PyPI
```

For this step, you need to have admin permissions on PyPI.
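Under the hood this builds a distribution and pushes it with `twine`, roughly like the following (a sketch; see the `Makefile` for the exact recipe):

```shell
# roughly what `make upload` does (sketch)
python setup.py sdist
twine upload dist/*
```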