Docker for Local Dev Documentation
This documentation is composed of three main sections:
- How to install and use Docker for local development
- Connecting Docker to your code editor
- Docker 101 and how we use it with the foundation site. Start here if you're new to Docker
How to use
The general workflow is:
- Install the project with `invoke docker-setup`,
- Run the project with `docker-compose up`,
- Use invoke commands for frequent development tasks (database migrations, dependency installs, running tests, etc.),
- After doing a `git pull`, keep your clone up to date by running `invoke docker-catch-up`.

To get the list of available invoke commands, run `invoke -l`:
- `docker-catch-up` (alias `docker-catchup`): rebuild images and apply migrations
- `docker-l10n-sync`: sync localizable fields in the database
- `docker-l10n-update`: update localizable field data (copies from original unlocalized to default localized field)
- `docker-makemigrations`: create new migration(s) for apps
- `docker-manage`: shorthand for `manage.py`: `inv docker.manage "[COMMAND] [ARG]"`
- `docker-migrate`: update the database schema
- `docker-npm`: shorthand for npm: `inv docker.npm "[COMMAND] [ARG]"`
- `docker-new-db`: delete your database and create a new one with fake data
- `docker-pipenv`: shorthand for pipenv: `inv docker.pipenv "[COMMAND] [ARG]"`
- `docker-setup`: prepare your dev environment after a fresh `git clone`
- `docker_switching_branch`: get a new database with fake data and rebuild images
- `docker-test-node`: run node tests
- `docker-test-python`: run python tests
`docker-` prefixes all Docker commands. Note the double quotes used to pass multiple arguments to the `docker-npm`, `docker-pipenv` and `docker-manage` commands. There's no `invoke docker-runserver` command: use `docker-compose up` instead.
A few examples:
- `invoke docker-pipenv "install requests"`: add requests to your `Pipfile` and lock it,
- `invoke docker-manage load_fake_data`: add more fake data to your project,
- `invoke docker-npm "install moment"`: install moment, add it to your `package.json` and lock it.
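The double-quoting works because each shorthand receives the whole quoted string as a single invoke argument, then splits it back into separate CLI arguments before handing them to `docker-compose run`. Here's a minimal sketch of that expansion (the `expand_shorthand` helper and the `backend` service name are illustrative, not the project's actual task code):

```python
import shlex

def expand_shorthand(tool, args, service):
    """Build the docker-compose command an invoke shorthand could run.

    `args` is the single double-quoted string passed to the invoke task;
    shlex.split turns it back into separate CLI arguments.
    """
    return ["docker-compose", "run", "--rm", service, tool] + shlex.split(args)

# The quoted string "install requests" arrives as one argument
# and is split into two before reaching the container:
print(expand_shorthand("pipenv", "install requests", "backend"))
# → ['docker-compose', 'run', '--rm', 'backend', 'pipenv', 'install', 'requests']
```

Without the quotes, invoke would treat `install` and `requests` as two separate task arguments instead of one string to split.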
Docker and docker-compose CLIs
We strongly recommend checking at least the docker-compose CLI documentation, since we use it a lot. Meanwhile, here are the commands you will use the most:
- `docker-compose up`: start the services and the project. Stop them with `^C`. If you want to rebuild your images, for example after a python dependencies update, add the `--build` flag. If you want to run the services in detached mode, use `--detach` (or `-d`). To get logs, use `docker-compose logs --follow [SERVICE]`,
- `docker-compose down`: stop and remove the services,
- `docker-compose run (--rm) [SERVICE NAME] [COMMAND]`: run a command against a service. `--rm` removes your container when you're done,
- `docker-compose build [SERVICE NAME]`: build a new image for the service. Use `--no-cache` to build the image from scratch again,
- `docker-compose ps`: list the running services.
- `docker image`: interact with images,
- `docker container`: interact with containers,
- `docker volume`: interact with volumes,
- `docker system prune`: delete all unused containers, images and networks. Add `--volumes` to also remove volumes.

🚨 It will impact other Docker projects running on your system! For a more subtle approach, check this blog post on how to remove elements selectively.
How to install or update dependencies?
The significant difference between python and JS dependencies is that python dependencies are "baked" into the image, while JS dependencies are stored in a volume.
To install a new python dependency, run `invoke docker-pipenv "install [PACKAGE]"`. This adds the package to your `Pipfile` and locks it, but it doesn't install your dependency! After running this command, run `docker-compose build backend` to create a new, updated backend image to use.

To update your python dependencies, run `invoke docker-pipenv update`, then build the new image with `docker-compose build backend`.
To install a new JS dependency, run `invoke docker-npm "install [PACKAGE]"`. To update your JS dependencies, run `invoke docker-npm update`. You don't need to rebuild an image: JS dependencies are stored in a volume.
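The asymmetry between the two flows can be captured in a short sketch (the `commands_for` helper is illustrative, not part of the project): python installs require an image rebuild, JS installs don't.

```python
def commands_for(kind, package):
    """Return the ordered commands for installing one dependency.

    Python packages are "baked" into the backend image, so the image
    must be rebuilt afterwards; JS packages live in a volume, so no
    rebuild is needed.
    """
    if kind == "python":
        return [
            f'invoke docker-pipenv "install {package}"',
            "docker-compose build backend",  # bake the package into a fresh image
        ]
    return [f'invoke docker-npm "install {package}"']  # volume-backed: no rebuild

print(commands_for("python", "requests"))
print(commands_for("js", "moment"))
```

Forgetting the rebuild step after a python install is the most common pitfall: the lock file changes but the running image keeps the old dependencies.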
Connecting Docker to your code editor
PyCharm
This feature is only available in the professional version of PyCharm. Follow the official instructions available here
Visual Studio Code
Visual Studio Code uses a feature called Dev Containers to run Docker projects. The configuration files are in the
`.devcontainer` directory. This feature is only available starting with VSCode 1.35 stable. For now, we're only creating a python container to get IntelliSense: we're not running the full project inside VSCode. We may revisit this in the future if Docker support in VSCode improves.
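For reference, the `devcontainer.json` in that directory typically looks something like this (a hedged sketch following the Remote - Containers extension's schema; file paths and extension IDs here are illustrative, not necessarily the project's actual configuration):

```jsonc
{
  // Name shown in the VSCode window
  "name": "foundation-python",
  // Build the python-only container from the project's Dockerfile
  "dockerFile": "../Dockerfile.python",
  // Extensions installed inside the container, e.g. the Python one
  "extensions": ["ms-python.python"]
}
```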
A few things to keep in mind when using that setup:
- Do not use the terminal in VSCode when running `invoke docker-` commands: use a local terminal instead,
- When running `inv docker-catch-up` or installing python dependencies, you will need to rebuild the Dev Container. To do that, press `F1` and look for `Rebuild Container`.
To get started:

- Install the Remote - Containers extension,
- Open the project in VSCode: it detects the Dev Container files and a popup appears; click on `Reopen in a Container`,
- Wait for the Dev Container to build,
- Work as usual, and use the docker invoke commands in a terminal outside VSCode.
Docker vocabulary and overview
Welcome to Docker! Before jumping into Docker installation, take a moment to get familiar with Docker vocabulary:
- Docker: Docker is a platform to develop, deploy and run applications with containers.
- Docker engine: The Docker engine is a service running in the background (daemon). It's managing containers.
- Docker CLI: Command Line Interface to interact with Docker. For example, `docker image ls` lists the images available on your system.
- Docker hub: Registry containing Docker images.
- Image: An image is a file used to build containers: in our case, it's mostly instructions to install dependencies.
- Container: Containers run an image. In our case, we have a container for the database, another one for building static files and a last one for running Django. A container's life is ephemeral: data written there doesn't persist when you shut down the container.
- Volume: A volume is a special directory on your machine that is used to make data persistent. For example, we use it to store the database: that way, you don't lose your data when you turn down your containers.
- Host: "host" is used in the Docker docs to mean the system on top of which containers run.
- Docker-compose: It's a tool to run multi-container applications: we use it to run our three containers together.
- Docker-compose CLI: Command line interface to interact with docker-compose. It's used to launch your dev environment.
- Docker-compose service: a service is a container and the configuration associated to it.
I would recommend watching An Intro to Docker for Djangonauts by Lacey Williams Henschel (25 min, repo mentioned in the talk): it's a great beginner talk to learn Docker and how to use it with Django.
All our containers run on Linux.
For local development, we have two Dockerfiles that define our images:
- `Dockerfile.node`: uses a node 8 Debian Stretch slim base image from the Docker Hub and installs node dependencies,
- `Dockerfile.python`: uses a python 3.7 Debian Stretch slim base image and installs the required build dependencies before installing pipenv and the project dependencies. We don't have a custom image for running postgres: we use one from the Docker Hub.
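Structurally, `Dockerfile.python` follows the pattern just described. The sketch below is illustrative only (the system packages and pipenv flags are assumptions, not the project's exact file):

```dockerfile
# python 3.7 on a slim Debian Stretch base, as described above
FROM python:3.7-slim-stretch

# Build dependencies some python packages need at install time
RUN apt-get update && apt-get install -y --no-install-recommends gcc libpq-dev

# pipenv first, then the project dependencies from the lock file
RUN pip install pipenv
COPY Pipfile Pipfile.lock ./
RUN pipenv install --dev
```

Because the dependencies are installed at build time, any change to the `Pipfile` means a new image must be built.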
The `docker-compose.yml` file describes the three services that the project needs to run:

- `watch-static-files`: rebuilds static files when they're modified,
- `postgres`: contains a postgres database,
- `backend`: runs Django. Starting this one automatically starts the other two.
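Put together, a compose file with this shape could look as follows. This is a hedged sketch of the structure described above: image tags, commands, ports and the volume name are illustrative, not the project's actual file.

```yaml
version: "3"
services:
  postgres:
    image: postgres:9.6
    volumes:
      # named volume: the database survives `docker-compose down`
      - postgres_data:/var/lib/postgresql/data
  watch-static-files:
    build:
      context: .
      dockerfile: Dockerfile.node
    command: npm run watch
  backend:
    build:
      context: .
      dockerfile: Dockerfile.python
    command: pipenv run python network-api/manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
    # starting backend also starts the other two services
    depends_on:
      - postgres
      - watch-static-files
volumes:
  postgres_data:
```

The `depends_on` entries are what make `docker-compose up backend` start the database and the static-files watcher automatically.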
Resources about Docker
- Docker and docker-compose documentation,
- Intro to Docker: Lacey wrote a good intro tutorial to Docker and Django, without Harry Potter metaphors this time :),
- Jérôme Petazzoni's training slides and talks: presentations and slides if you want to dive into Docker.
Do I need to build the static files before doing a docker-compose up?
No: static files are automatically built when starting the watch-static-files service.
Where is Docker fitting in all the tools we're already using?
Let's do a quick overview of all the tools you're currently using to run the foundation site on your computer:
- npm: used to manage JS dependencies (`package-lock.json`). Also used to launch commands like `npm run start`,
- pipenv: used to manage python dependencies (`Pipfile.lock`). It also manages a python virtual environment, which isolates the foundation site's python packages from the rest of your system,
- inv: used as a CLI tool to provide shortcuts for the most used commands, e.g. `inv runserver` is a shortcut for `pipenv run python network-api/manage.py runserver`.
We still use all those tools with Docker. The major difference is that
pipenv is now running inside a container, while invoke continues to run as before.
Can I use Docker in parallel with the old way of running the foundation site?
Short answer: yes, but:
- you will have two different databases,
- you will have two files to manage your environment variables,
- those two environments won't share their dependencies: you will have to maintain and update both of them.