Docker Compose allows users to orchestrate multiple Docker containers that can talk to one another.
This repository contains a docker-compose.yml file that starts the following containers (a sketch of such a file is shown after the list):
- `postgres` - a PostgreSQL database housing a toy dataset
- `api` - a mock prediction API (built as a responder app). In a real-world setting this API could, for instance, run an input data point through a previously trained machine learning model and return a prediction.
- `dash` - a simple Dash dashboard that reads data from `postgres` and runs it through the prediction `api`
- `jnb` - a Jupyter lab environment for running Jupyter notebooks that can read/write data from/to `postgres`, run data through the `api`, and so on
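For orientation, here is a minimal sketch of what such a docker-compose.yml might look like. The service names come from the list above and the published ports match the URLs below; the image tag, build contexts, and credentials are illustrative assumptions, so the actual file in this repository is authoritative:

```yaml
# Illustrative sketch only -- see the docker-compose.yml in this repo
# for the real images, build contexts, and environment variables.
version: "3"

services:
  postgres:
    image: postgres:11              # image tag is an assumption
    environment:
      POSTGRES_PASSWORD: example    # placeholder credential

  api:
    build: ./api                    # assumed build context
    depends_on:
      - postgres

  dash:
    build: ./dash                   # assumed build context
    ports:
      - "8050:8050"                 # matches the dashboard URL below
    depends_on:
      - postgres
      - api

  jnb:
    build: ./jnb                    # assumed build context
    ports:
      - "10000:8888"                # host port per the Jupyter URL below; container port assumed
    depends_on:
      - postgres
```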
To run Docker Compose:

```
docker-compose up --build
```
Once the containers are up, the services are available at:

- Jupyter lab: http://127.0.0.1:10000/lab
- Dash dashboard: http://127.0.0.1:8050/
- Airflow: http://127.0.0.1:8080
- Superset: http://127.0.0.1:8088
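The ports above are the host-side mappings. Between containers, services reach one another by service name on the default network that docker-compose creates. A hypothetical illustration (the environment variable names and the api port are assumptions, not taken from this repo):

```yaml
services:
  dash:
    build: ./dash
    environment:
      # On the default compose network, other services resolve by their
      # service name, so no host IPs or published ports are needed here.
      DATABASE_URL: postgresql://postgres:example@postgres:5432/postgres
      API_URL: http://api:8000   # api port is an assumption
```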
To shut down and remove the containers:

```
docker-compose down
docker system prune
```

Adding `--volumes` to either command also removes the associated volumes; see the persistence note below.
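This also answers part of the volumes question in the ideas list below: data written inside a container is lost when the container is removed, unless it is stored on a volume. A minimal sketch, assuming a named volume `pgdata` mounted at the official postgres image's data directory:

```yaml
services:
  postgres:
    image: postgres:11                  # tag is an assumption
    volumes:
      # Data under /var/lib/postgresql/data survives `docker-compose down`
      # and container rebuilds; it is only deleted when the named volume
      # itself is removed, e.g. via `docker-compose down --volumes`.
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```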
Ideas for future extensions:

- Spark, using https://jupyter-docker-stacks.readthedocs.io/en/latest/using/specifics.html#apache-spark?
- Kafka
- an Airflow job to retrain models
- a fancier dashboard?
- how do volumes relate to containers? what about persistence?
- add more container components?
- show how to use a simple, pretrained TF model from TF Hub in the API?