Flask web server that handles "heavy" tasks asynchronously via Celery.
Another project I was working on needed to crunch a bunch of data and email out a report. Requests to the route doing that work were sluggish, so I wanted to add a task queue to Flask, and this was the easiest solution I found. Other situations where a task queue like this helps:
- When sending mass emails, or a high volume of individual emails
- When doing large multi-step operations, like deleting a large number of records or preparing a data export
- etc.
There are a few things you're going to have to do regardless of whether you're running the app in production or development mode:
- create a `.env` file in the root directory, and set the `REDIS_PASS` value (see `.env.example`)
- run `docker-compose build` to build everything
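For reference, the `.env` file might look something like this (the password here is a placeholder, and any other keys should come from `.env.example`):

```shell
# .env -- password used by the Redis container and the app
REDIS_PASS=changeme
```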
Run `docker-compose up caesar_dev redis`. This will start the dev server and a Redis cache.
Run `docker-compose up caesar_prod redis`. This will start the prod server and a Redis cache.
Create an issue!