Using Django Q for asynchronous tasks in your Django Application - Basic
- Clone this repo

- Create a virtual environment and install the requirements

  ```bash
  pip install virtualenv
  virtualenv venv
  source venv/bin/activate
  pip install -r requirements.txt
  ```

- Log in to Heroku and provision the free Heroku Redis add-on

  ```bash
  heroku login
  heroku create --addons=heroku-redis
  heroku config:get REDIS_URL
  ```

  The last command prints the `REDIS_URL` of the newly created app.

- Pass `REDIS_URL` as an environment variable

  ```bash
  export REDIS_URL='redis://your_redis_url'
  ```

- Modify the Django Q configuration in `settings.py` if required (it reads the Redis URL from the environment, so `settings.py` needs `import os` at the top)

  ```python
  Q_CLUSTER = {
      'name': 'your_project_name',
      'workers': 8,
      'recycle': 500,
      'timeout': 60,
      'compress': True,
      'save_limit': 250,
      'queue_limit': 500,
      'cpu_affinity': 1,
      'label': 'Django Q',
      'redis': os.environ.get('REDIS_URL')
  }
  ```

- Run the servers (the Django development server and the Django Q cluster, in separate terminals)

  ```bash
  python manage.py migrate
  python manage.py runserver   # terminal 1: Django development server
  python manage.py qcluster    # terminal 2: Django Q worker cluster
  ```

- Watch the logs in both terminals and you will see the tasks being picked up and executed asynchronously (the sketches after this list show how a task is queued and how a completion hook can report the same thing in code).

- The Redis database acts as the message broker here: it holds every queued task, and the Django Q cluster workers pull tasks off that queue, execute them, and save the results (a result-retrieval sketch also follows the list).
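
For reference, this is roughly how a task ends up on the queue. The snippet is an illustrative sketch rather than code from this repo; `math.copysign` is just a convenient standard-library function, and any importable function can be queued the same way.

```python
# Minimal sketch of queueing a task with Django Q (illustrative, not code from this repo).
from django_q.tasks import async_task

# async_task() serializes the call, pushes it onto the Redis queue and returns a
# task id immediately; a qcluster worker executes math.copysign(2, -2) later.
task_id = async_task('math.copysign', 2, -2)
```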
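
If you want confirmation beyond the qcluster logs, Django Q can call a hook once a task finishes. A minimal sketch, assuming a hypothetical `print_result` hook that simply logs the outcome:

```python
from django_q.tasks import async_task

def print_result(task):
    # Runs on the worker after the task completes; the Task instance carries
    # the status and the return value.
    print(task.id, task.success, task.result)

# The hook executes on the worker side, so its output shows up in the qcluster logs.
async_task('math.copysign', 2, -2, hook=print_result)
```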
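
Completed tasks are saved by the cluster (the `'save_limit': 250` setting above caps how many results are kept), so a result can also be fetched in code. A short sketch using Django Q's `result` helper; the 500 ms wait is an arbitrary choice for illustration:

```python
from django_q.tasks import async_task, result

task_id = async_task('math.copysign', 2, -2)

# Poll for up to 500 ms for the saved result; returns None if it is not ready yet.
print(result(task_id, wait=500))
```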