# Asynchronous Task Queues in Python

Several implementations of asynchronous task queues in Python using the multiprocessing library and Redis.

Blog post: Developing an Asynchronous Task Queue in Python

## Setup

  1. Fork/Clone

  2. Create and activate a virtual environment

  3. Install the dependencies

  4. Enter the Python shell and download the NLTK stopwords corpus:

      >>> import nltk
      >>> nltk.download('stopwords')
      [nltk_data] Downloading package stopwords to
      [nltk_data]     /Users/michael.herman/nltk_data...
      [nltk_data]   Unzipping corpora/stopwords.zip.
      True

## Examples

Multiprocessing Pool:

    $ python simple_pool.py
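
A minimal sketch of the Pool pattern, with a placeholder task function rather than the actual code in `simple_pool.py` and `tasks.py`:

```python
# Minimal sketch (not the repo's simple_pool.py): a Pool fans tasks out
# across worker processes and collects the results.
import multiprocessing
import time


def process_task(name):
    # Placeholder task; the repo's real task functions live in tasks.py.
    time.sleep(1)
    return f"done: {name}"


if __name__ == "__main__":
    with multiprocessing.Pool(processes=4) as pool:
        results = pool.map(process_task, [f"task-{i}" for i in range(8)])
    print(results)
```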

Multiprocessing Queue:

    $ python simple_queue.py
    $ python simple_task_queue.py
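
Both scripts are built around a shared `multiprocessing.Queue`. A minimal sketch of that producer/worker pattern, with placeholder task names rather than the repo's actual code:

```python
# Minimal sketch (not the repo's code): tasks are pre-loaded into a shared
# multiprocessing.Queue, then drained by several worker processes.
import multiprocessing
import queue  # stdlib queue, used here only for the Empty exception


def worker(task_queue):
    while True:
        try:
            # The timeout lets workers exit cleanly once the queue is drained.
            item = task_queue.get(timeout=1)
        except queue.Empty:
            break
        print(f"processing {item}")


if __name__ == "__main__":
    task_queue = multiprocessing.Queue()
    for i in range(20):
        task_queue.put(f"task-{i}")

    processes = [
        multiprocessing.Process(target=worker, args=(task_queue,))
        for _ in range(4)
    ]
    for process in processes:
        process.start()
    for process in processes:
        process.join()
```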

Logging to a single file:

    $ python simple_task_queue_logging.py

Logging to separate files:

    $ python simple_task_queue_logging_separate_files.py
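
A common way to keep concurrent workers from interleaving their output is to give each process its own log file. A rough sketch of that idea, where the log file names and format are assumptions rather than the repo's actual logging configuration:

```python
# Rough sketch (not the repo's logging setup): each worker process writes to
# its own log file, named after the process, so writes never interleave.
import logging
import multiprocessing


def worker(task_id):
    process_name = multiprocessing.current_process().name
    logger = logging.getLogger(process_name)
    if not logger.handlers:
        # Hypothetical file name: one log file per worker process.
        handler = logging.FileHandler(f"{process_name}.log")
        handler.setFormatter(logging.Formatter("%(asctime)s %(name)s %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    logger.info(f"handled task {task_id}")


if __name__ == "__main__":
    with multiprocessing.Pool(processes=4) as pool:
        pool.map(worker, range(8))
```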

Redis:

    $ python redis_queue.py
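
This variant expects a Redis server to be reachable (localhost:6379 is assumed below). As a rough sketch of the general pattern, rather than the repo's `redis_queue*.py` code, a client can push pickled jobs onto a Redis list and a worker can pop and execute them with the `redis` package:

```python
# Rough sketch (not the repo's code): a client pushes pickled jobs onto a
# Redis list; a worker pops a job off and runs it.
# Assumes a Redis server on localhost:6379 and the redis-py package.
import pickle
import uuid

import redis


def enqueue(conn, queue_name, func, *args):
    # Serialize the callable and its arguments as a simple job record.
    job = {"id": str(uuid.uuid4()), "func": func, "args": args}
    conn.rpush(queue_name, pickle.dumps(job))
    return job["id"]


def work_one(conn, queue_name):
    # Block until a job is available, then run it.
    _, payload = conn.blpop(queue_name)
    job = pickle.loads(payload)
    return job["func"](*job["args"])


if __name__ == "__main__":
    conn = redis.Redis(host="localhost", port=6379)
    enqueue(conn, "example-queue", print, "hello from the queue")
    work_one(conn, "example-queue")
```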