
streaming-04-multiple-consumers

Use RabbitMQ to distribute tasks to multiple workers

One process will create task messages. Multiple worker processes will share the work.

Before You Begin

  1. Fork this starter repo into your GitHub.
  2. Clone your repo down to your machine.
  3. In VS Code, open View > Command Palette, then run Python: Select Interpreter.
  4. Select your conda environment.

Read

  1. Read the RabbitMQ Tutorial - Work Queues
  2. Read the code and comments in this repo.

RabbitMQ Admin

RabbitMQ comes with an admin panel. When you run the task emitter, reply y to open it.

(Python makes it easy to open a web page - see the code to learn how.)
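The management UI is normally served at http://localhost:15672 once the rabbitmq_management plugin is enabled. A minimal sketch of how a script might offer to open it (the prompt wording and function name here are illustrative, not necessarily the repo's exact code):

```python
import webbrowser

def offer_rabbitmq_admin_site():
    """Ask the user whether to open the RabbitMQ admin site in a browser."""
    answer = input("Would you like to monitor RabbitMQ queues? y or n ")
    if answer.lower() == "y":
        # Default port for the RabbitMQ management plugin
        webbrowser.open_new("http://localhost:15672/#/queues")
```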

Execute the Producer

  1. Run emitter_of_tasks.py (say y to monitor RabbitMQ queues)

Explore the RabbitMQ admin site while the producer runs.
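The producer follows the Work Queues tutorial pattern: open a blocking connection, declare a durable queue, and publish persistent messages. A minimal sketch, assuming the queue name task_queue and host localhost (see emitter_of_tasks.py for the repo's actual code):

```python
import pika

def send_message(host: str, queue_name: str, message: str):
    """Publish one persistent message to the named durable queue."""
    connection = pika.BlockingConnection(pika.ConnectionParameters(host=host))
    channel = connection.channel()
    # durable=True asks RabbitMQ to keep the queue across broker restarts
    channel.queue_declare(queue=queue_name, durable=True)
    channel.basic_publish(
        exchange="",
        routing_key=queue_name,
        body=message,
        properties=pika.BasicProperties(delivery_mode=2),  # persistent message
    )
    print(f" [x] Sent {message}")
    connection.close()

if __name__ == "__main__":
    send_message("localhost", "task_queue", "First task...")
```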

Execute a Consumer / Worker

  1. Run listening_worker.py

Will it terminate on its own? How do you know?
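It will not terminate on its own: start_consuming() blocks and waits for messages until you interrupt it (e.g. CTRL+C). A minimal worker sketch under the same assumptions (queue name and host), following the tutorial's pattern of simulating work with one second per dot in the message:

```python
import time
import pika

def callback(ch, method, properties, body):
    """Simulate work (one second per dot in the message), then acknowledge."""
    message = body.decode()
    print(f" [x] Received {message}")
    time.sleep(message.count("."))  # pretend the task takes time
    print(" [x] Done")
    ch.basic_ack(delivery_tag=method.delivery_tag)  # manual ack after the work is done

def main(host: str = "localhost", queue_name: str = "task_queue"):
    connection = pika.BlockingConnection(pika.ConnectionParameters(host=host))
    channel = connection.channel()
    channel.queue_declare(queue=queue_name, durable=True)
    # Fair dispatch: don't send a worker a new message until it acks the last one
    channel.basic_qos(prefetch_count=1)
    channel.basic_consume(queue=queue_name, on_message_callback=callback)
    print(" [*] Waiting for messages. To exit press CTRL+C")
    channel.start_consuming()  # blocks until interrupted - the worker never exits on its own

if __name__ == "__main__":
    main()
```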

Ready for Work

  1. Use your emitter_of_tasks.py script to produce more task messages.

Start Another Listening Worker

  1. Use your listening_worker.py script to launch a second worker.

Follow the tutorial. Add multiple tasks (e.g., First message, Second message, etc.). How are the tasks distributed? Monitor the windows with at least two workers running. Which worker gets which tasks?
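By default RabbitMQ dispatches messages to workers round-robin; with prefetch_count=1 set in the worker, a busy worker is skipped until it acknowledges its current task, so slow tasks shift the distribution. A quick, self-contained way to generate several numbered tasks for this experiment (queue name and host are the same assumptions as above):

```python
import pika

# Send a batch of numbered tasks so the distribution across workers is easy to see
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="task_queue", durable=True)
for i in range(1, 7):
    message = f"Message {i}" + "." * i  # more dots = longer simulated task
    channel.basic_publish(
        exchange="",
        routing_key="task_queue",
        body=message,
        properties=pika.BasicProperties(delivery_mode=2),
    )
    print(f" [x] Sent {message}")
connection.close()
```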

Reference

Screenshot

See a running example with at least three concurrent process windows (the producer plus two or more workers):

[Screenshot: concurrent producer and worker terminal windows]
