A Gearman worker which cURLs to do work.


A Gearman worker which hits a web service to do its work. Why? Because you already have a bunch of solid code in your web framework and don't want to port it to a standalone worker service.

Basic flow:

  1. User #10 on your service submits a form requesting some time-intensive task.
  2. You insert a job into the curler Gearman queue with the data {"method": "do_thing", "data": 10}.
  3. curler receives the job and POSTs to http://localhost/jobs/do_thing with data=10.
  4. You pull 10 from post['data'] and do the thing.
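The flow above can be sketched from the submitting side. This is a minimal, hypothetical helper (not part of curler itself) that builds the job payload; the commented lines show how you might hand it to a Gearman client such as python-gearman, which is an assumption about your client library:

```python
import json

def build_curler_job(method, data):
    # Hypothetical helper: serialize the payload curler expects.
    # "method" is appended to --base-urls; "data" is POSTed as the
    # "data" form field.
    return json.dumps({"method": method, "data": data})

# Submitting it (sketch, assuming the python-gearman client library):
#   client = gearman.GearmanClient(["localhost:4730"])
#   client.submit_job("curler", build_curler_job("do_thing", 10))
```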

Installation & Usage

curler runs as a twistd service. To install & run:

$ git clone https://github.com/powdahound/curler.git
$ cd curler/
$ sudo python setup.py install
$ twistd --nodaemon curler --base-urls=http://localhost/jobs

There are a few arguments to curler:

  • --base-urls - Base URLs which the method property is appended to. You can specify multiple URLs by separating them with commas and one will be chosen at random.
  • --job-queue - The Gearman job queue to monitor (defaults to 'curler').
  • --gearmand-server - Gearman job servers to get jobs from (defaults to 'localhost:4730'). Separate multiple with commas.
  • --num-workers - Number of workers to run per server (# of jobs you can process in parallel). Uses nonblocking Twisted APIs instead of spawning extra processes or threads. Defaults to 5.
  • --verbose - Enables verbose logging (includes full request/response data).
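The --base-urls behavior (split on commas, pick one at random) can be sketched like this; pick_base_url is a hypothetical illustration, not curler's actual code:

```python
import random

def pick_base_url(base_urls_arg):
    # Split the comma-separated --base-urls value and choose one
    # at random, mirroring how curler balances across base URLs.
    urls = [u.strip() for u in base_urls_arg.split(",") if u.strip()]
    return random.choice(urls)
```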

Run twistd --help to see how to run as a daemon.

Job data

Jobs inserted into the curler queue must contain two properties:

  • method - Relative path of the URL to hit.
  • data - Arbitrary data string. POSTed as the data property. Use JSON if you need structure.

