update gaspar version, various docstrings, readme (release preparation)

commit 3c75a455a0e5dc2cc4dca33d9cb510db33717241 1 parent 5789d35
Jason Moiron authored
57 README.rst
@@ -1,5 +1,5 @@
gaspar
--------
+======
Gaspar is a library for creating small, simple TCP daemons that parallelize CPU
intensive work with a simple and low-overhead request/response pattern.
@@ -8,13 +8,15 @@ It does this by forking off a number of worker processes, using eventlet to
handle incoming requests and the 0MQ push/pull message pattern to load
balance work across the worker processes and receive responses.
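For readers unfamiliar with that 0MQ pattern, a standalone pyzmq push/pull
round trip looks roughly like this (plain pyzmq for illustration, not
Gaspar's actual internals)::

    import zmq

    ctx = zmq.Context()
    push = ctx.socket(zmq.PUSH)               # the producer side fans work out
    push.bind("ipc:///tmp/pushpull-demo")
    pull = ctx.socket(zmq.PULL)               # each worker pulls its share
    pull.connect("ipc:///tmp/pushpull-demo")

    push.send("a unit of work")
    work = pull.recv()                        # -> "a unit of work"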
+
running servers
---------------
Gaspar uses the terms ``producer`` and ``consumer`` for the process that receives
-incoming requests and the processes that actually handle those requests. To use
-Gaspar, you need only to create a producer and a consumer, and then start the
-producer::
+incoming requests and the processes that actually handle those requests. In the
+0MQ documentation, these are called ``ventilator`` and ``sink``, and various
+other terms are used throughout distributed systems literature. To use Gaspar,
+you need only to create a producer and a consumer, and then start the producer::
>>> import gaspar
>>> def echo(message): return message
@@ -27,6 +29,7 @@ listening on port ``10123``, receiving requests, sending them to a number of wor
(default is the # of CPUs on your machine), and then replying based on the echo
handler.
+
requests
--------
@@ -37,5 +40,47 @@ requests are:
* a string of that length
The reply is simply a string followed by the termination of the socket. The
-convenience function ``gaspar.request("host:port", message)`` will send a
-request and return the reply synchronously.
+convenience function ``gaspar.client.request("host:port", message)`` will send a
+request and return the reply synchronously. It uses the basic ``socket``
+libraries, so you can "green" it safely with eventlet or gevent's monkey
+patching methods.
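As a sketch of what that convenience function does on the wire (a hypothetical
re-implementation for illustration only; the byte order of the 4-byte length
prefix is an assumption here)::

    import socket
    import struct

    def raw_request(hostport, message):
        host, port = hostport.split(":")
        sock = socket.create_connection((host, int(port)))
        # 4-byte length prefix, then the message body itself
        sock.sendall(struct.pack("!I", len(message)) + message)
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break        # the server closes the socket to end the reply
            chunks.append(data)
        sock.close()
        return "".join(chunks)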
+
+``gaspar.client`` also provides a function called ``pack`` which takes a string
+and returns a new string with the 4-byte message length prepended. If you
+are using a gaspar daemon with async frameworks that are not greenlet based,
+you can use this to handle the length-prefix framing of the client protocol
+yourself.
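For example, a non-greenlet client built on Tornado could frame its own
requests with ``pack``; the callbacks and the host/port below are illustrative
assumptions, not part of Gaspar::

    import socket
    from tornado.iostream import IOStream
    from tornado.ioloop import IOLoop
    from gaspar.client import pack

    stream = IOStream(socket.socket(socket.AF_INET, socket.SOCK_STREAM))

    def on_reply(reply):
        # ``reply`` is the complete response body
        IOLoop.instance().stop()

    def on_connect():
        stream.write(pack("some request body"))
        stream.read_until_close(on_reply)

    stream.connect(("localhost", 10123), on_connect)
    IOLoop.instance().start()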
+
+limitations
+-----------
+
+formless request/response
+~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Gaspar requests and responses are just strings. There is no standard way to
+serialize multiple arguments or return multiple values. Because the wrong
+calling semantics could defeat the purpose of farming work out to such a
+daemon, these details are left to the ``Consumer`` implementation and to
+post-processing of client responses.
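One common approach is to pick a serialization format yourself, for example
``json`` (a sketch of user code, not something Gaspar provides)::

    import json

    def handler(message):
        # the Consumer-side handler defines its own calling semantics
        args = json.loads(message)
        return json.dumps({"sum": sum(args["numbers"])})

    # client side, post-processing the reply:
    # reply = gaspar.client.request("localhost:10123",
    #                               json.dumps({"numbers": [1, 2, 3]}))
    # result = json.loads(reply)["sum"]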
+
+single-server operation
+~~~~~~~~~~~~~~~~~~~~~~~
+
+Although the technologies in use (TCP and 0MQ) would allow for daemons to be
+spread across systems, this wasn't an original design goal of Gaspar and it
+is not currently supported.
+
+
+why shouldn't I use celery?
+---------------------------
+
+The major "advantages" of Gaspar over Celery are its small size, conceptual
+simplicity, and infrastructureless operation. The purpose of Gaspar was to
+make it very easy to move CPU-bound work out of a tight event-based I/O
+loop (like eventlet, gevent, tornado, et al.), turn it into I/O wait, and
+spread that work across multiple cores.
+
+Celery serves a much broader range of purposes, is a lot more sophisticated,
+and has features like delayed and recurrent execution that Gaspar lacks. If
+you have a number of tasks you need to execute asynchronously, Celery is
+very good at this.
+
11 gaspar/__init__.py
@@ -1,6 +1,17 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+
+"""Gaspar is a library for creating small, simple TCP daemons that parallelize CPU
+intensive work with a simple and low-overhead request/response pattern.
+
+It does this by forking off a number of worker processes, using eventlet to
+handle incoming requests and the 0MQ push/pull message pattern to load
balance work across the worker processes and receive responses."""
from producers import Producer, Forker
from consumers import Consumer
+VERSION = (1, 0)
+
__all__ = [p for p in dir() if not p.startswith('_')]
10 gaspar/consumers.py
@@ -3,6 +3,7 @@
"""Gaspar consumers (workers)."""
+import os
import logging
import eventlet
from eventlet.green import zmq
@@ -26,6 +27,9 @@ def initialize(self, producer):
self.initialized = True
def start(self):
+ """Start the consumer. This starts a listen loop on a zmq.PULL socket,
+ calling ``self.handle`` on each incoming request and pushing the response
+ on a zmq.PUSH socket back to the producer."""
if not self.initialized:
raise Exception("Consumer not initialized (no Producer).")
producer = self.producer
@@ -38,7 +42,7 @@ def start(self):
self.listen()
def listen(self):
- import os
+ """Listen forever on the zmq.PULL socket."""
while True:
message = self.pull.recv()
logger.debug("received message of length %d" % len(message))
@@ -47,7 +51,9 @@ def listen(self):
self.push.send(response)
def handle(self, message):
- """Default handler, returns message."""
+        """Handle a message. If this consumer was initialized with a handler,
+ that handler is called with ``message`` as an argument, and its return
+ value is sent over the zmq.PUSH socket back to the producer."""
if self.handler:
return self.handler(message)
return message
4 gaspar/producers.py
@@ -91,6 +91,10 @@ def serve(self):
eventlet.spawn(self.request_handler, conn, addr)
def start(self, blocking=True):
+ """Start the producer. This will eventually fire the ``server_start``
+ and ``running`` events in sequence, which signify that the incoming
+ TCP request socket is running and the workers have been forked,
+        respectively. If ``blocking`` is False, control returns to the caller
+        rather than blocking in the serve loop."""
self.setup_zmq()
if blocking:
self.serve()
11 setup.py
@@ -6,7 +6,11 @@
from setuptools import setup, find_packages
import sys, os
-version = '0.1'
+try:
+ from gaspar import VERSION
+    version = '.'.join(str(v) for v in VERSION)
+except ImportError:
+ version = '1.0'
# some trove classifiers:
@@ -21,7 +25,10 @@
long_description=open('README.rst').read(),
# Get strings from http://pypi.python.org/pypi?%3Aaction=list_classifiers
classifiers=[
- 'Development Status :: 1 - Planning',
+ 'Development Status :: 4 - Beta',
+ 'License :: OSI Approved :: MIT License',
+ 'Intended Audience :: Developers',
+ 'Operating System :: POSIX',
],
keywords='eventlet zmq parallel prefork',
author='Jason Moiron',