
Project is renamed to tornado-slacker, django hard dependency is removed, simpler implementation

--HG--
rename : async_orm/vendor/__init__.py => slacker/__init__.py
rename : async_orm/vendor/adisp.py => slacker/adisp.py
rename : async_orm/settings.py => slacker/django_backend/conf.py
rename : async_orm/incarnations/http/urls.py => slacker/django_backend/urls.py
rename : async_orm/incarnations/http/views.py => slacker/django_backend/views.py
rename : async_orm/chains.py => slacker/postpone.py
rename : async_orm/tests.py => slacker/tests.py
commit ee059b61f21b51232180e789e20817845aeaa827 1 parent d748671
@kmike authored
2  .hgignore
@@ -25,7 +25,7 @@ Thumbs.db
build/
dist/
MANIFEST
-django_async_orm.egg-info
+tornado_slacker.egg-info
#my files
db.sqlite
94 README.rst
@@ -1,8 +1,9 @@
-================
-django-async-orm
-================
+===============
+tornado-slacker
+===============
-This app makes non-blocking django ORM calls possible.
+This package provides an easy API for moving the work out of
+the tornado process.
Installation
============
@@ -10,96 +11,61 @@ Installation
::
pip install "tornado >= 1.2"
- pip install django-async-orm
+ pip install tornado-slacker
FIXME: this is not uploaded to pypi now
-Overview
-========
-
-This app can be used for tornado + django integration: run tornado
-as django management command (on a separate port) => all django code will be
-available in tornado process; then use this library instead of
-plain django ORM calls in Tornado handlers to make these calls non-blocking.
-
-::
-
- from django.contrib.auth.models import User
- from async_orm.incarnations.http import AsyncWrapper
-
- AsyncUser = AsyncWrapper(User)
-
- # ...
-
- def process_data(self):
- # all the django orm syntax is supported here
- qs = AsyncUser.objects.filter(is_staff=True)[:5]
- qs.execute(self.on_ready)
-
- def on_ready(self, users):
- # do something with query result
- print users
-
-or, with pep-342 syntax and adisp library (it is bundled)::
+Usage
+=====
- from async_orm.vendor import adisp
+Please dig into source code for more info, this README is totally
+incomplete.
- @adisp.process
- def process_data(self):
- qs = AsyncUser.objects.filter(is_staff=True)[:5]
- users = yield qs.fetch()
- print users
+TODO: proper usage guide: how to configure backend (e.g django) for this?
+How to deploy?
-You still can't rely on third-party code that uses django ORM
-in Tornado handlers but it is at least easy to reimplement this code
-if necessary.
+Performance notes
+=================
Currently the only implemented method for offloading query execution
from the ioloop is to execute the blocking code in a django view and
fetch results using tornado's AsyncHttpClient. This way it was possible
to get a simple implementation, easy deployment and a thread pool
-(managed by webserver) for free. HTTP, however, can cause a
-significant overhead.
-
-Please dig into source code for more info, this README is totally
-incomplete.
-
-TODO: proper usage guide: how to configure django for this? how to deploy?
-
-Configuration
-=============
+(managed by webserver) for free.
-(async.incarnations.http.urls must be included in urls)
-
-TODO: write this
-
-Usage
-=====
+IOLoop blocks on any CPU activity and making http requests plus
+pickling/unpickling can cause a significant overhead here. So if the query
+is fast (e.g. database primary key or index lookup, say 10ms) then it is
+better to call the query in a 'blocking' way: the overall blocking
+time will be less than with the 'async' approach because less
+computation is performed.
-TODO
+tornado-slacker unpickles the received results and unpickling can be
+CPU-intensive so it is better to return as little as possible from
+postponed chains.
Contributing
============
If you have any suggestions, bug reports or
annoyances please report them to the issue tracker
-at https://github.com/kmike/django-async-orm/issues
-
-Both hg and git pull requests are welcome!
+at https://github.com/kmike/tornado-slacker/issues
Source code:
-* https://bitbucket.org/kmike/django-async-orm/
-* https://github.com/kmike/django-async-orm/
+* https://bitbucket.org/kmike/tornado-slacker/
+* https://github.com/kmike/tornado-slacker/
+
+Both hg and git pull requests are welcome!
Credits
=======
Inspiration:
-* http://tornadogists.org/654157/
* https://github.com/satels/django-async-dbslayer/
* https://bitbucket.org/david/django-roa/
+* http://tornadogists.org/654157/
Third-party software: `adisp <https://code.launchpad.net/adisp>`_ (tornado
adisp implementation is taken from
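
The rewritten README drops the old worked example and defers to the source. A minimal usage sketch, pieced together from the get_async docstring added in slacker/django_backend/__init__.py later in this commit (it is not part of the README itself), would look roughly like::

    from django.contrib.auth.models import User
    from slacker import adisp
    from slacker.django_backend import get_async

    AsyncUser = get_async(User)

    # inside a tornado RequestHandler
    @adisp.process
    def process_data(self):
        # the ORM chain is only recorded here; it is pickled, POSTed to the
        # django server and executed there, off the tornado ioloop
        qs = AsyncUser.objects.filter(is_staff=True)[:5]
        users = yield qs.fetch()
        print users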
1  async_orm/__init__.py
@@ -1 +0,0 @@
-
148 async_orm/chains.py
@@ -1,148 +0,0 @@
-import pprint
-try:
- import cPickle as pickle
-except ImportError:
- import pickle
-
-from django.db.models.loading import get_model
-
-
-class AsyncOrmException(Exception):
- pass
-
-
-class ChainProxy(object):
- """
- Stores attribute, call and slice chain without actully
- calling methods, accessing attributes and performing slicing.
-
- Collecting the access to private methods and attributes
- (beginning with __two_underscores) is not supported.
-
- FIXME: '_obj', '_chain' and 'restore' attributes of original
- object are replaced with the ones from this proxy.
- """
-
- def __init__(self, obj, **kwargs):
- self._obj = obj
- self._chain = []
- self._extra = kwargs
-
- def __getattr__(self, attr):
- # pickle.dumps internally checks if __getnewargs__ is defined
- # and thus returning ChainProxy object instead of
- # raising AttributeError breaks pickling. Returning self instead
- # of raising an exception for private attributes can possible
- # break something else so the access to private methods and attributes
- # is not overriden at all.
- if attr.startswith('__'):
- return self.__getattribute__(attr)
-
- # attribute access is stored as 1-element tuple
- self._chain.append((attr,))
- return self
-
- def __getitem__(self, slice):
- # slicing operation is stored as 2-element tuple
- self._chain.append((slice, None,))
- return self
-
- def __call__(self, *args, **kwargs):
- # method call is stored as 3-element tuple
- method_name = self._chain[-1][0]
- self._chain[-1] = (method_name, args, kwargs)
- return self
-
- def restore(self):
- """ Executes and returns the stored chain. """
- result = self._obj
- for op in self._chain:
- if len(op) == 1: # attribute
- result = getattr(result, op[0])
- elif len(op) == 2: # slice or index
- result = result[op[0]]
- elif len(op) == 3: # method
- result = getattr(result, op[0])(*op[1], **op[2])
- return result
-
- def __repr__(self):
- return "%s: %s" % (self._obj, pprint.pformat(self._chain))
-
-
-class ModelChainProxy(ChainProxy):
- """
- Adds support for pickling when proxy is applied to
- django.db.models.Model subclass.
-
- This handles QuerySet method arguments like Q objects,
- F objects and aggregate functions (e.g. Count) properly,
- but can break on QuerySets as arguments (queryset will be executed).
-
- Why not follow the advice from django docs and just pickle queryset.query?
- http://docs.djangoproject.com/en/dev/ref/models/querysets/#pickling-querysets
-
- The advice is limited to QuerySets. With ModelChainProxy it is possible
- to pickle any ORM calls including ones that don't return QuerySets:
- http://docs.djangoproject.com/en/dev/ref/models/querysets/#methods-that-do-not-return-querysets
-
- Moreover, using custom managers and model methods, as well as returning model
- attributes, is fully supported. This allows user to execute any
- orm-related code (e.g. populating the instance and saving it) in
- non-blocking manner: just write the code as a model or manager method.
- """
- def _model_data(self):
- meta = self._obj._meta
- return meta.app_label, meta.object_name
-
- def __getstate__(self):
- return dict(
- chain = self._chain,
- model_class = self._model_data()
- )
-
- def __setstate__(self, dict):
- self._chain = dict['chain']
- model_class = get_model(*dict['model_class'])
- self._obj = model_class
-
- @property
- def _pickled(self):
- return pickle.dumps(self, pickle.HIGHEST_PROTOCOL)
-
- def __repr__(self):
- app, model = self._model_data()
- return "%s.%s: %s" % (app, model, pprint.pformat(self._chain))
-
-
-class ProxyWrapper(object):
- """
- Creates a new ChainProxy subclass instance on every attribute access.
- Useful for wrapping existing classes into chain proxies.
- """
- proxy_class = ModelChainProxy
-
- def __init__(self, cls, **kwargs):
- self._cls = cls
- self._extra = kwargs
-
- def __getattr__(self, item):
- return getattr(self.proxy_class(self._cls, **self._extra), item)
-
-
-def repickle_chain(pickled_data):
- """
- Unpickles and executes pickled chain, then pickles the result
- and returns it. Raises AsyncOrmException on errors.
- """
- try:
- chain = pickle.loads(pickled_data)
- except pickle.PicklingError, e:
- raise AsyncOrmException(str(e))
-
- if not isinstance(chain, ChainProxy):
- raise AsyncOrmException('Pickled query is not an instance of ChainProxy')
-
- # TODO: better error handling
- restored = chain.restore()
- data = pickle.dumps(restored, pickle.HIGHEST_PROTOCOL)
- return data
1  async_orm/incarnations/__init__.py
@@ -1 +0,0 @@
-
61 async_orm/incarnations/http/__init__.py
@@ -1,61 +0,0 @@
-# coding: utf8
-"""
-
-This async_orm incarnation makes django ORM queries async
-by executing them in django view and fetching the results via
-http using tornado's async http client.
-
-This way we get a simple implementation, easy deployment and a
-thread pool (managed by webserver) for free.
-
-Http, however, can cause a significant overhead.
-
-"""
-
-try:
- import cPickle as pickle
-except ImportError:
- import pickle
-
-from tornado.httpclient import AsyncHTTPClient
-from django.core.urlresolvers import reverse
-
-from async_orm.vendor import adisp
-from async_orm.chains import ModelChainProxy, ProxyWrapper
-from async_orm.settings import HTTP_SERVER
-
-
-class TornadoHttpModelProxy(ModelChainProxy):
-
- def execute(self, callback):
- server = self._extra.get('server', HTTP_SERVER)
- path = self._extra.get('path', None) or reverse('async-orm-execute')
- url = server + path
-
- def on_response(response):
- result = pickle.loads(response.body)
- callback(result)
-
- http = AsyncHTTPClient()
- http.fetch(url, on_response, method='POST', body=self._pickled)
-
- fetch = adisp.async(execute)
-
-
-
-class AsyncWrapper(ProxyWrapper):
- """
- Returns async proxy for passed django.db.models.Model class.
- Constructor also accepts 'server' and 'path' keyword arguments.
-
- 'async-orm-execute' view enabled.
-
- Example::
-
- from django.contrib.auth.models import User
- from async_orm.incarnations.http import AsyncWrapper
-
- AsyncUser = AsyncWrapper(User, server='http://127.0.0.1:8001')
-
- """
- proxy_class = TornadoHttpModelProxy
7 async_orm/incarnations/http/urls.py
@@ -1,7 +0,0 @@
-from django.conf.urls.defaults import *
-
-from async_orm.incarnations.http.views import orm_execute
-
-urlpatterns = patterns('',
- url(r'^execute/$', orm_execute, name='async-orm-execute'),
-)
17 async_orm/incarnations/http/views.py
@@ -1,17 +0,0 @@
-#from time import sleep
-from django.http import HttpResponse, Http404, HttpResponseBadRequest
-from django.views.decorators.csrf import csrf_exempt
-
-from async_orm.chains import repickle_chain, AsyncOrmException
-
-@csrf_exempt
-def orm_execute(request):
- # FIXME: auth?
- if request.method != 'POST':
- raise Http404
-
- try:
- data = repickle_chain(request.raw_post_data)
- return HttpResponse(data)
- except AsyncOrmException, e:
- return HttpResponseBadRequest(str(e))
1  async_orm/models.py
@@ -1 +0,0 @@
-# hello, testrunner!
2  async_orm/settings.py
@@ -1,2 +0,0 @@
-from django.conf import settings
-HTTP_SERVER = getattr(settings, 'ASYNC_ORM_HTTP_SERVER', 'http://127.0.0.1:8000')
7 runtests.py
@@ -5,7 +5,7 @@
from django.conf import settings
from django.core.management import call_command
-# always use async_orm from the checkout, not the installed version
+# always use slacker from the checkout, not the installed version
sys.path.insert(0, os.path.dirname(__file__))
settings.configure(
@@ -13,7 +13,8 @@
'django.contrib.auth',
'django.contrib.contenttypes',
- 'async_orm',
+ 'slacker',
+ 'slacker.django_backend',
),
DATABASES = {
'default': dict(
@@ -23,4 +24,4 @@
)
if __name__ == "__main__":
- call_command('test', 'async_orm')
+ call_command('test', 'slacker', 'django_backend')
15 setup.py
@@ -9,23 +9,22 @@
version='0.0.1'
setup(
- name = 'django-async-orm',
+ name = 'tornado-slacker',
version = version,
author = 'Mikhail Korobov',
author_email = 'kmike84@gmail.com',
- url = 'https://github.com/kmike/django-async-orm/',
- download_url = 'https://bitbucket.org/kmike/django-async-orm/get/tip.zip',
+ url = 'https://github.com/kmike/tornado-slacker/',
+ download_url = 'https://bitbucket.org/kmike/tornado-slacker/get/tip.zip',
- description = 'This app makes non-blocking django ORM calls possible (currently using tornado.httpclient.AsyncHTTPClient)',
+ description = 'This package provides an easy API for moving the work out of the tornado process.',
long_description = open('README.rst').read(),
license = 'MIT license',
requires = ['django (>=1.2)', 'tornado (>= 1.2)'],
packages=[
- 'async_orm',
- 'async_orm.vendor',
- 'async_orm.incarnations',
- 'async_orm.incarnations.http'
+ 'slacker',
+ 'slacker.workers',
+ 'slacker.django_backend',
],
classifiers=[
0  async_orm/vendor/__init__.py → slacker/__init__.py
File renamed without changes
0  async_orm/vendor/adisp.py → slacker/adisp.py
File renamed without changes
74 slacker/django_backend/__init__.py
@@ -0,0 +1,74 @@
+from django.core.urlresolvers import reverse
+from slacker.postpone import PostponeWrapper
+from slacker.workers.http import TornadoAsyncHttpWorker
+from slacker.django_backend.conf import SLACKER_SERVER
+
+def get_async(obj, server=None, path=None):
+ """
+    Allows async interactions with a given object by moving the actual work
+ to a django server.
+
+ Example::
+
+ # urls.py
+ # ...
+ urlpatterns = patterns('',
+ # ...
+ url(r'sakd3j7fhg8sdkjlk09fhgksdjhfg', include('slacker.django_backend.urls')),
+ # ...
+ )
+
+ # your tornado app code
+ from django.contrib.auth.models import User
+ from slacker.django_backend import get_async
+
+ AsyncUser = get_async(User)
+
+ # ...
+
+ def process_data(self):
+ # all the django orm syntax is supported here, including
+ # slicing, Q and F objects, aggregate functions like Count,
+ # custom managers and model methods and attributes
+
+ async_qs = AsyncUser.objects.filter(is_staff=True)[:5]
+ async_qs.proceed(self.on_ready)
+
+ def on_ready(self, users):
+ # do something with query result
+ print users
+
+ or even better, with pep-342 syntax and adisp library (it is bundled)::
+
+ from slacker import adisp
+
+ # ...
+
+ @adisp.process
+ def process_data(self):
+ qs = AsyncUser.objects.filter(is_staff=True)[:5]
+ users = yield qs.fetch()
+ print users
+
+
+ with functions (they must be top-level)::
+
+ def task(param1, param2):
+ # do something costly and blocking
+ return result
+
+ # do not use decorator syntax here, the original function is necessary
+ async_task = get_async(task)
+
+ # ...
+
+ @adisp.process
+ def process_data(self):
+ results = yield async_task('foo', 'bar')
+ print results
+
+ """
+ server = server or SLACKER_SERVER
+ path = path or reverse('slacker-execute')
+ worker = TornadoAsyncHttpWorker(server, path)
+ return PostponeWrapper(obj, worker)
2  slacker/django_backend/conf.py
@@ -0,0 +1,2 @@
+from django.conf import settings
+SLACKER_SERVER = getattr(settings, 'SLACKER_SERVER', 'http://127.0.0.1:8000')
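
conf.py only reads the setting; the address of the django process that executes the postponed chains is configured on the django side. A sketch (the port value is illustrative)::

    # settings.py of the django project serving slacker.django_backend
    SLACKER_SERVER = 'http://127.0.0.1:8001'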
1  slacker/django_backend/models.py
@@ -0,0 +1 @@
+# hello, django testrunner!
23 slacker/django_backend/tests.py
@@ -0,0 +1,23 @@
+from django.test import TestCase as DjangoTestCase
+from django.contrib.auth.models import User
+import pickle
+
+from slacker.postpone import Postponed
+
+class DjangoQueryPostponeTest(DjangoTestCase):
+
+ def setUp(self):
+ self.user = User.objects.create_user('example', 'example@example.com')
+
+ @property
+ def AsyncUser(self):
+ return Postponed(User)
+
+ def test_restore(self):
+ user_query = self.AsyncUser.objects.get(username='example')
+ self.assertEqual(user_query._proceed(), self.user)
+
+ def test_pickling_unpickling(self):
+ user_query = self.AsyncUser.objects.get(username='example')
+ self.assertEqual(pickle.loads(user_query._pickled)._proceed(), self.user)
+
7 slacker/django_backend/urls.py
@@ -0,0 +1,7 @@
+from django.conf.urls.defaults import *
+
+from slacker.django_backend.views import slacker_execute
+
+urlpatterns = patterns('',
+ url(r'^execute/$', slacker_execute, name='slacker-execute'),
+)
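
These urls have to be included into the project's URLconf so that the 'slacker-execute' view is reachable; mirroring the example in the get_async docstring (the URL prefix here is an arbitrary placeholder)::

    # project urls.py, django 1.2/1.3 style as used elsewhere in this commit
    from django.conf.urls.defaults import *

    urlpatterns = patterns('',
        url(r'^slacker/', include('slacker.django_backend.urls')),
    )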
21 slacker/django_backend/views.py
@@ -0,0 +1,21 @@
+from __future__ import absolute_import
+
+#from time import sleep
+from django.http import HttpResponse, Http404, HttpResponseBadRequest
+from django.views.decorators.csrf import csrf_exempt
+
+from slacker.postpone import proceed_pickled, SlackerException
+
+@csrf_exempt
+def slacker_execute(request):
+ # FIXME: auth?
+ if request.method != 'POST':
+ raise Http404
+
+ # TODO: move boilerplate to process_pickled
+ # TODO: exceptions should be pickled, returned and re-raised on client
+ try:
+ data = proceed_pickled(request.raw_post_data)
+ return HttpResponse(data)
+ except SlackerException, e:
+ return HttpResponseBadRequest(str(e))
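
What the view does can be exercised without HTTP at all: build a postponed chain, pickle it, hand it to proceed_pickled and unpickle the result, which is exactly the round trip the worker performs. A sketch (assumes a configured django environment, e.g. the one set up by runtests.py)::

    import pickle
    from django.contrib.auth.models import User
    from slacker.postpone import Postponed, proceed_pickled

    chain = Postponed(User).objects.filter(is_staff=True).count()
    data = proceed_pickled(chain._pickled)   # what slacker_execute would return
    print pickle.loads(data)                 # the integer count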
132 slacker/postpone.py
@@ -0,0 +1,132 @@
+import pprint
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+
+from slacker import adisp
+from slacker.workers.local import LocalWorker
+
+class SlackerException(Exception):
+ pass
+
+
+class Postponed(object):
+ """
+    Stores attribute, call and slice chain without actually
+ calling methods, accessing attributes and performing slicing.
+
+ Wrapped object and method arguments must be picklable.
+
+ Collecting the access to private methods and attributes
+ (beginning with __two_underscores) is not supported.
+
+ FIXME: some attributes (e.g. '_obj', '_chain', '_extra', 'proceed')
+ of original object are replaced with the ones from this proxy.
+ """
+
+ def __init__(self, obj, worker = None):
+ self._obj = obj
+ self._chain = []
+ self._worker = worker or LocalWorker()
+
+ def __repr__(self):
+ return "%s: %s" % (self._obj, pprint.pformat(self._chain))
+
+ def __getstate__(self):
+ return self._chain, self._obj
+
+ def __setstate__(self, state):
+ self._chain, self._obj = state
+ # always use local worker after unpickling
+ self._worker = LocalWorker()
+
+ @property
+ def _pickled(self):
+ return pickle.dumps(self, pickle.HIGHEST_PROTOCOL)
+
+ def __getattr__(self, attr):
+ # pickle.dumps internally checks if __getnewargs__ is defined
+ # and thus returning ChainProxy object instead of
+ # raising AttributeError breaks pickling. Returning self instead
+ # of raising an exception for private attributes can possibly
+ # break something else so the access to other private methods
+ # and attributes is also not overriden.
+ if attr.startswith('__'):
+ return self.__getattribute__(attr)
+
+ # attribute access is stored as 1-element tuple
+ self._chain.append((attr,))
+ return self
+
+ def __getitem__(self, slice):
+ # slicing operation is stored as 2-element tuple
+ self._chain.append((slice, None,))
+ return self
+
+ def __call__(self, *args, **kwargs):
+ # method call is stored as 3-element tuple
+ if not self._chain:
+ # top-level call
+ self._chain.append((None, args, kwargs))
+ else:
+ method_name = self._chain[-1][0]
+ self._chain[-1] = (method_name, args, kwargs)
+ return self
+
+ def _proceed(self):
+ """ Executes the collected chain and returns the result. """
+ result = self._obj
+ for op in self._chain:
+ if len(op) == 1: # attribute
+ result = getattr(result, op[0])
+ elif len(op) == 2: # slice or index
+ result = result[op[0]]
+ elif len(op) == 3: # callable
+ func = result if op[0] is None else getattr(result, op[0])
+ result = func(*op[1], **op[2])
+ return result
+
+ def proceed(self, callback, worker=None):
+ """
+ Executes the collected chain using given worker and calls the
+ callback with results.
+ """
+ worker = worker or self._worker
+ worker.proceed(self, callback)
+
+ # pep-342 variant of proceed
+ fetch = adisp.async(proceed)
+
+
+class PostponeWrapper(object):
+ """
+ Starts a new Postponed instance for every attribute access.
+ Useful for wrapping existing classes into postponing proxies.
+ """
+ def __init__(self, obj, worker=None):
+ self._obj = obj
+ self._worker = worker
+
+ def __getattr__(self, item):
+ return getattr(Postponed(self._obj, self._worker), item)
+
+
+def proceed_pickled(pickled_postponed_obj):
+ """
+ Unpickles postponed object, proceeds it locally, then pickles the result
+ and returns it. Raises SlackerException on errors.
+
+ Useful for worker implementation.
+ """
+ try:
+ postponed = pickle.loads(pickled_postponed_obj)
+ except pickle.PicklingError, e:
+ raise SlackerException(str(e))
+
+ if not isinstance(postponed, Postponed):
+ raise SlackerException('Pickled object is not an instance of Postponed')
+
+ # TODO: better error handling
+ result = postponed._proceed()
+ return pickle.dumps(result, pickle.HIGHEST_PROTOCOL)
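
Postponed works for any picklable object, not only django models; the PostponeTest cases in slacker/tests.py exercise exactly this. A standalone sketch (Foo is a stand-in class; nothing here touches django or tornado)::

    from slacker.postpone import Postponed

    class Foo(object):
        def __init__(self, name):
            self.name = name
        def get_name(self):
            return self.name

    chain = Postponed(Foo)('bar').get_name()   # nothing is executed yet
    print chain._proceed()                     # 'bar' -- the chain is replayed here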
32 async_orm/tests.py → slacker/tests.py
@@ -1,10 +1,6 @@
from django.utils import unittest
-from django.test import TestCase as DjangoTestCase
-from django.contrib.auth.models import User
-from django.db import models
-import pickle
-from async_orm.chains import ChainProxy, ModelChainProxy
+from slacker.postpone import Postponed
class Foo(object):
@@ -28,17 +24,17 @@ def name(self):
return self._name
-class ChainTest(unittest.TestCase):
+class PostponeTest(unittest.TestCase):
def assertRestored(self, chain, value):
- self.assertEqual(chain.restore(), value)
+ self.assertEqual(chain._proceed(), value)
def setUp(self):
self.foo = Foo('foo')
def _foo(self):
- # ChainProxy objects shouldn't be reused
- return ChainProxy(self.foo)
+ # PostponeProxy objects shouldn't be reused
+ return Postponed(self.foo)
def test_method_basic(self):
self.assertRestored(self._foo().get_name(), 'foo')
@@ -70,20 +66,8 @@ def test_no_execution(self):
self.foo.name
self.assertTrue(self.foo.name_accessed)
+ def test_top_level_callables(self):
+ chain = Postponed(Foo)('bar')
+ self.assertEqual(chain._proceed().name, 'bar')
-class ModelChainTest(DjangoTestCase):
- def setUp(self):
- self.user = User.objects.create_user('example', 'example@example.com')
-
- @property
- def AsyncUser(self):
- return ModelChainProxy(User)
-
- def test_restore(self):
- user = self.AsyncUser.objects.get(username='example')
- self.assertEqual(user.restore(), self.user)
-
- def test_pickling_unpickling(self):
- user = self.AsyncUser.objects.get(username='example')
- self.assertEqual(pickle.loads(user._pickled).restore(), self.user)
0  slacker/workers/__init__.py
No changes.
31 slacker/workers/http.py
@@ -0,0 +1,31 @@
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+
+from tornado.httpclient import AsyncHTTPClient
+
+class TornadoAsyncHttpWorker(object):
+ """
+ This worker sends pickled postponed object to a web server via
+ HTTP POST request and waits for a response. Response is unpickled
+ and passed to the callback.
+
+    Combined with a traditional threaded web server like apache2 + mod_wsgi
+    this gives easy deployment and a thread pool for free (managed by
+    the webserver). HTTP, however, may cause a significant overhead.
+
+ Django backend implementation can be found at ``slacker.django_backend``.
+ """
+
+ def __init__(self, server='127.0.0.1:8000', path='/'):
+ self.url = server + path
+
+ def proceed(self, postponed, callback):
+
+ def on_response(response):
+ result = pickle.loads(response.body)
+ callback(result)
+
+ http = AsyncHTTPClient()
+ http.fetch(self.url, on_response, method='POST', body=postponed._pickled)
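
get_async in slacker.django_backend wires this worker up automatically for django models; doing the same wiring by hand looks roughly like the following (the server address and path are illustrative assumptions)::

    from django.contrib.auth.models import User
    from slacker.postpone import Postponed
    from slacker.workers.http import TornadoAsyncHttpWorker

    worker = TornadoAsyncHttpWorker('http://127.0.0.1:8000', '/slacker/execute/')

    def on_ready(users):
        print users

    chain = Postponed(User, worker).objects.filter(is_staff=True)[:5]
    chain.proceed(on_ready)   # POSTs the pickled chain, unpickles the response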
5 slacker/workers/local.py
@@ -0,0 +1,5 @@
+class LocalWorker(object):
+ """ Dummy worker for local immediate execution """
+ def proceed(self, postponed, callback):
+ callback(postponed._proceed())
+
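
LocalWorker is the default worker of Postponed, so without any extra wiring a chain is executed in-process and the callback is invoked synchronously; this also works for plain top-level functions, as the get_async docstring notes. A small sketch (task and on_done are placeholder names)::

    from slacker.postpone import Postponed

    def task(a, b):
        return a + b

    def on_done(result):
        print result   # 3

    Postponed(task)(1, 2).proceed(on_done)   # runs immediately, no ioloop involved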