Commit eff58fd by Mahdi Zareie, Feb 6, 2017 (1 parent: 9186164)
Doc: fix invalid content
Showing 1 changed file with 0 additions and 120 deletions: docs/index.md

## Some more options:

Easy-Job is an asynchronous task runner for Django: it lets you run a function without waiting for it to finish.

*Notice: the current design of easy-job depends on the Django framework, but in the near future we intend to remove this dependency.*


### How to set up Easy-Job in a Django project

Three simple steps are required to set up easy-job in your Django project:
#### 1. First, open your settings file and add the following:

```python
EASY_JOB = {
    "easy_job_logger": "easy_job",  # the logger name which easy_job itself will use
    "workers": {}
}
```

Then, inside the `workers` dictionary, you define your workers:

```python
import logging  # needed for logging.DEBUG below

EASY_JOB = {
    "easy_job_logger": "easy_job",
    "workers": {
        "worker1": {
            "initializer": "easy_job.workers.rabbitmq.RabbitMQInitializer",
            "count": 3,
            "logger_name": "sample_worker",
            "result_backend": {
                "type": "Log",
                "options": {
                    "logger_name": "sample_worker",
                    "log_level": logging.DEBUG
                }
            },
            "options": {
                "queue_name": "sample_works2",
                "serialization_method": "pickle",
                "rabbitmq_configs": {
                    "connection_pool_configs": {
                        "max_size": 10,
                        "max_overflow": 10,
                        "timeout": 10,
                        "recycle": 3600,
                        "stale": 45
                    },
                    "connection_parameters": {
                        "host": "127.0.0.1"
                    }
                }
            }
        },
    }
}
```

Let me explain these configuration keys from the top:
* `worker1`: the worker name.
  A custom name you choose for the worker; later you will use this name to send tasks to this particular worker.
* `initializer`: the worker initializer class path.
  The dotted path to the worker initializer class. You can use `easy_job.workers.rabbitmq.RabbitMQInitializer` or `easy_job.workers.mpqueue.MPQueueInitializer`, or create your own initializer.
* `count`: how many workers of this type you need.
* `logger_name`: the name of the logger these workers will send their logs to (this is different from the result log).
* `result_backend`: if you would like to store the result of each task, you can set a result backend; it is generally optional. To configure it you provide:
    * `type`: the result backend class.
      You can use `poseidon_async.result_backends.log.LogResultBackend` or create your own result backend.
    * `options`: options for the selected result backend.
      For example, a Log result backend needs `log_level` and `logger_name`; other result backends may need other options.
* `options`: depending on the type of worker you chose, it may have special options and configurations, which you provide in this dictionary.
  In this particular example, which uses RabbitMQ, the following options are available:
    * `queue_name`: the name of the RabbitMQ queue used to transfer messages between the main process and the workers.
    * `serialization_method`: the serialization method, either `json` or `pickle`.
    * `rabbitmq_configs`: configurations related to RabbitMQ; the following is an example configuration:
```
{
"connection_pool_configs": {
"max_size": 10,
"max_overflow": 10,
"timeout": 10,
"recycle": 3600,
"stale": 45
},
"connection_parameters": {
"host": "127.0.0.1"
}
}
```
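The choice of `serialization_method` matters because the two formats handle different payloads. A quick illustration with the standard library (this is not easy-job's own code, just the underlying trade-off):

```python
import json
import pickle
from datetime import datetime

# A task payload containing a non-basic type (datetime)
payload = {"args": (1, "two"), "kwargs": {"when": datetime(2017, 2, 6)}}

# pickle round-trips arbitrary Python objects, including datetime
restored = pickle.loads(pickle.dumps(payload))
print(restored["kwargs"]["when"].year)  # 2017

# json handles only basic types and rejects the datetime
try:
    json.dumps(payload)
except TypeError as exc:
    print("json cannot serialize this payload:", exc)
```

In short: `json` keeps messages portable and human-readable but restricts arguments to basic types, while `pickle` accepts arbitrary Python objects at the cost of being Python-specific.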
#### 2. Call init() in wsgi.py
Open your wsgi.py file and add the following:
```
import easy_job
easy_job.init()
```
This little piece of code initializes easy-job and creates all your workers.

#### 3. Run your tasks
Somewhere in your project, run your tasks:

```
import easy_job
runner = easy_job.getRunner("my_worker")  # the name of the worker to send your tasks to
runner.run(
"path.to.your.function",
args=('unnamed','parameters','here'),
kwargs={'named':'parameters'}
)
```
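For the dotted path to resolve, the task must be a module-level function the workers can import. A minimal sketch (the module and function names here are hypothetical, not part of easy-job):

```python
# myapp/tasks.py -- hypothetical module; the dotted path passed to run()
# must point at a module-level function like this one.
def send_welcome_email(user_id, greeting="Hello"):
    """Executed inside a worker process, away from the web request."""
    return "{}, user {}".format(greeting, user_id)

# Elsewhere in the project you would dispatch it by its dotted path:
#   runner = easy_job.getRunner("worker1")
#   runner.run("myapp.tasks.send_welcome_email",
#              args=(42,), kwargs={"greeting": "Welcome"})

# The worker ends up making the equivalent direct call:
print(send_welcome_email(42, greeting="Welcome"))  # Welcome, user 42
```

Lambdas, nested functions, and instance methods will not work here, since the worker process has to re-import the callable by name.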

that's it

### Some more options:

#### Specifying a retry policy when running tasks
easy-job uses the `retrying` package to retry failed tasks. If you intend to use this feature, call the `run()`
function with a `retry_policy` parameter, like this:
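The exact `retry_policy` format is cut off in this diff, but conceptually a retry policy is just bounded re-invocation on failure. A minimal, self-contained sketch of the idea (this is plain Python, not easy-job's or `retrying`'s own code; all names are illustrative):

```python
import time

def run_with_retries(func, max_attempts=3, wait_seconds=0.0):
    """Call func, retrying on any exception up to max_attempts times."""
    for attempt in range(1, max_attempts + 1):
        try:
            return func()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: re-raise the last failure
            time.sleep(wait_seconds)

# A task that fails twice, then succeeds on the third call
calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky, max_attempts=5)
print(result)  # ok
```

A real retry policy typically adds knobs like a wait strategy (fixed or exponential backoff) and a filter for which exceptions are retryable.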
