diff --git a/api/index.html b/api/index.html
index abc1aa4..4739a56 100644
--- a/api/index.html
+++ b/api/index.html
@@ -483,29 +483,31 @@

GET /api/workflows

Accept: application/json +

Parameters:

Example response:

-HTTP/1.1 200 OK
+HTTP/1.1 200 OK
 
 [
-    {
-        "created": "2020-02-06T13:56:51",
-        "fullname": "example.ETL",
-        "id": "29e7ef80-fa1b-4b91-8ccb-ef01a91601db",
-        "name": "ETL",
-        "payload": {"foo": "bar"},
-        "periodic": false,
-        "project": "example",
-        "status": "pending",
-        "updated": "2020-02-06T13:56:51"
-    }
+    {
+        "created": "2020-02-06T13:56:51",
+        "fullname": "example.ETL",
+        "id": "29e7ef80-fa1b-4b91-8ccb-ef01a91601db",
+        "name": "ETL",
+        "payload": {"foo": "bar"},
+        "periodic": false,
+        "project": "example",
+        "status": "pending",
+        "updated": "2020-02-06T13:56:51"
+    }
 ]
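
Any HTTP client can consume this endpoint. As an illustration only, the sketch below uses the third-party requests package (an assumption, not something Director ships) against a local instance on http://localhost:8000, the address used in the other examples of this documentation:

# list_workflows.py -- minimal sketch, assumes the API listens on http://localhost:8000
import requests

response = requests.get(
    "http://localhost:8000/api/workflows",
    headers={"Accept": "application/json"},
)
response.raise_for_status()

for workflow in response.json():
    # each item follows the structure of the example response above
    print(workflow["id"], workflow["fullname"], workflow["status"])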
 
+

GET /api/workflows/<id>

Get the details of a specific workflow instance, including its tasks.

Example request:

@@ -514,55 +516,57 @@

GET /api/workflows/<id>

Accept: application/json

+

Example response:

-HTTP/1.1 200 OK
-
-{
-    "created": "2020-02-06T13:56:51",
-    "fullname": "example.ETL",
-    "id": "29e7ef80-fa1b-4b91-8ccb-ef01a91601db",
-    "name": "ETL",
-    "payload": {},
-    "periodic": false,
-    "project": "example",
-    "status": "pending",
-    "tasks": [
-        {
-            "created": "2020-02-06T13:56:51",
-            "id": "c8606f67-9923-4c84-bc41-69efacb0c7cb",
-            "key": "EXTRACT",
-            "previous": [],
-            "status": "pending",
-            "task": "c8606f67-9923-4c84-bc41-69efacb0c7cb",
-            "updated": "2020-02-06T13:56:51"
-        },
-        {
-            "created": "2020-02-06T13:56:51",
-            "id": "35a2d47b-8105-4d03-becb-7eb48f8c062e",
-            "key": "TRANSFORM",
-            "previous": [
-                "c8606f67-9923-4c84-bc41-69efacb0c7cb"
-            ],
-            "status": "pending",
-            "task": "35a2d47b-8105-4d03-becb-7eb48f8c062e",
-            "updated": "2020-02-06T13:56:51"
-        },
-        {
-            "created": "2020-02-06T13:56:51",
-            "id": "e5a8eb49-0a8c-4063-ad08-a5e9e7bd49d2",
-            "key": "LOAD",
-            "previous": [
-                "35a2d47b-8105-4d03-becb-7eb48f8c062e"
-            ],
-            "status": "pending",
-            "task": "e5a8eb49-0a8c-4063-ad08-a5e9e7bd49d2",
-            "updated": "2020-02-06T13:56:51"
-        }
-    ],
-    "updated": "2020-02-06T13:56:51"
-}
+HTTP/1.1 200 OK
+
+{
+    "created": "2020-02-06T13:56:51",
+    "fullname": "example.ETL",
+    "id": "29e7ef80-fa1b-4b91-8ccb-ef01a91601db",
+    "name": "ETL",
+    "payload": {},
+    "periodic": false,
+    "project": "example",
+    "status": "pending",
+    "tasks": [
+        {
+            "created": "2020-02-06T13:56:51",
+            "id": "c8606f67-9923-4c84-bc41-69efacb0c7cb",
+            "key": "EXTRACT",
+            "previous": [],
+            "status": "pending",
+            "task": "c8606f67-9923-4c84-bc41-69efacb0c7cb",
+            "updated": "2020-02-06T13:56:51"
+        },
+        {
+            "created": "2020-02-06T13:56:51",
+            "id": "35a2d47b-8105-4d03-becb-7eb48f8c062e",
+            "key": "TRANSFORM",
+            "previous": [
+                "c8606f67-9923-4c84-bc41-69efacb0c7cb"
+            ],
+            "status": "pending",
+            "task": "35a2d47b-8105-4d03-becb-7eb48f8c062e",
+            "updated": "2020-02-06T13:56:51"
+        },
+        {
+            "created": "2020-02-06T13:56:51",
+            "id": "e5a8eb49-0a8c-4063-ad08-a5e9e7bd49d2",
+            "key": "LOAD",
+            "previous": [
+                "35a2d47b-8105-4d03-becb-7eb48f8c062e"
+            ],
+            "status": "pending",
+            "task": "e5a8eb49-0a8c-4063-ad08-a5e9e7bd49d2",
+            "updated": "2020-02-06T13:56:51"
+        }
+    ],
+    "updated": "2020-02-06T13:56:51"
+}
 
+

POST /api/workflows

Execute a new workflow.

Example request:

@@ -577,22 +581,24 @@

POST /api/workflows

+

Example response:

-HTTP/1.1 201 CREATED
-
-{
-    "created": "2020-02-06T14:01:02",
-    "fullname": "example.ETL",
-    "id": "43e70707-b661-42e1-a7df-5b98851ae340",
-    "name": "ETL",
-    "payload": {},
-    "periodic": false,
-    "project": "example",
-    "status": "pending",
-    "updated": "2020-02-06T14:01:02"
-}
+HTTP/1.1 201 CREATED
+
+{
+    "created": "2020-02-06T14:01:02",
+    "fullname": "example.ETL",
+    "id": "43e70707-b661-42e1-a7df-5b98851ae340",
+    "name": "ETL",
+    "payload": {},
+    "periodic": false,
+    "project": "example",
+    "status": "pending",
+    "updated": "2020-02-06T14:01:02"
+}
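
This endpoint can also be called from Python. The sketch below (again assuming the requests package and a local instance) submits the example.ETL workflow and then polls GET /api/workflows/<id> until the status leaves its initial values; treat the status names other than pending as assumptions and check them against your own responses:

# run_and_wait.py -- minimal sketch, assumes the requests package and a local Director API
import time

import requests

BASE_URL = "http://localhost:8000/api"

# Submit the example.ETL workflow shown above (the payload can stay empty).
created = requests.post(
    f"{BASE_URL}/workflows",
    json={"project": "example", "name": "ETL", "payload": {}},
)
created.raise_for_status()
workflow_id = created.json()["id"]

# Poll the detail endpoint until the workflow leaves its initial statuses.
while True:
    detail = requests.get(f"{BASE_URL}/workflows/{workflow_id}").json()
    if detail["status"] not in ("pending", "progress"):  # assumed status names
        break
    time.sleep(2)

print(detail["status"], [task["key"] for task in detail["tasks"]])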
 
+

GET /api/ping

Health endpoint used to monitor the Director API.

Example request:

@@ -601,6 +607,7 @@

GET /api/ping

Accept: application/json
+

Example response:

HTTP/1.1 200 OK
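
If you monitor this endpoint from a script, the probe can stay very small; a sketch, assuming the requests package and a local instance:

# ping_director.py -- minimal health probe sketch
import sys

import requests

try:
    response = requests.get("http://localhost:8000/api/ping", timeout=5)
    response.raise_for_status()
except requests.RequestException as exc:
    print(f"Director API unreachable: {exc}")
    sys.exit(1)

print("Director API is up")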
 
diff --git a/guides/build-workflows/index.html b/guides/build-workflows/index.html
index f541df8..fd4bd44 100644
--- a/guides/build-workflows/index.html
+++ b/guides/build-workflows/index.html
@@ -496,6 +496,7 @@ 

Build Workflows

pass
+

Chaining multiple tasks

Chaining these tasks in the workflows.yml file is pretty simple:

# Chain example
@@ -511,6 +512,7 @@ 

Chaining multiple tasks

- C
+

In this example each task will be executed one after the other: first task A, then task B and finally task C.
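
The names A, B and C in the YAML refer to tasks registered in your Python code through the name parameter of the task() decorator; a minimal sketch of such stubs (module path and function bodies are placeholders) could look like this:

# tasks/example.py -- illustrative stubs only
from director import task

@task(name="A")
def a(*args, **kwargs):
    pass

@task(name="B")
def b(*args, **kwargs):
    pass

@task(name="C")
def c(*args, **kwargs):
    pass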

Launch tasks in parallel

@@ -534,12 +536,14 @@

Launch tasks in parallel

- C
+

In this example the group is named GROUP_1, but it can be anything. The important thing is to keep the group names unique when your workflow contains multiple groups.

Periodic workflows

Celery provides a scheduler used to periodically execute some tasks. This scheduler is named the Celery beat.

-

Director allows you to periodically schedule a whole workflow using a simple YAML syntax :

+

Director allows you to periodically schedule a whole workflow using a simple YAML syntax.

+

First example:

example.CHAIN:
   tasks:
     - A
@@ -549,12 +553,27 @@ 

Periodic workflows

schedule: 60
-

The periodic > schedule key takes an integer argument (the unit is the second). So in this example the example.CHAIN workflow will be executed every 60 seconds.

+ +

Second example:

+example.CHAIN_CRONTAB:
+  tasks:
+    - A
+    - B
+    - C
+  periodic:
+    schedule: "* */3 * * *"
+
+ + +

The periodic > schedule key takes an integer (the unit is the second) or a string argument (crontab syntax). So in the first example, the example.CHAIN workflow will be executed every 60 seconds, and the second one, example.CHAIN_CRONTAB, every three hours.

Please note that the scheduler must be started to handle periodic workflows:

$ director celery beat
 
+

Tip

Celery also accepts the -B option when launching a worker:

@@ -573,6 +592,7 @@

Use of queues in Workflows

queue: q1
+

You need to start the Celery worker instance with the --queues option:

$ director celery worker --loglevel=INFO --queues=q1
 
diff --git a/guides/enable-authentication/index.html b/guides/enable-authentication/index.html
index f6e49da..13a5785 100644
--- a/guides/enable-authentication/index.html
+++ b/guides/enable-authentication/index.html
@@ -441,6 +441,7 @@

Manage user

$ director user [create|list|update|delete]
 
+

Create user example:

$ director user create john
 
diff --git a/guides/error-tracking/index.html b/guides/error-tracking/index.html
index 26eefe9..f6e7a3a 100644
--- a/guides/error-tracking/index.html
+++ b/guides/error-tracking/index.html
@@ -395,6 +395,7 @@

Error Tracking

DIRECTOR_SENTRY_DSN="https://xyz@sentry.example.com/0"
 
+

Let's imagine the following workflow:

# workflows.yml
 ---
@@ -404,6 +405,7 @@ 

Error Tracking

- ERROR_TASK
+

With the associated tasks:

# tasks/example.py
 from director import task
@@ -417,6 +419,7 @@ 

Error Tracking

print(1/0)
+

When a Celery worker executes this code, an issue will be created in Sentry with the ZeroDivisionError:

Sentry issue

In order to group the issues by workflow name or by project, Director associates some tags to the event:

diff --git a/guides/run-workflows/index.html b/guides/run-workflows/index.html
index 8d04a42..622054b 100644
--- a/guides/run-workflows/index.html
+++ b/guides/run-workflows/index.html
@@ -471,16 +471,18 @@

Using the CLI

$ director workflow run ovh.MY_WORKFLOW
 
+

Using the API

You can run a workflow using a POST request on the Director API. This is very convenient if your applications are based on web services.

The request is a POST on the /api/workflows endpoint:

-$ curl --header "Content-Type: application/json" \
-  --request POST \
-  --data '{"project":"ovh", "name": "MY_WORKFLOW", "payload": {}}' \
-  http://localhost:8000/api/workflows
+$ curl --header "Content-Type: application/json" \
+  --request POST \
+  --data '{"project":"ovh", "name": "MY_WORKFLOW", "payload": {}}' \
+  http://localhost:8000/api/workflows
 
+

Technical explanation

To really understand this feature it's important to know how native Celery works.

diff --git a/guides/use-payload/index.html b/guides/use-payload/index.html
index c729edc..b9b45e3 100644
--- a/guides/use-payload/index.html
+++ b/guides/use-payload/index.html
@@ -486,6 +486,7 @@

Use Payload

- SEND_MAIL
+

This use case is simple:

  1. the first task creates an order for a specific product,
  2.

@@ -497,13 +498,15 @@

    Send payload

    $ director workflow run product.ORDER '{"user": 1234, "product": 1000}'
     
    +

    or

    -$ curl --header "Content-Type: application/json" \
    -  --request POST \
    -  --data '{"project": "product", "name": "ORDER", "payload": {"user": 1234, "product": 1000}}' \
    -  http://localhost:8000/api/workflows
    +$ curl --header "Content-Type: application/json" \
    +  --request POST \
    +  --data '{"project": "product", "name": "ORDER", "payload": {"user": 1234, "product": 1000}}' \
    +  http://localhost:8000/api/workflows
     
    +

    Handle payload

You can handle the payload in the code using the kwargs dictionary:

    @task(name="ORDER_PRODUCT")
    @@ -524,6 +527,7 @@ 

    Handle payload

    mail.send()
    +

    As you can see the payload is forwarded to all the tasks contained in your workflow.
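
To make this concrete, here is a sketch of what the two tasks of the product.ORDER workflow could look like; it assumes the payload is exposed to each task through kwargs["payload"], and print() stands in for your real ordering and mailing code:

# tasks/product.py -- sketch only; the payload is assumed to be forwarded to every
# task through kwargs["payload"], and print() stands in for your own code
from director import task

@task(name="ORDER_PRODUCT")
def order_product(*args, **kwargs):
    payload = kwargs["payload"]
    print(f"ordering product {payload['product']} for user {payload['user']}")
    return {"user": payload["user"], "product": payload["product"]}

@task(name="SEND_MAIL")
def send_mail(*args, **kwargs):
    # the return values of the parent tasks arrive through args
    print(f"sending a confirmation mail for {args}")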

    Create the schema

    @@ -542,6 +546,7 @@

    Create the schema

    }
    +

Then you can reference it in your workflow using the schema keyword:

    product.ORDER:
       tasks:
    @@ -550,6 +555,7 @@ 

    Create the schema

    schema: order
    +

    Tip

You can host your schemas in subfolders (e.g. $DIRECTOR_HOME/schemas/foo/bar/baz.json)

@@ -562,6 +568,7 @@

    Create the schema

    Aborted!
    +

    The API returns a 400 Bad request error.

    Periodic workflows

Celery Director provides a YAML syntax to periodically schedule a workflow.

@@ -574,6 +581,7 @@

    Periodic workflows

    payload: {"user": False} +

The corresponding task can easily handle this default value:

    @task(name="UPDATE_CACHE")
     def update_cache(*args, **kwargs):
    @@ -584,6 +592,7 @@ 

    Periodic workflows

    return update_user(user)
    +

This way the whole list of users will be updated every hour, and a manual update can be done on a specific user:

    $ director workflow run users.UPDATE_CACHE '{"user": "john.doe"}'
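
For completeness, a full version of the UPDATE_CACHE task sketched above might look like this; update_user and update_all_users are stand-ins for your own code, and the payload is again assumed to arrive in kwargs["payload"]:

# tasks/users.py -- sketch of the UPDATE_CACHE task and its default payload
from director import task

def update_user(user):
    print(f"updating the cache for {user}")  # stand-in for your own code

def update_all_users():
    print("updating the cache for every user")  # stand-in for your own code

@task(name="UPDATE_CACHE")
def update_cache(*args, **kwargs):
    user = kwargs["payload"]["user"]
    if not user:
        # periodic runs use the default payload {"user": False}
        return update_all_users()
    return update_user(user)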
    diff --git a/guides/write-tasks/index.html b/guides/write-tasks/index.html
    index 2c24996..8911e3c 100644
    --- a/guides/write-tasks/index.html
    +++ b/guides/write-tasks/index.html
    @@ -493,6 +493,7 @@ 

    Create a task

    pass
    +

    Warning

The name parameter in the task decorator is mandatory, because it will be used in the YAML file to reference the task.

@@ -530,6 +531,7 @@

    Task signature

    print(args)
    +

The following workflows present different use cases and the output of the C task (see the Build Workflows guide to understand the YAML format):

    example.NO_PARENT:
    @@ -555,6 +557,7 @@ 

    Task signature

    # Result : ([{'result': 'a_data'}, {'result': 'b_data'}],)
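
To connect this output to actual code, tasks A and B could simply return dictionaries while C prints whatever it receives; a minimal sketch:

# tasks/example.py -- sketch matching the output discussed above
from director import task

@task(name="A")
def a(*args, **kwargs):
    return {"result": "a_data"}

@task(name="B")
def b(*args, **kwargs):
    return {"result": "b_data"}

@task(name="C")
def c(*args, **kwargs):
    # with A and B as parents (e.g. run in a group), args contains their results:
    # ([{'result': 'a_data'}, {'result': 'b_data'}],)
    print(args)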
    +

    Bound Tasks

Celery allows you to bind a task, providing the task instance itself as the first parameter.

    @@ -567,6 +570,7 @@

    Bound Tasks

    print(self.name) +
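
A complete bound task could look like the sketch below; bind=True is the standard Celery option and is assumed to pass straight through Director's task() wrapper (see the next section):

# sketch of a bound task -- self is the task instance itself
from director import task

@task(bind=True, name="BOUND_TASK")
def bound_task(self, *args, **kwargs):
    print(self.name)        # e.g. "BOUND_TASK"
    print(self.request.id)  # Celery id of the current task run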

    Celery Task Options

    The task() decorator provided by Director is just a wrapper of the native app.task() decorator provided by Celery, so all the original options are still available.
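
For example, the standard Celery retry options can be passed directly to the decorator; the sketch below assumes they are forwarded unchanged to app.task() (the option names are Celery's, not Director-specific):

# sketch: native Celery options passed through Director's task() decorator
from director import task

@task(
    name="FLAKY_HTTP_CALL",
    autoretry_for=(ConnectionError,),  # retry automatically on this exception
    retry_backoff=True,                # exponential backoff between retries
    max_retries=5,
)
def flaky_http_call(*args, **kwargs):
    pass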

diff --git a/index.html b/index.html
index 057e8bb..8f40b51 100644
--- a/index.html
+++ b/index.html
@@ -568,6 +568,7 @@

    Installation

    pip install celery-director
     
    +

    Usage

    Write your code in Python

    # tasks/orders.py
    @@ -592,6 +593,7 @@ 

    Write your code in Python

    mail.send()
    +

    Build your workflows in YAML

    # workflows.yml
     product.ORDER:
    @@ -600,11 +602,13 @@ 

    Build your workflows in YAML

    - SEND_MAIL
    +

    Run it

You can simply test your workflow locally:

    $ director workflow run product.ORDER '{"user": 1234, "product": 1000}'
     
    +

And run it in production using the Director API:

    $ curl --header "Content-Type: application/json" \
       --request POST \
    @@ -612,6 +616,7 @@ 

    Run it

    http://localhost:8000/api/workflows
    +

    Project layout

    .env                # The configuration file.
     workflows.yml       # The workflows definition.
    @@ -620,6 +625,7 @@ 

    Project layout

    ... # Other files containing other tasks.
    +

    Commands