This repository contains a sample application that demonstrates how to use Cloudflare Queues and Dapr to build an event-driven application.
The Dapr application (producer) in this repository will run locally and publish messages to a Cloudflare queue. A Cloudflare worker (consumer) will read the messages from the queue and write them to the console.
The following is required to run this sample:
- Clone this repository to your local machine.
- Install the Dapr CLI.
- Use the Dapr CLI to install the Dapr runtime locally:

  ```bash
  dapr init
  ```

- Install Node.js.
- Install Cloudflare Wrangler.
- Ensure you're on a Cloudflare paid plan, since that is required to use Cloudflare Queues.
- Enable Queues in the Cloudflare dashboard:
  - Dashboard > Workers > Queues
  - Enable Queues Beta
The application architecture consists of three parts:
- A Cloudflare queue
- A consumer Cloudflare worker that reads messages from the queue.
- A producer Dapr app that will publish messages to the queue.
1. Open a terminal and use the wrangler CLI to log in to Cloudflare:

   ```bash
   wrangler login
   ```

   Follow the instructions in the browser to log in to Cloudflare. The response in the terminal should end with:

   ```
   Successfully logged in.
   ```

2. Create the Cloudflare queue using the wrangler CLI:

   ```bash
   wrangler queues create dapr-messages
   ```

   The response in the terminal should end with:

   ```
   Created queue dapr-messages.
   ```
You can either create a new consumer worker by following steps 1-3, or use the existing consumer worker in this repository and continue from step 4.
1. In the root folder, create a worker to consume messages:

   ```bash
   wrangler init consumer
   ```

   Answer the prompts as follows:

   - Create package.json: `Y`
   - Use TypeScript: `Y`
   - Create worker: `Fetch handler`
   - Write tests: `N`

   A new folder named `consumer` will be created which contains the worker.

2. Update the `consumer/src/index.ts` file to:

   ```typescript
   export default {
     async queue(
       batch: MessageBatch<Error>,
       env: Env
     ): Promise<void> {
       let messages = JSON.stringify(batch.messages);
       console.log(messages);
     },
   };
   ```

   An optional variant of this handler that logs each message individually is sketched after these steps.

3. Add the following lines to the `consumer/wrangler.toml` file:

   ```toml
   [[queues.consumers]]
   queue = "dapr-messages"
   max_batch_size = 1
   ```

4. Ensure that you're in the `consumer` folder and install the dependencies:

   ```bash
   cd consumer
   npm install
   ```

5. Publish the consumer worker:

   ```bash
   wrangler publish
   ```

   The response in the terminal should end with:

   ```
   Published consumer (... sec)
     https://consumer.<SUBDOMAIN>.workers.dev
     Consumer for dapr-messages
   Current Deployment ID: <DEPLOYMENT_ID>
   ```

6. Start tailing the log of the consumer worker:

   ```bash
   wrangler tail
   ```
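As an optional alternative to the handler in step 2, you can log each message in the batch on its own line. This is a minimal sketch, not code that exists in this repository; it assumes the empty `Env` interface and the Queues types from `@cloudflare/workers-types` that the generated worker project already provides:

```typescript
// consumer/src/index.ts (variant): log every queue message individually.
// MessageBatch and Message come from @cloudflare/workers-types.
export interface Env {}

export default {
  async queue(batch: MessageBatch<unknown>, env: Env): Promise<void> {
    for (const message of batch.messages) {
      // Each message carries an id, an enqueue timestamp, and the published body.
      console.log(message.id, message.timestamp.toISOString(), JSON.stringify(message.body));
    }
  },
};
```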
The Cloudflare binding uses a Cloudflare worker to publish messages since only Cloudflare workers can access the queue.
There are two options for this worker:
- Dapr provisions the worker.
- You use a pre-provisioned Cloudflare worker.
This sample uses option 1. Read the Cloudflare Queues binding spec and choose Manually provision the Worker script if you want to go for option 2.
1. Rename `producer/resources/binding.yaml.template` to `producer/resources/binding.yaml`.

2. Open the `binding.yaml` file and inspect its content:

   ```yaml
   apiVersion: dapr.io/v1alpha1
   kind: Component
   metadata:
     name: cloudflare-queues
   spec:
     type: bindings.cloudflare.queues
     version: v1
     # Increase the initTimeout if Dapr is managing the Worker for you
     initTimeout: "120s"
     metadata:
       # Name of the existing Cloudflare Queue (required)
       - name: queueName
         value: "dapr-messages"
       # Name of the Worker (required)
       - name: workerName
         value: "dapr-message-worker"
       # PEM-encoded private Ed25519 key (required)
       - name: key
         value: |
           -----BEGIN PRIVATE KEY-----
           MC4CAQ...
           -----END PRIVATE KEY-----
       # Cloudflare account ID (required to have Dapr manage the Worker)
       - name: cfAccountID
         value: ""
       # API token for Cloudflare (required to have Dapr manage the Worker)
       - name: cfAPIToken
         value: ""
       # URL of the Worker (required if the Worker has been pre-created outside of Dapr)
       - name: workerUrl
         value: ""
   ```

   The `metadata.name`, `spec.metadata.queueName`, and `spec.metadata.workerName` values have already been set. Ensure that the `queueName` matches the `queue` setting in the consumer worker `wrangler.toml` file. Values for `spec.metadata.key`, `spec.metadata.cfAccountID`, and `spec.metadata.cfAPIToken` still need to be provided.

3. Follow the instructions in the Dapr docs to set the value for `spec.metadata.key`. One way to generate such a key locally is sketched after these steps.

4. Set the Cloudflare account ID in the `spec.metadata.cfAccountID` field. You can find the account ID in the Cloudflare dashboard URL: `https://dash.cloudflare.com/<ACCOUNT_ID>/workers/overview`.

5. Set a Cloudflare API token in the `spec.metadata.cfAPIToken` field. It can be generated as follows:

   - In the Cloudflare dashboard, go to the Workers page.
   - Click the API tokens link.
   - Click the Create token button.
   - Click the Use template button for Edit Cloudflare Workers.
   - Update the permissions to only contain:
     - Account | Worker Scripts | Edit
   - Update the Account Resources to only contain:
     - Include | `<YOUR ACCOUNT>`
   - Set a time to live (TTL) for the token; the shorter the better if you're just testing.
Now the binding file is complete. The file is gitignored so the secrets won't be committed to the repository.
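The Dapr docs are the source of truth for how to create the Ed25519 key. As a rough sketch (assuming the binding accepts a PKCS#8, PEM-encoded Ed25519 private key, matching the `-----BEGIN PRIVATE KEY-----` placeholder above), such a key pair can also be generated locally with Node's built-in crypto module:

```typescript
// generate-key.ts (sketch): create a PEM-encoded Ed25519 key pair with Node.js.
// The printed private key block is what goes into spec.metadata.key in binding.yaml.
import { generateKeyPairSync } from "node:crypto";

const { publicKey, privateKey } = generateKeyPairSync("ed25519", {
  publicKeyEncoding: { type: "spki", format: "pem" },
  privateKeyEncoding: { type: "pkcs8", format: "pem" },
});

console.log(privateKey);
console.log(publicKey);
```

Run it with, for example, `npx ts-node generate-key.ts` and paste the private key block into `binding.yaml`.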
Let's have a look at the Dapr app that will send the messages to the Cloudflare queue.
Inspect the `producer/index.ts` file:

```typescript
import { DaprClient } from "@dapr/dapr";

// Common settings
const daprHost = "http://localhost";
const daprPort = process.env.DAPR_HTTP_PORT || "3500";

async function main() {
  console.log("Starting...");

  const bindingName = "cloudflare-queues";
  const bindingOperation = "publish";
  const client = new DaprClient(daprHost, daprPort);

  for (var i = 1; i <= 10; i++) {
    const message = { data: "Hello World " + i };
    const response = await client.binding.send(bindingName, bindingOperation, message);
    if (response) {
      console.log(response);
    }
    await sleep(1000);
  }

  console.log("Completed.");
}

async function sleep(ms: number) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

main().catch((e) => {
  console.error(e);
  process.exit(1);
})
```

Note that the `bindingName` is set to `cloudflare-queues` and matches the `metadata.name` value in `binding.yaml`. The `bindingOperation` is set to `publish` (`create` could be used as an alias).
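For reference, `client.binding.send` resolves to a call against the Dapr sidecar's output binding endpoint (`POST /v1.0/bindings/<name>`). The following is a minimal sketch of the equivalent raw HTTP request, not code used by the sample itself; it assumes the default Dapr HTTP port of 3500 and Node.js 18+ for the built-in `fetch`:

```typescript
// Roughly what DaprClient.binding.send does for this sample's output binding.
const daprPort = process.env.DAPR_HTTP_PORT || "3500";

async function publishRaw(): Promise<void> {
  const res = await fetch(`http://localhost:${daprPort}/v1.0/bindings/cloudflare-queues`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      operation: "publish",            // "create" works as an alias
      data: { data: "Hello World 1" }, // ends up as the body of the queue message
    }),
  });
  console.log(res.status);
}

publishRaw().catch(console.error);
```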
1. Open a new terminal window and navigate to the `producer` folder.

2. Install the dependencies:

   ```bash
   npm install
   ```

3. Run the following command to start the producer app:

   ```bash
   dapr run --app-id producer --resources-path ./resources -- npm run start
   ```

4. The terminal that tails the consumer worker log should show a log statement for each of the ten messages sent:

   ```
   Unknown Event - Ok @ 17/02/2023, 11:22:50
     (log) [{"body":"{\"data\":\"Hello World 1\"}","timestamp":"2023-02-17T10:22:50.556Z","id":"8f6293d9d04001e3f2a12be5c47acde2"}]
   ...
   ```
If you don't want to keep the Cloudflare workers and queue, you can delete them as follows:
1. Disconnect the `consumer` worker from the queue:

   ```bash
   wrangler queues consumer remove dapr-messages consumer
   ```

   The response in the terminal should end with:

   ```
   Removed consumer from queue dapr-messages.
   ```

2. Delete the `consumer` worker:

   ```bash
   wrangler delete consumer
   ```

   Type `Y` to confirm the deletion of the worker. The response in the terminal should end with:

   ```
   Successfully deleted consumer
   ```

3. Delete the Dapr-generated `dapr-message-worker` worker:

   ```bash
   wrangler delete --name dapr-message-worker
   ```

   Type `Y` to confirm the deletion of the worker. The response in the terminal should end with:

   ```
   Successfully deleted dapr-message-worker
   ```

4. Delete the `dapr-messages` queue:

   ```bash
   wrangler queues delete dapr-messages
   ```

   The response in the terminal should end with:

   ```
   Deleted queue dapr-messages.
   ```
Read about the Dapr Cloudflare Queues bindings spec on the Dapr docs site.
Any questions or comments about this sample? Join the Dapr Discord and post a message in the `#components-contrib` channel.

Have you made something with Cloudflare and Dapr? Post a message in the `#show-and-tell` channel; we'd love to see your creations!