
D3E pub/sub sample with Kafka OAuth2 Private Key JWT

This quickstart uses a publisher microservice, checkout, and a subscriber microservice, order-processor, to demonstrate how Dapr enables a publish-subscribe pattern. checkout generates messages and publishes them to the orders topic, and the order-processor subscriber listens for messages on the orders topic (a minimal sketch of both apps follows the list below).

Visit the Dapr documentation for more information about Dapr and pub/sub.

This quickstart includes one publisher:

  • Java client message generator checkout

And one subscriber:

  • Java subscriber order-processor
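
The application code itself is not reproduced in this README, but the core of each app can be sketched with the Dapr Java SDK. These are two separate source files, and the class names, the pub/sub component name orderpubsub, and the one-second delay are assumptions, not values taken from this repo:

// Publisher sketch (checkout): publishes order ids 1..10 to the "orders" topic.
import io.dapr.client.DaprClient;
import io.dapr.client.DaprClientBuilder;

public class CheckoutSketch {
    private static final String PUBSUB_NAME = "orderpubsub"; // assumed component name
    private static final String TOPIC_NAME = "orders";

    public static void main(String[] args) throws Exception {
        try (DaprClient client = new DaprClientBuilder().build()) {
            for (int i = 1; i <= 10; i++) {
                // Publish the order id through the Dapr sidecar to the Kafka pub/sub component
                client.publishEvent(PUBSUB_NAME, TOPIC_NAME, i).block();
                System.out.println("Published data: " + i);
                Thread.sleep(1000);
            }
        }
    }
}

// Subscriber sketch (order-processor): a Spring Boot controller that Dapr delivers "orders" messages to.
import io.dapr.Topic;
import io.dapr.client.domain.CloudEvent;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;

@RestController
public class OrderProcessingSketchController {
    @Topic(name = "orders", pubsubName = "orderpubsub") // assumed component name
    @PostMapping(path = "/orders")
    public Mono<Void> handleOrder(@RequestBody(required = false) CloudEvent<Integer> event) {
        // Dapr wraps the published payload in a CloudEvent envelope
        return Mono.fromRunnable(() -> System.out.println("Subscriber received: " + event.getData()));
    }
}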

Pre-requisites

  • Docker Desktop
  • Java JDK 21 (or greater)
  • Apache Maven version 3.x.

Initialize Dapr on your local desktop:

  1. Download and unzip the correct Dapr D3E bundle for your desktop architecture, which was sent over by the Diagrid team:
  • daprbundle_linux_arm64.tar.gz
  • daprbundle_linux_amd64.tar.gz
  • daprbundle_windows_amd64.zip
  2. Navigate to the folder you unzipped and initialize Dapr from this location using the following command:
./dapr init --from-dir .

⌛  Making the jump to hyperspace...
⚠  Local bundle installation using --from-dir flag is currently a preview feature and is subject to change. It is only available from CLI version 1.7 onwards.
ℹ️  Installing runtime version 1.14.5-d3e.2
↙  Extracting binaries and setting up components...
✅  Extracting binaries and setting up components...
✅  Extracted binaries and completed components set up.
ℹ️  daprd binary has been installed to /root/.dapr/bin.
ℹ️  dapr_placement container is running.
ℹ️  dapr_scheduler container is running.
✅  Success! Dapr is up and running. To get started, go here: https://docs.dapr.io/getting-started
  3. Make sure the installed Dapr version is the dapr-1.14.5-d3e.2 version specified in the details.json file of the unzipped archive.
cat details.json
{"daprd": "1.14.5-d3e.2", "dashboard": "0.15.0", "cli": "1.14.1", "daprBinarySubDir": "dist", "dockerImageSubDir": "docker", "daprImageName": "public.ecr.aws/diagrid/d3e/dapr:1.14.5-d3e.2", "daprImageFileName": "public.ecr.aws-diagrid-d3e-dapr-1.14.5-d3e.2.tar.gz"}
  4. Test that your Dapr installation is correct by running the sample application against a local Kafka broker without authentication.

Start by spinning up a local Kafka broker:

# Spin up a Kafka container
docker run -d -p 9092:9092 --name broker apache/kafka:latest
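
For reference, a local, unauthenticated Kafka pub/sub component (such as the kafka-local-test.yaml file referenced later) might look like the sketch below. The component name and consumer group are assumptions; check the file in this repo for the actual values.

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: orderpubsub            # assumed component name
spec:
  type: pubsub.kafka
  version: v1
  metadata:
  - name: brokers
    value: "localhost:9092"    # the broker started by the docker command above
  - name: consumerGroup
    value: "order-processor"   # assumed consumer group
  - name: authType
    value: "none"              # no authentication for the local test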

Run all apps with multi-app run template file:

This section shows how to run both applications at once using a multi-app run template file with dapr run -f .. This enables you to test the interactions between multiple applications.
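
The repo's multi-app run template is not shown here, but for these two apps it would look roughly like the sketch below; the port and jar names are assumptions, while the app IDs match the logs further down.

version: 1
apps:
  - appDirPath: ./checkout/
    appID: checkout-sdk
    command: ["java", "-jar", "target/CheckoutService-0.0.1-SNAPSHOT.jar"]          # jar name assumed
  - appDirPath: ./order-processor/
    appID: order-processor-sdk
    appPort: 8080              # assumed Spring Boot port
    command: ["java", "-jar", "target/OrderProcessingService-0.0.1-SNAPSHOT.jar"]   # jar name assumed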

  1. Install dependencies:
cd ./order-processor
mvn clean install
cd ..
cd ./checkout
mvn clean install
cd ..
  2. Open a new terminal window and run the Dapr multi-app run template:
dapr run -f .

You will see both Dapr and application logs in the terminal. Eventually you should see 10 messages being sent via Kafka from the publisher to the subscriber app. At the end, the application logs in the terminal should look similar to this:

== APP - checkout-sdk == 563 [main] INFO com.service.CheckoutServiceApplication - Published data: 1
== APP - order-processor-sdk == 2023-09-04 13:57:18.434  INFO 82828 --- [nio-8080-exec-3] c.s.c.OrderProcessingServiceController   : Subscriber received: 1
== APP - checkout-sdk == 1576 [main] INFO com.service.CheckoutServiceApplication - Published data: 2
== APP - order-processor-sdk == 2023-09-04 13:57:19.419  INFO 82828 --- [nio-8080-exec-4] c.s.c.OrderProcessingServiceController   : Subscriber received: 2
== APP - checkout-sdk == 2587 [main] INFO com.service.CheckoutServiceApplication - Published data: 3
== APP - order-processor-sdk == 2023-09-04 13:57:20.431  INFO 82828 --- [nio-8080-exec-5] c.s.c.OrderProcessingServiceController   : Subscriber received: 3
== APP - checkout-sdk == 3602 [main] INFO com.service.CheckoutServiceApplication - Published data: 4
== APP - order-processor-sdk == 2023-09-04 13:57:21.447  INFO 82828 --- [nio-8080-exec-6] c.s.c.OrderProcessingServiceController   : Subscriber received: 4
== APP - checkout-sdk == 4612 [main] INFO com.service.CheckoutServiceApplication - Published data: 5
== APP - order-processor-sdk == 2023-09-04 13:57:22.455  INFO 82828 --- [nio-8080-exec-7] c.s.c.OrderProcessingServiceController   : Subscriber received: 5
== APP - checkout-sdk == 5624 [main] INFO com.service.CheckoutServiceApplication - Published data: 6
== APP - order-processor-sdk == 2023-09-04 13:57:23.468  INFO 82828 --- [nio-8080-exec-8] c.s.c.OrderProcessingServiceController   : Subscriber received: 6
== APP - checkout-sdk == 6631 [main] INFO com.service.CheckoutServiceApplication - Published data: 7
== APP - order-processor-sdk == 2023-09-04 13:57:24.474  INFO 82828 --- [nio-8080-exec-9] c.s.c.OrderProcessingServiceController   : Subscriber received: 7
== APP - checkout-sdk == 7643 [main] INFO com.service.CheckoutServiceApplication - Published data: 8
== APP - order-processor-sdk == 2023-09-04 13:57:25.487  INFO 82828 --- [io-8080-exec-10] c.s.c.OrderProcessingServiceController   : Subscriber received: 8
== APP - checkout-sdk == 8649 [main] INFO com.service.CheckoutServiceApplication - Published data: 9
== APP - order-processor-sdk == 2023-09-04 13:57:26.492  INFO 82828 --- [nio-8080-exec-2] c.s.c.OrderProcessingServiceController   : Subscriber received: 9
== APP - checkout-sdk == 9662 [main] INFO com.service.CheckoutServiceApplication - Published data: 10
== APP - order-processor-sdk == 2023-09-04 13:57:27.504  INFO 82828 --- [nio-8080-exec-1] c.s.c.OrderProcessingServiceController   : Subscriber received: 10
  3. Stop and clean up the application processes if they are not cleaned up automatically.
dapr stop -f .

Update your Kafka component file with your OAuth2 Private Key JWT settings

  1. Uncomment and update the Kafka component YAML file here with the correct details for your Kafka message broker, including the OAuth2 information. Specifically, you need to enter the following (a component sketch follows this list):
  • Setting authType to oidc_private_key_jwt enables SASL authentication via the OAUTHBEARER mechanism. This supports specifying a private key JWT from an external OAuth2 or OIDC identity provider. Currently, only the client_credentials grant is supported.

  • Configure oidcTokenEndpoint to the full URL for the identity provider access token endpoint.

  • Set oidcClientID to the client ID, oidcClientAssertionCert to the client assertion certificate and oidcClientAssertionKey to the client assertion key provisioned in the identity provider.

  • If caCert is specified in the component configuration, the certificate is appended to the system CA trust for verifying the identity provider certificate. Similarly, if skipVerify is specified in the component configuration, verification will also be skipped when accessing the identity provider.

  • By default, the only scope requested for the token is openid; it is highly recommended that additional scopes be specified via oidcScopes in a comma-separated list and validated by the Kafka broker. If additional scopes are not used to narrow the validity of the access token, a compromised Kafka broker could replay the token to access other services as the Dapr clientID.
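
Put together, a component using these settings might look like the sketch below. Every value is a placeholder to be replaced with the details from your identity provider and broker; in practice the certificate and key should be referenced from a Dapr secret store rather than inlined.

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: orderpubsub                  # assumed component name
spec:
  type: pubsub.kafka
  version: v1
  metadata:
  - name: brokers
    value: "<your-broker>:9093"                          # placeholder
  - name: authType
    value: "oidc_private_key_jwt"
  - name: oidcTokenEndpoint
    value: "https://<identity-provider>/oauth2/token"    # placeholder
  - name: oidcClientID
    value: "<client-id>"                                 # placeholder
  - name: oidcClientAssertionCert
    value: "<PEM-encoded client assertion certificate>"  # placeholder
  - name: oidcClientAssertionKey
    value: "<PEM-encoded client assertion private key>"  # placeholder
  - name: oidcScopes
    value: "openid,<additional-scope>"                   # placeholder; validated by the broker
  - name: caCert
    value: "<PEM-encoded CA certificate>"                # optional
  - name: skipVerify
    value: "false"                                       # optional; skips identity provider verification when true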

Additional docs can be found in the docs PR here.

  2. Comment out the kafka-local-test.yaml file and run the same test as above to confirm that the OAuth2 Private Key JWT settings are working correctly.
dapr run -f .
