Kafka Streams e-commerce order enricher based on JSON payloads.
Given an order topic, enrich the order data with customer and product data via Kafka Streams.
The following topics are needed:

| topic name | description |
|---|---|
| products | The products |
| customers | Customer info |
| orders | Basic order info |
| orders-with-products | Intermediary topic with orders enriched with product data |
| orders-enriched | The final orders, enriched with both customer and product data |
The following diagram illustrates the topology
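Conceptually, the topology performs two joins: orders are first joined with products (producing `orders-with-products`), and that result is then joined with customers (producing `orders-enriched`). The join logic can be sketched in plain Java, using in-memory maps as stand-ins for the products and customers lookup tables — class and method names here are hypothetical, not the actual Kafka Streams wiring:

```java
import java.util.*;

// Sketch only: HashMaps stand in for the products/customers KTables.
public class EnrichmentSketch {

    // First join: resolve each order item's product id against the product lookup.
    // Second join: attach the customer record for the order's customerId.
    static Map<String, Object> enrich(Map<String, Object> order,
                                      Map<String, Map<String, Object>> products,
                                      Map<String, Map<String, Object>> customers) {
        List<Map<String, Object>> joinedProducts = new ArrayList<>();
        @SuppressWarnings("unchecked")
        List<Map<String, Object>> items = (List<Map<String, Object>>) order.get("items");
        for (Map<String, Object> item : items) {
            Map<String, Object> product = products.get(item.get("id"));
            Map<String, Object> joined = new LinkedHashMap<>();
            joined.put("skuCode", product.get("skuCode"));
            joined.put("description", product.get("description"));
            joined.put("quantity", item.get("quantity"));
            joinedProducts.add(joined);
        }
        Map<String, Object> enriched = new LinkedHashMap<>();
        enriched.put("orderId", order.get("id"));
        enriched.put("customer", customers.get(order.get("customerId")));
        enriched.put("products", joinedProducts);
        return enriched;
    }
}
```

In the real topology the two lookups would be KTables and the order stream a KStream, but the per-record join semantics match this sketch.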
The following section describes the different schemas present in the topology
Product information
```json
{
  "id": "8f226d2a-d5d2-411a-b3ed-a85407f0c4ef",
  "skuCode": 6,
  "description": "Spice - Montreal Steak Spice"
}
```
Customer data and email
```json
{
  "id": "0322cc54-be29-439e-b929-25bc1f04c240",
  "first_name": "Willy",
  "last_name": "Pariso",
  "email": "wparisoy@list-manage.com"
}
```
Basic order information, with IDs referring to the different microservices (customer, product)
```json
{
  "id": "59a93295-188d-4345-9b3c-84126983dbc8",
  "customerId": "0322cc54-be29-439e-b929-25bc1f04c240",
  "items": [{
    "id": "8f226d2a-d5d2-411a-b3ed-a85407f0c4ef",
    "quantity": 5
  }]
}
```
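The payloads above map naturally onto small value types. A minimal sketch using Java records — the type names are hypothetical and a JSON (de)serializer such as Jackson is assumed but not shown; field names mirror the JSON keys:

```java
import java.util.List;

// Hypothetical value types mirroring the JSON payloads above.
record Product(String id, int skuCode, String description) {}
record Customer(String id, String first_name, String last_name, String email) {}
record OrderItem(String id, int quantity) {}
record Order(String id, String customerId, List<OrderItem> items) {}
```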
Joined order with products
```json
{
  "orderId" : "59a93295-188d-4345-9b3c-84126983dbc8",
  "customerId" : "0322cc54-be29-439e-b929-25bc1f04c240",
  "products" : [ {
    "skuCode" : 6,
    "description" : "Spice - Montreal Steak Spice",
    "quantity" : 5
  } ]
}
```
Joined order with customers and products
```json
{
  "orderId" : "59a93295-188d-4345-9b3c-84126983dbc8",
  "customer" : {
    "first_name": "Willy",
    "last_name": "Pariso",
    "email": "wparisoy@list-manage.com"
  },
  "products" : [ {
    "skuCode" : 6,
    "description" : "Spice - Montreal Steak Spice",
    "quantity" : 5
  } ]
}
```
To test and build the application and its Docker image, run:

```shell
./gradlew clean test
./gradlew clean build
docker build -t order-processor:1.0.0 .
```
The following steps are required to run the application with the demo dataset.
To run a Kafka cluster, use any Docker image you like (or the Kafka binaries).
Connect to the Kafka cluster and then execute the following commands:
```shell
kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 3 --topic customers
kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 3 --topic products
kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 3 --topic orders
kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 3 --topic orders-enriched
kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 3 --topic orders-with-products
```

Note: on Kafka 2.2 and later, `kafka-topics` takes `--bootstrap-server localhost:9092` instead of `--zookeeper localhost:2181`.
```shell
docker run --name=orders-processor order-processor:1.0.0
```
In the data folder there is a dataset available for an end-to-end run of the program.
Use any tool you like to move the dataset into its respective topics.