A minimal, end-to-end example where a Spring Boot REST API accepts a PNR payload (name, phone, receivedFrom, ticketing, itinerary), publishes it to Kafka, and a downstream consumer reads it for further processing.
- REST endpoint: `POST /pnrs`
- Kafka producer: publishes JSON messages to `demo-topic`
- Kafka consumer: logs consumed PNRs (this stands in for your check-in/enrichment service)
- Docker Compose: single-broker Kafka (KRaft) for local dev
- Postman collection (optional): ready to import
- Java 17+ (`java -version`)
- Maven (`mvn -version`), or use the Maven wrapper
- Docker + Docker Compose
Use this docker-compose.yml (works on Windows/macOS/Linux):
version: '3.8'
services:
  kafka:
    image: bitnami/kafka:3.7
    container_name: kafka
    ports:
      - "9092:9092"
    environment:
      - KAFKA_ENABLE_KRAFT=yes
      - KAFKA_CFG_NODE_ID=1
      - KAFKA_CFG_PROCESS_ROLES=broker,controller
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@kafka:9093
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,CONTROLLER://:9093
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://localhost:9092
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=PLAINTEXT
      - KAFKA_CFG_OFFSETS_TOPIC_REPLICATION_FACTOR=1
      - KAFKA_CFG_TRANSACTION_STATE_LOG_REPLICATION_FACTOR=1
      - KAFKA_CFG_TRANSACTION_STATE_LOG_MIN_ISR=1
      - KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE=true # convenient for dev
    restart: unless-stopped

Bring it up:
docker compose down -v
docker compose up -d
docker logs -f kafka

Wait until you see the broker start without errors.
Optional: create the topic inside the container
docker exec -it kafka bash
/opt/bitnami/kafka/bin/kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic demo-topic --partitions 1 --replication-factor 1
/opt/bitnami/kafka/bin/kafka-topics.sh --bootstrap-server localhost:9092 --list
exit
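Alternatively, the application can declare the topic itself so it is created on startup. This is a sketch, not part of the demo as described (the class name `TopicConfig` is made up); `TopicBuilder` is from spring-kafka and the resulting `NewTopic` bean is picked up by Spring Boot's auto-configured `KafkaAdmin`:

```java
package com.example.pnrkafkademo.config;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class TopicConfig {

    // Creates demo-topic at startup if it does not already exist.
    @Bean
    public NewTopic demoTopic() {
        return TopicBuilder.name("demo-topic")
                .partitions(1)
                .replicas(1)
                .build();
    }
}
```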
If 8080 is free:
mvn spring-boot:run

If 8080 is busy, run on 8081:

mvn spring-boot:run -Dspring-boot.run.arguments="--server.port=8081"

You should see:
Tomcat started on port 8081 (http)
Use curl (adjust port if you stayed on 8080):
curl -X POST http://localhost:8081/pnrs -H "Content-Type: application/json" -d '{
"name": {"first":"Ada","last":"Lovelace","title":"Ms"},
"phone":"+353700000",
"receivedFrom":"WEB-PORTAL",
"ticketing":{"ticketNumber":"125-1234567890","issueDate":"2025-09-10","validatingCarrier":"BA","formOfPayment":"CC"},
"itinerary":[
{"from":"DUB","to":"LHR","flightNo":"BA831","departureUtc":"2025-10-01T07:50:00Z","arrivalUtc":"2025-10-01T09:10:00Z","cabin":"Y"},
{"from":"LHR","to":"DUB","flightNo":"BA834","departureUtc":"2025-10-05T18:30:00Z","arrivalUtc":"2025-10-05T20:00:00Z","cabin":"Y"}
]
}'

Expected:
- HTTP 201 with `{"messageKey":"<uuid>"}`
- App logs show the consumer printing something like:
Consumed event key=<uuid>, name=Ada Lovelace, receivedFrom=WEB-PORTAL, legs=2, firstLeg=DUB-LHR BA831
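A consumer that prints a line like that could be as small as the sketch below. This is only a sketch: the `groupId` and the field accessors mirror the model sketched later in this README and may differ from the demo's actual PnrConsumer.java.

```java
package com.example.pnrkafkademo.kafka;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

import com.example.pnrkafkademo.model.PnrRequest;

@Component
public class PnrConsumer {

    private static final Logger log = LoggerFactory.getLogger(PnrConsumer.class);

    // Reads JSON messages from demo-topic and logs a one-line summary of each PNR.
    @KafkaListener(topics = "demo-topic", groupId = "pnr-demo")
    public void onPnr(ConsumerRecord<String, PnrRequest> record) {
        PnrRequest pnr = record.value();
        var firstLeg = pnr.itinerary().get(0);
        log.info("Consumed event key={}, name={} {}, receivedFrom={}, legs={}, firstLeg={}-{} {}",
                record.key(),
                pnr.name().first(), pnr.name().last(),
                pnr.receivedFrom(),
                pnr.itinerary().size(),
                firstLeg.from(), firstLeg.to(), firstLeg.flightNo());
    }
}
```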
POST /pnrs
Request body
{
"name": { "first": "Ada", "last": "Lovelace", "title": "Ms" },
"phone": "+353700000",
"receivedFrom": "WEB-PORTAL",
"ticketing": {
"ticketNumber": "125-1234567890",
"issueDate": "2025-09-10",
"validatingCarrier": "BA",
"formOfPayment": "CC"
},
"itinerary": [
{
"from": "DUB",
"to": "LHR",
"flightNo": "BA831",
"departureUtc": "2025-10-01T07:50:00Z",
"arrivalUtc": "2025-10-01T09:10:00Z",
"cabin": "Y"
}
]
}

Response

{ "messageKey": "<uuid>" }

- Topic: `demo-topic` (JSON messages)
- Key: random UUID in this demo. In real systems, key by the PNR locator so all updates for a PNR land in the same partition.
- Producer: `KafkaTemplate<String, Object>` with JSON serializer (see the sketch below)
- Consumer: Spring `@KafkaListener`, JSON deserializer
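A producer along those lines might look like the following sketch. Class and method names other than `KafkaTemplate` are assumptions and not necessarily the demo's actual PnrProducer.java:

```java
package com.example.pnrkafkademo.kafka;

import java.util.UUID;

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

import com.example.pnrkafkademo.model.PnrRequest;

@Component
public class PnrProducer {

    private static final String TOPIC = "demo-topic";

    private final KafkaTemplate<String, Object> kafkaTemplate;

    public PnrProducer(KafkaTemplate<String, Object> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Serializes the PNR to JSON and publishes it keyed by a random UUID.
    // In a real system the key would be the PNR locator instead.
    public String send(PnrRequest pnr) {
        String key = UUID.randomUUID().toString();
        kafkaTemplate.send(TOPIC, key, pnr);
        return key;
    }
}
```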
Note: The consumer is configured to deserialize directly to `PnrRequest`. If you disabled type headers in the producer, make sure your consumer sets:

spring:
  kafka:
    consumer:
      properties:
        spring.json.trusted.packages: com.example.pnrkafkademo.model
        spring.json.value.default.type: com.example.pnrkafkademo.model.PnrRequest
        spring.json.use.type.headers: false
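The same settings can also be applied programmatically using the `JsonDeserializer` property constants. This is a sketch only: the class name `ConsumerJsonConfig` is made up for illustration, and the demo's own KafkaConfig.java may do this differently.

```java
package com.example.pnrkafkademo.config;

import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;

import com.example.pnrkafkademo.model.PnrRequest;

@Configuration
public class ConsumerJsonConfig {

    // Equivalent of the YAML above: trust the model package, default to PnrRequest,
    // and ignore type-info headers on incoming records.
    @Bean
    public ConsumerFactory<String, PnrRequest> pnrConsumerFactory() {
        Map<String, Object> props = Map.of(
                ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
                ConsumerConfig.GROUP_ID_CONFIG, "pnr-demo",
                ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class,
                ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class,
                JsonDeserializer.TRUSTED_PACKAGES, "com.example.pnrkafkademo.model",
                JsonDeserializer.VALUE_DEFAULT_TYPE, "com.example.pnrkafkademo.model.PnrRequest",
                JsonDeserializer.USE_TYPE_INFO_HEADERS, false);
        return new DefaultKafkaConsumerFactory<>(props);
    }
}
```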
pnr-kafka-demo/
├─ docker-compose.yml
├─ pom.xml
├─ README.md
└─ src/main/java/com/example/pnrkafkademo/
├─ PnrKafkaDemoApplication.java
├─ config/
│ └─ KafkaConfig.java
├─ kafka/
│ ├─ PnrProducer.java
│ └─ PnrConsumer.java
├─ model/
│ └─ PnrRequest.java
└─ web/
└─ PnrController.java
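For reference, PnrRequest.java and PnrController.java could look roughly like the sketches below. The record and field names are inferred from the request/response JSON above and are assumptions rather than the demo's exact source.

```java
// model/PnrRequest.java (sketch)
package com.example.pnrkafkademo.model;

import java.time.Instant;
import java.time.LocalDate;
import java.util.List;

public record PnrRequest(Name name,
                         String phone,
                         String receivedFrom,
                         Ticketing ticketing,
                         List<Leg> itinerary) {

    public record Name(String first, String last, String title) {}

    public record Ticketing(String ticketNumber,
                            LocalDate issueDate,
                            String validatingCarrier,
                            String formOfPayment) {}

    public record Leg(String from,
                      String to,
                      String flightNo,
                      Instant departureUtc,
                      Instant arrivalUtc,
                      String cabin) {}
}
```

```java
// web/PnrController.java (sketch)
package com.example.pnrkafkademo.web;

import java.util.Map;

import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

import com.example.pnrkafkademo.kafka.PnrProducer;
import com.example.pnrkafkademo.model.PnrRequest;

@RestController
public class PnrController {

    private final PnrProducer producer;

    public PnrController(PnrProducer producer) {
        this.producer = producer;
    }

    // Accepts the PNR payload, hands it to the producer, and returns the Kafka message key with HTTP 201.
    @PostMapping("/pnrs")
    public ResponseEntity<Map<String, String>> createPnr(@RequestBody PnrRequest request) {
        String key = producer.send(request);
        return ResponseEntity.status(HttpStatus.CREATED).body(Map.of("messageKey", key));
    }
}
```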
Import a ready collection and environment or create the request manually.
- Collection: `PNR_Kafka_Demo.postman_collection.json`
- Environment: `PNR_Local.postman_environment.json`
- Variables: `baseUrl=http://localhost`, `port=8081` (change to 8080 if needed)
Steps:
- Postman → Import → select both JSON files
- Choose environment PNR Local
- Open collection PNR Kafka Demo → Create PNR → Send
src/main/resources/application.yml
server:
  port: 8081   # change to 8080 if needed

spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring.json.trusted.packages: com.example.pnrkafkademo.model
        spring.json.value.default.type: com.example.pnrkafkademo.model.PnrRequest
        spring.json.use.type.headers: false
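The application.yml above only configures the consumer; the producer's serializers are presumably set elsewhere, for example in KafkaConfig.java. A sketch of what that could look like (bean names and the type-header setting are assumptions, kept consistent with the note earlier):

```java
package com.example.pnrkafkademo.config;

import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

@Configuration
public class KafkaConfig {

    // String keys, JSON values; type-info headers disabled to match the consumer settings above.
    @Bean
    public ProducerFactory<String, Object> producerFactory() {
        Map<String, Object> props = Map.of(
                ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
                ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class,
                ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class,
                JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate(ProducerFactory<String, Object> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}
```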