PNR Kafka Demo (Spring Boot)

A minimal, end-to-end example where a Spring Boot REST API accepts a PNR payload (name, phone, receivedFrom, ticketing, itinerary), publishes it to Kafka, and a downstream consumer reads it for further processing.

What you get

  • REST endpoint: POST /pnrs
  • Kafka producer: publishes JSON messages to demo-topic
  • Kafka consumer: logs consumed PNRs (this stands in for your check-in/enrichment service)
  • Docker Compose: single-broker Kafka (KRaft) for local dev
  • Postman collection (optional): ready to import

Prerequisites

  • Java 17+ (java -version)
  • Maven (mvn -version) — or use the Maven wrapper
  • Docker + Docker Compose

Quick start

1) Start Kafka

Use this docker-compose.yml (works on Windows/macOS/Linux):

version: '3.8'
services:
  kafka:
    image: bitnami/kafka:3.7
    container_name: kafka
    ports:
      - "9092:9092"
    environment:
      - KAFKA_ENABLE_KRAFT=yes
      - KAFKA_CFG_NODE_ID=1
      - KAFKA_CFG_PROCESS_ROLES=broker,controller
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@kafka:9093
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,CONTROLLER://:9093
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://localhost:9092
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=PLAINTEXT
      - KAFKA_CFG_OFFSETS_TOPIC_REPLICATION_FACTOR=1
      - KAFKA_CFG_TRANSACTION_STATE_LOG_REPLICATION_FACTOR=1
      - KAFKA_CFG_TRANSACTION_STATE_LOG_MIN_ISR=1
      - KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE=true  # convenient for dev
    restart: unless-stopped

Bring it up:

docker compose down -v
docker compose up -d
docker logs -f kafka

Wait until you see the broker start without errors.

Optional: create the topic inside the container

docker exec -it kafka bash
/opt/bitnami/kafka/bin/kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic demo-topic --partitions 1 --replication-factor 1
/opt/bitnami/kafka/bin/kafka-topics.sh --bootstrap-server localhost:9092 --list
exit

2) Run the Spring Boot app

If 8080 is free:

mvn spring-boot:run

If 8080 is busy, run on 8081:

mvn spring-boot:run -Dspring-boot.run.arguments="--server.port=8081"

You should see:

Tomcat started on port 8081 (http)

3) Send a test request

Use curl (adjust port if you stayed on 8080):

curl -X POST http://localhost:8081/pnrs   -H "Content-Type: application/json"   -d '{
    "name": {"first":"Ada","last":"Lovelace","title":"Ms"},
    "phone":"+353700000",
    "receivedFrom":"WEB-PORTAL",
    "ticketing":{"ticketNumber":"125-1234567890","issueDate":"2025-09-10","validatingCarrier":"BA","formOfPayment":"CC"},
    "itinerary":[
      {"from":"DUB","to":"LHR","flightNo":"BA831","departureUtc":"2025-10-01T07:50:00Z","arrivalUtc":"2025-10-01T09:10:00Z","cabin":"Y"},
      {"from":"LHR","to":"DUB","flightNo":"BA834","departureUtc":"2025-10-05T18:30:00Z","arrivalUtc":"2025-10-05T20:00:00Z","cabin":"Y"}
    ]
  }'

Expected:

  • HTTP 201 with {"messageKey":"<uuid>"}
  • App logs show the consumer printing something like:
    Consumed event key=<uuid>, name=Ada Lovelace, receivedFrom=WEB-PORTAL, legs=2, firstLeg=DUB-LHR BA831
    

API

Create PNR

POST /pnrs

Request body

{
  "name": { "first": "Ada", "last": "Lovelace", "title": "Ms" },
  "phone": "+353700000",
  "receivedFrom": "WEB-PORTAL",
  "ticketing": {
    "ticketNumber": "125-1234567890",
    "issueDate": "2025-09-10",
    "validatingCarrier": "BA",
    "formOfPayment": "CC"
  },
  "itinerary": [
    {
      "from": "DUB",
      "to": "LHR",
      "flightNo": "BA831",
      "departureUtc": "2025-10-01T07:50:00Z",
      "arrivalUtc": "2025-10-01T09:10:00Z",
      "cabin": "Y"
    }
  ]
}

Response

{ "messageKey": "<uuid>" }

Kafka

  • Topic: demo-topic (JSON messages)
  • Key: random UUID in this demo. In real systems, key by PNR locator so all updates for a PNR land in the same partition.
  • Producer: KafkaTemplate<String, Object> with JSON serializer (sketched below)
  • Consumer: Spring @KafkaListener with JSON deserializer (sketched after the configuration note)
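
A minimal sketch of the producer, assuming the PnrProducer class listed in the project structure; the actual implementation may differ in details such as callbacks or error handling.

package com.example.pnrkafkademo.kafka;

import java.util.UUID;

import com.example.pnrkafkademo.model.PnrRequest;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class PnrProducer {

    private static final String TOPIC = "demo-topic";

    private final KafkaTemplate<String, Object> kafkaTemplate;

    public PnrProducer(KafkaTemplate<String, Object> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Publishes the PNR as JSON, keyed by a random UUID, and returns the key to the caller.
    public String send(PnrRequest request) {
        String key = UUID.randomUUID().toString();
        kafkaTemplate.send(TOPIC, key, request);
        return key;
    }
}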

Note: The consumer is configured to deserialize directly to PnrRequest. If you disabled type headers in the producer, make sure your consumer sets:

spring:
  kafka:
    consumer:
      properties:
        spring.json.trusted.packages: com.example.pnrkafkademo.model
        spring.json.value.default.type: com.example.pnrkafkademo.model.PnrRequest
        spring.json.use.type.headers: false
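
And a hedged sketch of the listener that produces the log line shown in the quick start. The group id "pnr-demo" and the accessor names on PnrRequest are assumptions for illustration; the real PnrConsumer logs a few more fields.

package com.example.pnrkafkademo.kafka;

import com.example.pnrkafkademo.model.PnrRequest;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class PnrConsumer {

    private static final Logger log = LoggerFactory.getLogger(PnrConsumer.class);

    // The JSON payload is deserialized straight into PnrRequest (see the config above).
    @KafkaListener(topics = "demo-topic", groupId = "pnr-demo")
    public void onMessage(ConsumerRecord<String, PnrRequest> record) {
        PnrRequest pnr = record.value();
        // Accessor names below assume standard getters; adjust to the actual model.
        log.info("Consumed event key={}, receivedFrom={}, legs={}",
                record.key(), pnr.getReceivedFrom(), pnr.getItinerary().size());
    }
}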

Project structure

pnr-kafka-demo/
├─ docker-compose.yml
├─ pom.xml
├─ README.md
└─ src/main/java/com/example/pnrkafkademo/
   ├─ PnrKafkaDemoApplication.java
   ├─ config/
   │   └─ KafkaConfig.java
   ├─ kafka/
   │   ├─ PnrProducer.java
   │   └─ PnrConsumer.java
   ├─ model/
   │   └─ PnrRequest.java
   └─ web/
       └─ PnrController.java
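
KafkaConfig.java is the natural place to declare the topic. A sketch under that assumption (the demo could equally rely on the broker's auto-topic creation, which the Compose file enables):

package com.example.pnrkafkademo.config;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaConfig {

    // Creates demo-topic on startup; harmless if the broker has already auto-created it.
    @Bean
    public NewTopic demoTopic() {
        return TopicBuilder.name("demo-topic")
                .partitions(1)
                .replicas(1)
                .build();
    }
}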

Postman (optional)

Import a ready collection and environment or create the request manually.

  • Collection: PNR_Kafka_Demo.postman_collection.json
  • Environment: PNR_Local.postman_environment.json
    • Variables: baseUrl=http://localhost, port=8081 (change to 8080 if needed)

Steps:

  1. Postman → Import → select both JSON files
  2. Choose environment PNR Local
  3. Open the collection PNR Kafka Demo → Create PNR → Send

Configuration

src/main/resources/application.yml

server:
  port: 8081        # switch back to 8080 if that port is free

spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring.json.trusted.packages: com.example.pnrkafkademo.model
        spring.json.value.default.type: com.example.pnrkafkademo.model.PnrRequest
        spring.json.use.type.headers: false
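
The yml above covers only the consumer. The producer side also needs a String key serializer and a JSON value serializer, either via the matching spring.kafka.producer.* properties or in Java config. Below is a hedged sketch of the Java-config route; the class name ProducerJsonConfig is made up for illustration, and the project may handle this elsewhere (for example in KafkaConfig.java or application.yml).

package com.example.pnrkafkademo.config;

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

// Hypothetical producer configuration: String keys (the UUID message key),
// JSON-serialized values (the PnrRequest payload).
@Configuration
public class ProducerJsonConfig {

    @Bean
    public ProducerFactory<String, Object> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate(ProducerFactory<String, Object> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}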
