# Kafka Streams Demo

Stock data processing with Spring Cloud Stream and the Confluent Platform.

## Setup

- Check the Docker bridge IP (`docker network inspect bridge`) and, if needed, change the URLs in the `application.yml` config file and in the `KAFKA_ADVERTISED_LISTENERS` entry in `docker-compose.yml`.
- Download the stock data CSVs:
  ```shell
  cd cp-docker
  ./download-data.sh
  ```
- Start the Docker containers with Kafka, ZooKeeper, Kafka Connect, and the Schema Registry:
  ```shell
  docker-compose up -d
  docker-compose ps
  ```
- Once all containers are up, create a new Connect task that imports the CSVs into a Kafka topic. The Connect plugin registers the schema of the imported data in the Schema Registry:
  ```shell
  ./start-connector.sh
  curl localhost:8083/connector-plugins | jq
  ```
- The connector sends messages to Kafka whenever CSVs are placed in `cp-docker/connect/spool-input`:
  ```shell
  head -n 10 us-etfs.csv > connect/spool-input/etfs.csv
  ```
- When the application is started, it should receive the messages for the first few rows. The messages are sent in Avro format and deserialized to POJOs, which were generated with the Avro Maven plugin.
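As a rough sketch of the configuration the first step refers to, the Kafka Streams binder section of `application.yml` might look something like the fragment below. The host `172.17.0.1` (a common default Docker bridge gateway) and the ports are assumptions; substitute the values from your own environment:

```yaml
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            # Docker bridge IP from the first setup step (assumed value)
            brokers: 172.17.0.1:9092
            configuration:
              # Confluent Schema Registry, used to resolve the Avro schemas
              # registered by the Connect plugin (assumed host/port)
              schema.registry.url: http://172.17.0.1:8081
```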
