Kafka App for Scala


This module is the companion source code for the blog post Getting Started with Scala and Apache Kafka. It shows how to use the basic Kafka clients in a Scala application. Originally inspired by the first Scala example, it goes further by demonstrating multiple ways to produce, consume, and configure the clients.

  1. Try it
  2. Produce
  3. Consume
  4. Read More

Try it

git clone https://github.com/DivLoic/kafka-application4s.git
cd kafka-application4s
sbt compile

Local

You first need to run Kafka and the Schema Registry. Any recent installation of Apache Kafka or the Confluent Platform will do; several installation methods are listed on the CP Download Page.

e.g. the Confluent CLI on macOS:

curl -sL https://cnfl.io/cli | sh -s -- latest -b /usr/local/bin
export CONFLUENT_HOME=...
export PATH=$PATH:$CONFLUENT_HOME
confluent local services schema-registry start

Cloud

The module also works with a cluster hosted on Confluent Cloud. The cloud-related settings are commented out in consumer.conf and producer.conf. To use them, either edit these files or define the following environment variables:

export BOOTSTRAP_SERVERS="...:9092"
export CLUSTER_API_KEY="..."
export CLUSTER_API_SECRET="..."
export SCHEMA_REGISTRY_URL="https://..."
export SR_API_KEY="..."
export SR_API_SECRET="..."

For more on logging in to Confluent Cloud, see the documentation.
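For reference, such a commented cloud block typically combines local defaults with HOCON's optional environment substitution (`${?VAR}` only overrides a key when the variable is set). A sketch of what it can look like; the exact keys in the repository's files may differ:

```hocon
# Local defaults, overridden by the environment when the variables are set
bootstrap.servers = "localhost:9092"
bootstrap.servers = ${?BOOTSTRAP_SERVERS}

schema.registry.url = "http://localhost:8081"
schema.registry.url = ${?SCHEMA_REGISTRY_URL}

# Confluent Cloud authentication (credentials elided)
security.protocol = "SASL_SSL"
sasl.mechanism = "PLAIN"
basic.auth.credentials.source = "USER_INFO"
```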

Produce

Run:

sbt produce "-Djline.terminal=none" --error  

(asciinema demo recording)
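Under the hood, a producer is driven by a handful of configuration properties. A minimal sketch of assembling them in Scala, assuming String keys and Avro values; the helper name `producerProps` and its defaults are illustrative, not the repository's actual code:

```scala
import java.util.Properties

// Illustrative helper: assemble producer settings, falling back to a local
// broker and Schema Registry when the environment variables are not set.
def producerProps(
    bootstrap: String = sys.env.getOrElse("BOOTSTRAP_SERVERS", "localhost:9092"),
    schemaRegistry: String = sys.env.getOrElse("SCHEMA_REGISTRY_URL", "http://localhost:8081")
): Properties = {
  val props = new Properties()
  props.put("bootstrap.servers", bootstrap)
  props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer")
  props.put("schema.registry.url", schemaRegistry)
  props.put("acks", "all") // wait for all in-sync replicas before acknowledging
  props
}

// A KafkaProducer would then be created and used roughly as:
//   val producer = new KafkaProducer[String, GenericRecord](producerProps())
//   producer.send(new ProducerRecord("some-topic", "key", record))
```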

Consume

Run:

sbt consume "-Djline.terminal=none" --error  

(asciinema demo recording)
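The consumer side is configured the same way, with deserializers and a consumer group: the group id ties instances together so topic partitions are balanced across them. A minimal sketch, with an illustrative helper name and defaults:

```scala
import java.util.Properties

// Illustrative helper: assemble consumer settings for a given consumer group.
def consumerProps(
    groupId: String,
    bootstrap: String = sys.env.getOrElse("BOOTSTRAP_SERVERS", "localhost:9092")
): Properties = {
  val props = new Properties()
  props.put("bootstrap.servers", bootstrap)
  props.put("group.id", groupId)
  props.put("auto.offset.reset", "earliest") // read from the beginning on first start
  props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
  props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer")
  props
}

// A KafkaConsumer would subscribe and poll in a loop, roughly:
//   val consumer = new KafkaConsumer[String, GenericRecord](consumerProps("demo-group"))
//   consumer.subscribe(java.util.Collections.singletonList("some-topic"))
//   while (true) consumer.poll(java.time.Duration.ofMillis(500)).forEach(r => println(r.value()))
```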

Read more

  - Getting Started with Scala and Apache Kafka: https://medium.com/xebia-france/getting-started-with-scala-and-apache-kafka-62bb1ca6a77f
