cloudnautique/acorn-kafka

What is Kafka?

Kafka is a distributed event streaming platform that provides a highly scalable, fault-tolerant publish-subscribe system. It is designed to handle real-time data feeds and to process large-scale data streams.

Kafka is commonly used for building real-time data pipelines and streaming applications. It allows producers to publish messages to topics, and consumers to subscribe to those topics. It is often employed in scenarios such as log aggregation, data integration, and event-driven architectures.

Kafka as an Acorn Service

The Acornfile used to create a Kafka-based Acorn Service is available in the GitHub repository at https://github.com/acorn-io/kafka. This service creates a Kafka message broker running in a single container, which can easily be used by an application during development.

This Kafka instance defines a default topic which can be used by an application to produce and consume messages.

The Acorn image of this service is hosted in the GitHub Container Registry at ghcr.io/acorn-io/kafka.

Usage

The examples folder contains a sample application using this Service.

This app consists of two Python containers:

  • the first one produces a message every second; each message has a key containing a random country code and a value containing country details (a sketch of such a producer follows this list)
  • the second one consumes those messages
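
A minimal sketch of what the producer could look like, assuming the kafka-python client and a small hard-coded country dataset; the actual code lives in the examples/producer folder and may differ:

import json
import os
import random
import time

from kafka import KafkaProducer

# Connection details are injected by the Acornfile (see below)
host = os.environ["KAFKA_HOST"]
port = os.environ["KAFKA_PORT"]
topic = os.environ["KAFKA_TOPIC"]

# Illustrative dataset only; the real example uses a larger country list
countries = {
    "FR": {"name": "France", "capital": "Paris"},
    "JP": {"name": "Japan", "capital": "Tokyo"},
    "BR": {"name": "Brazil", "capital": "Brasilia"},
}

producer = KafkaProducer(
    bootstrap_servers=f"{host}:{port}",
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

while True:
    # Pick a random country code and publish its details every second
    code = random.choice(list(countries))
    producer.send(topic, key=code, value=countries[code])
    producer.flush()
    time.sleep(1)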

In the example Acornfile, we first define the kafka service:

services: kafka: {
 if args.dev {
  // In dev mode, build the service from the local Acornfile
  build: {
   context:   "../"
   acornfile: "../Acornfile"
  }
 } else {
  // Otherwise, use the published image from the GitHub Container Registry
  image: "ghcr.io/acorn-io/kafka:v#.#.#-#"
 }
}

Next we define the application containers. These connect to the Kafka service through environment variables whose values are set from the service's properties.

containers: {

 // Consumer: reads messages from the Kafka topic
 consumer: {
  build: {
   context: "consumer"
  }
  consumes: ["kafka"]
  env: {
   // Connection details exposed by the kafka service
   KAFKA_HOST: "@{service.kafka.address}"
   KAFKA_PORT: "@{service.kafka.port.9092}"
   KAFKA_TOPIC: "@{service.kafka.data.topicName}"
  }
  memory: 128Mi
 }

 // Producer: publishes a message to the topic every second
 producer: {
  build: {
   context: "producer"
  }
  consumes: ["kafka"]
  env: {
   KAFKA_HOST: "@{service.kafka.address}"
   KAFKA_PORT: "@{service.kafka.port.9092}"
   KAFKA_TOPIC: "@{service.kafka.data.topicName}"
  }
  memory: 128Mi
 }
}

These containers are built from the Dockerfiles in the examples/producer and examples/consumer folders respectively. Once built, they can communicate with the Kafka service using the address, port, and topic name provided through the dedicated environment variables.
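
On the consumer side, a sketch could look like the following, again assuming the kafka-python client; the real code in examples/consumer may differ:

import json
import os

from kafka import KafkaConsumer

# Same environment variables as the producer, injected by the Acornfile
host = os.environ["KAFKA_HOST"]
port = os.environ["KAFKA_PORT"]
topic = os.environ["KAFKA_TOPIC"]

consumer = KafkaConsumer(
    topic,
    bootstrap_servers=f"{host}:{port}",
    auto_offset_reset="earliest",
    key_deserializer=lambda k: k.decode("utf-8"),
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

# Print each country message as it arrives
for message in consumer:
    print(f"{message.key}: {message.value}")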

This example can be run with the following command (from the examples folder):

acorn run -n app

When run in the Acorn Sandbox, we can see that the Kafka service, the producer, and the consumer are created. We can then get the producer and consumer logs from the UI:

Producer logs

Consumer logs

Note: access to the sandbox requires only a GitHub account, which is used for authentication.


An application running in the Sandbox is automatically shut down after 2 hours. You can use the Acorn Pro plan to remove the time limit and gain additional functionality.

Disclaimer

Disclaimer: You agree all software products on this site, including Acorns or their contents, may contain projects and materials subject to intellectual property restrictions and/or Open-Source license (“Restricted Items”). Restricted Items found anywhere within this Acorn or on Acorn.io are provided “as-is” without warranty of any kind and are subject to their own Open-Source licenses and your compliance with such licenses are solely and exclusively your responsibility. Kafka is licensed under the Apache2 license which can be found here and Acorn.io does not endorse and is not affiliated with Apache Foundation. By using Acorn.io you agree to our general disclaimer here: https://www.acorn.io/terms-of-use.
