redpanda-docker-stats

Pulls local Docker stats as a Redpanda Kafka stream into Deephaven

Redpanda is an open-source Kafka-compatible event streaming platform. This sample app shows how to ingest Docker stats data from Redpanda into Deephaven.

How it works

Deephaven

This app runs using Deephaven with Docker. See our Quickstart.

Components

  • docker-compose.yml - The Docker Compose file for the application. This is the same as the Deephaven docker-compose file with Redpanda described in our Simple Kafka import.
  • kafka-produce.py - The Python script that pulls data from docker stats and produces it as a Kafka stream on Redpanda.
  • data/app.d/start.app - The Deephaven application mode app file.
  • data/app.d/tables.py - The Python script that consumes the Kafka stream and stores the data in Deephaven tables.

High-level overview

This app pulls data from the local Docker containers. The data is placed into a Redpanda Kafka stream.

Once data is collected in Kafka, Deephaven consumes the stream.
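
As a rough sketch of the producer side, one way to collect this data is to shell out to docker stats and publish each container's JSON record with confluent_kafka. This is only an illustration and may differ from kafka-produce.py; the topic name docker-stats comes from this README, while the broker address localhost:9092 is an assumption based on the compose setup.

import json
import subprocess

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed broker address

# `docker stats --no-stream --format "{{json .}}"` prints one JSON object per
# running container, with fields such as Name, CPUPerc, and MemUsage.
stats_output = subprocess.run(
    ["docker", "stats", "--no-stream", "--format", "{{json .}}"],
    capture_output=True, text=True, check=True,
).stdout

for line in stats_output.splitlines():
    stats = json.loads(line)
    # Publish each record to the docker-stats topic, keyed by container name.
    producer.produce("docker-stats", key=stats["Name"], value=json.dumps(stats))

producer.flush()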

Launch Redpanda and Deephaven

To launch the latest release, clone the repository and run:

git clone https://github.com/deephaven-examples/redpanda-docker-stats.git
cd redpanda-docker-stats
docker-compose up -d

Alternatively, you can download the release docker-compose.yml file:

mkdir redpanda-docker-stats
cd redpanda-docker-stats
curl https://raw.githubusercontent.com/deephaven-examples/redpanda-docker-stats/main/release/docker-compose.yml -o docker-compose.yml
docker-compose up -d

This starts the containers needed for Redpanda and Deephaven.

To start listening to the Kafka topic docker-stats, navigate to http://localhost:10000/ide.

In the Panels menu, you will see a table named docker-stats and a figure named memoryUsage.
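
Those panels are built by data/app.d/tables.py running under application mode. The following is a minimal sketch of consuming the topic into a Deephaven table, written against a current Deephaven Python API rather than the v0.10.0 API this repository targets; the broker address and JSON column names are illustrative assumptions.

from deephaven import dtypes as dht
from deephaven import kafka_consumer as kc
from deephaven.stream.kafka.consumer import KeyValueSpec, TableType

# Consume the docker-stats topic into an append-only table. Inside the compose
# network the broker is reached by its service name (assumed to be redpanda:9092).
docker_stats = kc.consume(
    {"bootstrap.servers": "redpanda:9092"},
    "docker-stats",
    key_spec=KeyValueSpec.IGNORE,
    value_spec=kc.json_spec(
        [("Name", dht.string), ("CPUPerc", dht.string), ("MemUsage", dht.string)]
    ),
    table_type=TableType.append(),
)

# The memoryUsage figure shown in the IDE is plotted from a table like this one.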

Launch Python script

The Python script uses the confluent_kafka package, which must be installed on your machine. To install it, run:

pip install confluent_kafka

To produce the Kafka stream, execute the kafka-produce.py script in your terminal:

python3 ./kafka-produce.py
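
Optionally, to confirm that records are arriving on the topic, you can read a few messages back with a confluent_kafka Consumer. This check is not part of the repository, and the localhost:9092 broker address is an assumption based on the compose setup.

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "docker-stats-check",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["docker-stats"])

# Print up to five records, waiting at most five seconds for each poll.
for _ in range(5):
    msg = consumer.poll(5.0)
    if msg is None or msg.error():
        continue
    print(msg.value().decode("utf-8"))

consumer.close()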

Note

The code in this repository is built for Deephaven Community Core v0.10.0. No guarantee of forward or backward compatibility is given.
