
Kaskade

Kaskade is a text user interface (TUI) for Apache Kafka, built with Textual. It includes features like:

  • Admin:
    • List topics, partitions, groups and group members
    • Topic information such as lag, replicas and record count
    • Create, edit and delete topics
    • Filter topics by name
  • Consumer:
    • JSON, string, integer, long, float, boolean and double deserialization
    • Filter by key, value, header and/or partition
    • Schema Registry support with Avro

Screenshots

(screenshots of kaskade)

Installation

Install with pipx:

pipx install kaskade

pipx will install the kaskade command and its short alias kskd.
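
Since kskd is only a short alias, the two commands should behave identically, for example:

kaskade admin -b localhost:9092
kskd admin -b localhost:9092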

Upgrade with pipx:

pipx upgrade kaskade

See how to install pipx for your OS at: pipx Installation.
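
For reference, pipx itself is typically installed with pip (check the linked page for OS-specific steps):

python3 -m pip install --user pipx
python3 -m pipx ensurepath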

Running kaskade

Help:

kaskade --help
kaskade admin --help
kaskade consumer --help

Admin view:

kaskade admin -b localhost:9092

Consumer view:

kaskade consumer -b localhost:9092 -t my-topic

Running with Docker:

docker run --rm -it --network my-network sauljabin/kaskade:latest admin -b my-kafka:9092
docker run --rm -it --network my-network sauljabin/kaskade:latest consumer -b my-kafka:9092 -t my-topic
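
If the my-network Docker network does not exist yet, create it first with the standard Docker command:

docker network create my-network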

Configuration examples

Multiple bootstrap servers:

kaskade admin -b broker1:9092,broker2:9092

Consume and deserialize:

kaskade consumer -b localhost:9092 -t my-topic -k json -v json
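
Any of the deserializers listed above can be mixed; for example, string keys with long values (assuming the -k and -v names match that list):

kaskade consumer -b localhost:9092 -t my-topic -k string -v long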

Consuming from the beginning:

kaskade consumer -b localhost:9092 -t my-topic -x auto.offset.reset=earliest
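
The -x flag forwards arbitrary properties to the underlying librdkafka client, so other settings can be passed the same way; a sketch using the standard group.id property:

kaskade consumer -b localhost:9092 -t my-topic -x group.id=my-group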

Simple Schema Registry connection and Avro deserialization:

kaskade consumer -b localhost:9092 -s url=http://localhost:8081 -t my-topic -k avro -v avro

More Schema Registry configurations at: SchemaRegistryClient.
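
For example, a Schema Registry protected with basic authentication could be reached by also passing basic.auth.user.info (the myuser:mypassword value here is a placeholder; the same property appears in the Confluent Cloud example below):

kaskade consumer -b localhost:9092 -s url=http://localhost:8081 -s basic.auth.user.info=myuser:mypassword -t my-topic -k avro -v avro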

librdkafka clients do not currently support Avro unions in (de)serialization; more at: Limitations for librdkafka clients.

SSL encryption example:

kaskade admin -b ${BOOTSTRAP_SERVERS} -x security.protocol=SSL

For more information about SSL encryption and SSL authentication, go to the official librdkafka page: Configure librdkafka client.
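
A sketch of SSL authentication (mutual TLS) on top of encryption, assuming client certificate and key files in PEM format at hypothetical paths; the property names are standard librdkafka settings:

kaskade admin -b ${BOOTSTRAP_SERVERS} \
        -x security.protocol=SSL \
        -x ssl.ca.location=/path/to/ca.pem \
        -x ssl.certificate.location=/path/to/client.pem \
        -x ssl.key.location=/path/to/client.key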

Confluent Cloud admin:

kaskade admin -b ${BOOTSTRAP_SERVERS} \
        -x security.protocol=SASL_SSL \
        -x sasl.mechanism=PLAIN \
        -x sasl.username=${CLUSTER_API_KEY} \
        -x sasl.password=${CLUSTER_API_SECRET}

Confluent Cloud consumer:

kaskade consumer -b ${BOOTSTRAP_SERVERS} \
        -x security.protocol=SASL_SSL \
        -x sasl.mechanism=PLAIN \
        -x sasl.username=${CLUSTER_API_KEY} \
        -x sasl.password=${CLUSTER_API_SECRET} \
        -s url=${SCHEMA_REGISTRY_URL} \
        -s basic.auth.user.info=${SR_API_KEY}:${SR_API_SECRET} \
        -t my-topic \
        -k string \
        -v avro

More about Confluent Cloud configuration at: Kafka Client Quick Start for Confluent Cloud.

Development

For development instructions see DEVELOPMENT.md.