# Kafka Asset Bundle


A bundle of Kafka operations and APIs for Teraslice.

## Releases

You can find a list of releases, changes, and pre-built asset bundles here.

## Getting Started

This asset bundle requires a running Teraslice cluster; you can find the documentation here.

```bash
# Step 1: make sure you have teraslice-cli installed
yarn global add teraslice-cli

# Step 2:
# FIXME: this should be accurate
teraslice-cli asset deploy ...
```

**IMPORTANT:** Additionally, make sure you have installed the required connectors.

## Connectors

### Kafka Connector

Terafoundation connector for Kafka producer and consumer clients.

To install from the root of your terafoundation-based service:

```bash
npm install terafoundation_kafka_connector
```

**Configuration:**

The terafoundation-level configuration is as follows:

| Configuration | Description | Type | Notes |
| ------------- | ----------- | ---- | ----- |
| `brokers` | List of seed brokers for the kafka environment | `String[]` | optional, defaults to `["localhost:9092"]` |
| `security_protocol` | Protocol used to communicate with brokers, may be set to `plaintext` or `ssl` | `String` | optional, defaults to `plaintext` |
| `ssl_ca_location` | File or directory path to CA certificate(s) for verifying the broker's key | `String` | only used when `security_protocol` is set to `ssl` |
| `ssl_certificate_location` | Path to client's public key (PEM) used for authentication | `String` | only used when `security_protocol` is set to `ssl` |
| `ssl_crl_location` | Path to CRL for verifying broker's certificate validity | `String` | only used when `security_protocol` is set to `ssl` |
| `ssl_key_location` | Path to client's private key (PEM) used for authentication | `String` | only used when `security_protocol` is set to `ssl` |
| `ssl_key_password` | Private key passphrase | `String` | only used when `security_protocol` is set to `ssl` |
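
To illustrate the SSL-related settings above, a connector entry might look like the following minimal sketch. The broker addresses, certificate paths, and passphrase are placeholders for your own environment:

```yaml
terafoundation:
    connectors:
        kafka:
            default:
                brokers: ["broker-1:9092", "broker-2:9092"]
                security_protocol: ssl
                # Placeholder paths; point these at your actual certificates
                ssl_ca_location: /etc/ssl/certs/kafka-ca.pem
                ssl_certificate_location: /etc/ssl/certs/kafka-client.pem
                ssl_key_location: /etc/ssl/private/kafka-client.key
                ssl_key_password: example-passphrase
```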

When used in code, this connector exposes two different client implementations: one for producers (`type: producer`) and one for consumers (`type: consumer`).

| Configuration | Description | Type | Notes |
| ------------- | ----------- | ---- | ----- |
| `options` | Consumer or Producer specific options | `Object` | required, see below |
| `topic_options` | librdkafka defined settings that apply per topic | `Object` | optional, defaults to `{}` |
| `rdkafka_options` | librdkafka defined settings that are not subscription specific | `Object` | optional, defaults to `{}` |

The `options` object enables setting a few properties:

| Configuration | Description | Type | Notes |
| ------------- | ----------- | ---- | ----- |
| `type` | What type of connector is required, either `consumer` or `producer` | `String` | required, defaults to `consumer` |
| `group` | For type `consumer`, what consumer group to use | `String` | optional |
| `poll_interval` | For type `producer`, how often (in milliseconds) the producer connection is polled to keep it alive | `Number` | optional, defaults to `100` |

Consumer connector configuration example:

```js
{
    options: {
        type: 'consumer',
        group: 'example-group'
    },
    topic_options: {
        'enable.auto.commit': false
    },
    rdkafka_options: {
        'fetch.min.bytes': 100000
    }
}
```

Producer connector configuration example:

```js
{
    options: {
        type: 'producer',
        poll_interval: 1000,
    },
    topic_options: {},
    rdkafka_options: {
        'compression.codec': 'gzip',
        'topic.metadata.refresh.interval.ms': -1,
        'log.connection.close': false,
    }
}
```

Terafoundation configuration example:

```yaml
terafoundation:
    connectors:
        kafka:
            default:
                brokers: "localhost:9092"
```
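
For context, inside a terafoundation-based service (such as a Teraslice processor) a client is usually obtained through terafoundation's `getConnection` API, passing the connector options shown above. The sketch below is illustrative only; it assumes the legacy `context.foundation.getConnection` shape, so verify the exact signature against the Teraslice documentation.

```js
// Illustrative only: how a terafoundation-based service might obtain a Kafka
// consumer client. Assumes context.foundation.getConnection exists with this
// shape and that connector-specific settings are passed alongside type/endpoint.
function getKafkaConsumer(context) {
    const { client } = context.foundation.getConnection({
        type: 'kafka',       // connector name under terafoundation.connectors
        endpoint: 'default', // which named connection to use, e.g. "default" above
        options: {
            type: 'consumer',
            group: 'example-group'
        },
        topic_options: {
            'enable.auto.commit': false
        }
    });
    return client;
}
```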

## Development

### Tests

Run the kafka tests

Requirements:

- `kafka` - A running instance of kafka

Environment:

- `KAFKA_BROKERS` - Defaults to `localhost:9092`

```bash
yarn test
```
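
To run the tests against a different broker, the environment variable can be set inline; the address below is just a placeholder:

```bash
# KAFKA_BROKERS overrides the default of localhost:9092 (placeholder address)
KAFKA_BROKERS="kafka.example.com:9092" yarn test
```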

### Build

Build a compiled asset bundle to deploy to a teraslice cluster.

Install Teraslice CLI:

```bash
yarn global add teraslice-cli
teraslice-cli assets build
```

## Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

## License

MIT licensed.