[WSO2-Release] [Release 5.0.13] update documentation for release 5.0.13
wso2-jenkins-bot committed Nov 26, 2021
1 parent bfceb98 commit 9989a60
Showing 5 changed files with 608 additions and 17 deletions.
README.md (14 changes: 7 additions & 7 deletions)
@@ -19,16 +19,16 @@ For information on <a target="_blank" href="https://siddhi.io/">Siddhi</a> and i

## Latest API Docs

Latest API Docs is <a target="_blank" href="https://siddhi-io.github.io/siddhi-io-kafka/api/5.0.13">5.0.13</a>.

## Features

* <a target="_blank" href="https://siddhi-io.github.io/siddhi-io-kafka/api/5.0.13/#kafka-sink">kafka</a> *(<a target="_blank" href="http://siddhi.io/en/v5.1/docs/query-guide/#sink">Sink</a>)*<br> <div style="padding-left: 1em;"><p style="word-wrap: break-word;margin: 0;">A Kafka sink publishes events processed by WSO2 SP to a topic and partition of a Kafka cluster. The events can be published in the <code>TEXT</code>, <code>XML</code>, <code>JSON</code>, or <code>Binary</code> format.<br>If the topic is not already created in the Kafka cluster, the Kafka sink creates the default partition for the given topic. The publishing topic and partition can be dynamic values taken from the Siddhi event.<br>To configure a sink to use the Kafka transport, set the <code>type</code> parameter to <code>kafka</code> (see the sink sketch after this list).</p></div>
* <a target="_blank" href="https://siddhi-io.github.io/siddhi-io-kafka/api/5.0.13/#kafka-replay-request-sink">kafka-replay-request</a> *(<a target="_blank" href="http://siddhi.io/en/v5.1/docs/query-guide/#sink">Sink</a>)*<br> <div style="padding-left: 1em;"><p style="word-wrap: break-word;margin: 0;">This sink is used to request the replay of a specific range of events on a specified partition of a topic.</p></div>
* <a target="_blank" href="https://siddhi-io.github.io/siddhi-io-kafka/api/5.0.13/#kafkamultidc-sink">kafkaMultiDC</a> *(<a target="_blank" href="http://siddhi.io/en/v5.1/docs/query-guide/#sink">Sink</a>)*<br> <div style="padding-left: 1em;"><p style="word-wrap: break-word;margin: 0;">A Kafka sink publishes events processed by WSO2 SP to a topic and partition of a Kafka cluster. The events can be published in the <code>TEXT</code>, <code>XML</code>, <code>JSON</code>, or <code>Binary</code> format.<br>If the topic is not already created in the Kafka cluster, the Kafka sink creates the default partition for the given topic. The publishing topic and partition can be dynamic values taken from the Siddhi event.<br>To configure a sink that publishes events via the Kafka transport through two Kafka brokers to the same topic, set the <code>type</code> parameter to <code>kafkaMultiDC</code>.</p></div>
* <a target="_blank" href="https://siddhi-io.github.io/siddhi-io-kafka/api/5.0.13/#kafka-source">kafka</a> *(<a target="_blank" href="http://siddhi.io/en/v5.1/docs/query-guide/#source">Source</a>)*<br> <div style="padding-left: 1em;"><p style="word-wrap: break-word;margin: 0;">A Kafka source receives events to be processed by WSO2 SP from a topic and partition of a Kafka cluster. The events received can be in the <code>TEXT</code>, <code>XML</code>, <code>JSON</code>, or <code>Binary</code> format.<br>If the topic is not already created in the Kafka cluster, the Kafka source creates the default partition for the given topic (see the source sketch after this list).</p></div>
* <a target="_blank" href="https://siddhi-io.github.io/siddhi-io-kafka/api/5.0.13/#kafka-replay-response-source">kafka-replay-response</a> *(<a target="_blank" href="http://siddhi.io/en/v5.1/docs/query-guide/#source">Source</a>)*<br> <div style="padding-left: 1em;"><p style="word-wrap: break-word;margin: 0;">This source is used to listen for replayed events requested through the kafka-replay-request sink.</p></div>
* <a target="_blank" href="https://siddhi-io.github.io/siddhi-io-kafka/api/5.0.13/#kafkamultidc-source">kafkaMultiDC</a> *(<a target="_blank" href="http://siddhi.io/en/v5.1/docs/query-guide/#source">Source</a>)*<br> <div style="padding-left: 1em;"><p style="word-wrap: break-word;margin: 0;">The Kafka Multi-Datacenter (DC) source receives records from the same topic in brokers deployed in two different Kafka clusters. It filters out all duplicate messages and ensures that the events are received in the correct order using sequential numbering. It receives events in formats such as <code>TEXT</code>, <code>XML</code>, <code>JSON</code>, and <code>Binary</code>. The Kafka source creates the default partition '0' for a given topic if the topic has not yet been created in the Kafka cluster.</p></div>
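
The kafka sink configuration described in the list above can be illustrated with a short Siddhi app. This is a minimal sketch, not part of the release documentation: the app name, stream definitions, topic name, and broker address are assumptions chosen for illustration.

```
@App:name('KafkaSinkSample')

define stream SweetProductionStream (name string, amount double);

-- Publish events arriving on SweetProductionStream to partition 0 of the
-- 'kafka_topic' topic on the Kafka broker at localhost:9092, mapped as JSON.
@sink(type='kafka',
      topic='kafka_topic',
      partition.no='0',
      bootstrap.servers='localhost:9092',
      @map(type='json'))
define stream KafkaOutputStream (name string, amount double);

from SweetProductionStream
select name, amount
insert into KafkaOutputStream;
```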

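On the consuming side, a matching sketch of the kafka source, under the same assumptions (the topic, consumer group id, and broker address are illustrative):

```
@App:name('KafkaSourceSample')

-- Consume JSON events from partition 0 of 'kafka_topic' on localhost:9092
-- as part of the consumer group 'sample.group', using a single thread.
@source(type='kafka',
        topic.list='kafka_topic',
        group.id='sample.group',
        threading.option='single.thread',
        partition.no.list='0',
        bootstrap.servers='localhost:9092',
        @map(type='json'))
define stream KafkaInputStream (name string, amount double);

define stream ProcessedStream (name string, amount double);

-- Pass the received events through for downstream processing.
from KafkaInputStream
select name, amount
insert into ProcessedStream;
```
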
## Installation
