## geode-kafka-connector

## What is geode-kafka-connector

Kafka provides an integration point through Source and Sink Connectors. The GeodeKafkaSource allows Geode to act as a data source for Kafka. The GeodeKafkaSink allows Geode to consume data off of Kafka topics and store it in Geode regions.

### How to install the geode-kafka-connector
---
#### Prerequisites
* Kafka is installed and is up and running. See the Kafka quickstart for more info: [Kafka Quickstart](https://kafka.apache.org/quickstart)
* A Geode Cluster with at least one locator and one server and regions to source from and sink to.
* Topics created in Kafka to source from and sink to.
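For the Geode and Kafka prerequisites above, a minimal setup sketch might look like the following (region and topic names are placeholders, and the exact gfsh and kafka-topics.sh flags may differ for your versions):
```
# Start a minimal Geode cluster and create example regions (names are placeholders)
gfsh -e "start locator --name=locator1 --port=10334" \
     -e "start server --name=server1 --locators=localhost[10334]" \
     -e "create region --name=someRegionToSourceFrom --type=PARTITION" \
     -e "create region --name=someRegionToConsume --type=PARTITION"

# Create example topics (run from the Kafka installation directory)
bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic someTopicToConsume
bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic someTopicToSinkFrom
```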
---
Installation of the connector is similar in process to other Kafka Connectors. For now, we will follow the guide for [Manual Installation](https://docs.confluent.io/current/connect/managing/install.html#install-connector-manually). In summary, we will use the standalone worker for this example.
* Unzip the distribution, or build it from source, into a known location that is accessible to Kafka.
* Modify connect-standalone.properties to point to where the connector is installed, for example
```
plugin.path=/Users/jhuynh/Pivotal/geode-kafka-connector/build/libs/
```
* Create and modify the connect-geode-source.properties file, for example
```
name=geode-kafka-source
connector.class=GeodeKafkaSource
tasks.max=1
regionToTopics=[someRegionToSourceFrom:someTopicToConsume]
locators=localhost[10334]
```
* Create and modify the connect-geode-sink.properties file, for example
```
name=geode-kafka-sink
connector.class=GeodeKafkaSink
tasks.max=1
topicToRegions=[someTopicToSinkFrom:someRegionToConsume]
locators=localhost[10334]
```

* Run the standalone worker
```
bin/connect-standalone.sh config/connect-standalone.properties config/connect-geode-source.properties config/connect-geode-sink.properties
```
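As a quick sanity check (a sketch assuming the placeholder region and topic names used above), put an entry into the source region and confirm it arrives on the topic:
```
# Put an entry into the region the source connector watches
gfsh -e "connect --locator=localhost[10334]" \
     -e "put --region=someRegionToSourceFrom --key=k1 --value=hello"

# Consume from the topic the source connector publishes to
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic someTopicToConsume --from-beginning
```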
### Configuration Properties
#### GeodeKafkaSink Properties
| Property | Required | Description | Default |
|---|---|---|---|
| locators | no, but should be set to match your cluster | A comma separated string of locators that configure which locators to connect to | "localhost[10334]" |
| topicToRegions | yes | A comma separated list of "one topic to many regions" bindings. Each binding is surrounded by brackets. For example "[topicName:regionName], [anotherTopic:regionName, anotherRegion]" | None. This is required to be set in the sink connector properties |
| nullValuesMeanRemove | no | If set to true, when topics send a SinkRecord with a null value, we will convert it to an operation similar to region.remove instead of putting a null value into the region | true |

* The topicToRegions property allows us to create mappings between topics and regions. A single one-to-one mapping would look similar to "[topic:region]". A one-to-many mapping can be made by comma separating the regions, for example "[topic:region1,region2]". This is equivalent to both regions being consumers of the topic.
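For example, a sink configuration that fans one topic out to two regions could look like the sketch below (topic and region names are placeholders):
```
name=geode-kafka-sink
connector.class=GeodeKafkaSink
tasks.max=1
# every record from someTopic is written to both regions
topicToRegions=[someTopic:regionOne,regionTwo]
# treat records with null values as removes instead of null puts
nullValuesMeanRemove=true
locators=localhost[10334]
```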

#### GeodeKafkaSource Properties
| Property | Required | Description | Default |
|---|---|---|---|
| locators | no, but should be set to match your cluster | A comma separated string of locators that configure which locators to connect to | "localhost[10334]" |
| regionToTopics | yes | A comma separated list of "one region to many topics" mappings. Each mapping is surrounded by brackets. For example "[regionName:topicName], [anotherRegion:topicName, anotherTopic]" | None. This is required to be set in the source connector properties |
| geodeConnectorBatchSize | no | Maximum number of records to return on each poll | 100 |
| geodeConnectorQueueSize | no | Maximum number of entries in the connector queue before backing up all Geode cq listeners sharing the task queue | 10000 |
| durableClientTimeout | no | How long in milliseconds to persist values in Geode's durable queue before the queue is invalidated | 60000 |
| cqPrefix | no | Prefix string to identify Connector cq's on a Geode server | cqForGeodeKafka |

* The regionToTopics property allows us to create mappings between regions and topics. A single one-to-one mapping would look similar to "[region:topic]". A one-to-many mapping can be made by comma separating the topics, for example "[region:topic1,topic2]". This is equivalent to the region being a producer for both topics.
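For example, a source configuration that publishes one region to two topics, with some of the optional tuning properties shown at their defaults, could look like this sketch (region and topic names are placeholders):
```
name=geode-kafka-source
connector.class=GeodeKafkaSource
tasks.max=1
# every event on someRegion is published to both topics
regionToTopics=[someRegion:topicOne,topicTwo]
# optional tuning properties, shown at their default values
geodeConnectorBatchSize=100
geodeConnectorQueueSize=10000
durableClientTimeout=60000
cqPrefix=cqForGeodeKafka
locators=localhost[10334]
```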

---

* Consider modifying Kafka properties such as tasks.max in the source and sink connector properties.

Possible Upcoming Features:
* Formatters - Possibly a JSON to and from PDX formatter
* Security - security settings for Geode
* Dynamic Region creation - Dynamically create regions when topics are created (filter what names to look for and what types of regions to create)
* Allow a single worker to connect to multiple Geode Clusters?
