
Unable to install kafka-connect-datagen:0.1.0 #654

Open
mstolin opened this issue Dec 18, 2018 · 36 comments

Comments


mstolin commented Dec 18, 2018

After running docker-compose up -d in examples/cp-all-in-one, I get the following error:

$ docker-compose up -d
Building connect
Downloading context: https://github.com/confluentinc/kafka-connect-datagen/raw/master/Dockerfile-confluenthub     778B
Step 1/3 : FROM confluentinc/cp-kafka-connect:5.0.0
 ---> 7df8759460f7
Step 2/3 : ENV CONNECT_PLUGIN_PATH="/usr/share/java,/usr/share/confluent-hub-components"
 ---> Using cache
 ---> 54861200b09e
Step 3/3 : RUN  confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.1.0
 ---> Running in e32e29b5eae7
Running in a "--no-prompt" mode 
java.net.UnknownHostException: api.hub.confluent.io 
 
Error: Unknown error 
ERROR: Service 'connect' failed to build: The command '/bin/sh -c confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.1.0' returned a non-zero code: 7

I was able to avoid this error after I set connect back to image: confluentinc/cp-kafka-connect:5.0.0.
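The java.net.UnknownHostException in the build log means the build container could not resolve api.hub.confluent.io; docker build containers inherit DNS from the Docker daemon rather than from the host shell. One common workaround, sketched here as a possibility rather than a confirmed fix for this particular issue, is to give the daemon explicit DNS servers in /etc/docker/daemon.json and restart Docker (the Google resolvers below are placeholders; a corporate or VPN environment may need its own servers instead):

{
  "dns": ["8.8.8.8", "8.8.4.4"]
}

$ sudo systemctl restart docker
$ docker-compose build --no-cache connect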


ybyzek commented Dec 19, 2018

@mstolin

From the command line (i.e., outside of Docker), what is the output from the following command?

curl https://api.hub.confluent.io/api/plugins
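To separate host-level DNS from Docker-level DNS, the same endpoint can also be probed from inside the base image. A minimal check, assuming curl is available in confluentinc/cp-kafka-connect:5.0.0 and that the image's default command can be overridden:

$ docker run --rm confluentinc/cp-kafka-connect:5.0.0 \
    curl -sS https://api.hub.confluent.io/api/plugins | head -c 200

If this fails to resolve the host while the same curl succeeds on the host machine, the problem is in Docker's DNS configuration rather than in the network itself.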


gAmUssA commented Dec 19, 2018

@mstolin which branch are you using?


mstolin commented Dec 19, 2018

@gAmUssA
5.1.0-post

@ybyzek

[{"name":"vertica-analytics-platform","version":"9.0.0","manifest_url":"https://api.hub.confluent.io/api/plugins/vertica/vertica-analytics-platform/versions/9.0.0","title":"Vertica Analytics Platform","description":"Vertica is the most advanced SQL database analytics portfolio built from the very first line of code to address the most demanding Big Data analytics initiatives.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/vertica/vertica-analytics-platform/versions/9.0.0/assets/Vertica.jpg","documentation_url":"https://my.vertica.com/docs/9.0.x/HTML/","source_url":null,"support":null,"owner":{"username":"vertica","type":"Organization","name":"Vertica","url":"https://my.vertica.com/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/vertica/vertica-analytics-platform/versions/9.0.0/assets/Vertica.jpg"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Proprietary","url":null,"logo":null}],"component_types":["sink","source"],"release_date":null,"tags":["Analytics"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"debezium-connector-mysql","version":"0.8.3","manifest_url":"https://api.hub.confluent.io/api/plugins/debezium/debezium-connector-mysql/versions/0.8.3","title":"Debezium MySQL CDC Connector","description":"Debezium’s MySQL Connector can monitor and record all of the row-level changes in the databases on a <a href=\"https://www.mysql.com/\">MySQL</a> server or <a href=\"https://www.mysql.com/products/cluster/availability.html\">HA MySQL cluster</a>. The first time it connects to a MySQL server/cluster, it reads a consistent snapshot of all of the databases. When that snapshot is complete, the connector continuously reads the changes that were committed to MySQL 5.6 or later and generates corresponding insert, update and delete events. All of the events for each table are recorded in a separate Kafka topic, where they can be easily consumed by applications and services.\n\nAs of Debezium 0.4.0, this connector adds preliminary support for <a href=\"https://aws.amazon.com/rds/mysql/\">Amazon RDS</a> and <a href=\"https://aws.amazon.com/rds/aurora/\">Amazon Aurora (MySQL compatibility)</a>. 
However, due to limitations of these hosted forms of MySQL, the connector retains locks during an initial consistent snapshot for the duration of the snapshot.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/debezium/debezium-connector-mysql/versions/0.8.3/assets/color_debezium_256px.png","documentation_url":"http://debezium.io/docs/connectors/mysql/","source_url":"https://github.com/debezium/debezium/","support":null,"owner":{"username":"debezium","type":"Organization","name":"Debezium Community","url":"https://debezium.io","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/debezium/debezium-connector-mysql/versions/0.8.3/assets/color_debezium_256px.png"},"archive":{"name":"debezium-debezium-connector-mysql-0.8.3.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/debezium/debezium-connector-mysql/versions/0.8.3/debezium-debezium-connector-mysql-0.8.3.zip","mime_type":"application/zip","md5":"e961a734c7f71ea6139abf139eec7c72","sha1":"952565f693e3c4350634104a77464756daab1212","asc":null},"docker_image":{"namespace":"debezium","name":"debezium/connect","tag":"0.8.3","registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache 2.0","url":"https://github.com/debezium/debezium/blob/master/LICENSE.txt","logo":null}],"component_types":["source"],"release_date":null,"tags":["change data capture","rdbms","cdc","mysql","dbms","relational","jdbc","amazon rds","amazon aurora","snapshot"],"requirements":["MySQL 5.6.x, 5.7.x, or later"],"signatures":null,"last_modified":1544832647000},{"name":"debezium-connector-postgresql","version":"0.8.3","manifest_url":"https://api.hub.confluent.io/api/plugins/debezium/debezium-connector-postgresql/versions/0.8.3","title":"Debezium PostgreSQL CDC Connector","description":"Debezium’s PostgreSQL Connector can monitor and record the row-level changes in the schemas of a <a href=\"\">PostgreSQL database</a>. The first time it connects to a PostgreSQL server/cluster, it reads a consistent snapshot of all of the schemas. When that snapshot is complete, the connector continuously streams the changes that were committed to PostgreSQL 9.6 or later and generates corresponding insert, update and delete events. All of the events for each table are recorded in a separate Kafka topic, where they can be easily consumed by applications and services.\nThis connector requires the PostgreSQL server have a logical decoding plugin installed and configured. 
See the <a href=\"http://debezium.io/docs/connectors/postgresql/\">documentation</a> for details.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/debezium/debezium-connector-postgresql/versions/0.8.3/assets/color_debezium_256px.png","documentation_url":"http://debezium.io/docs/connectors/postgresql/","source_url":"https://github.com/debezium/debezium/","support":null,"owner":{"username":"debezium","type":"Organization","name":"Debezium Community","url":"https://debezium.io","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/debezium/debezium-connector-postgresql/versions/0.8.3/assets/color_debezium_256px.png"},"archive":{"name":"debezium-debezium-connector-postgresql-0.8.3.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/debezium/debezium-connector-postgresql/versions/0.8.3/debezium-debezium-connector-postgresql-0.8.3.zip","mime_type":"application/zip","md5":"83395ba2db6484b613bc5acf7138b34f","sha1":"dc656c0865c08d7e70d237be323b4f4fa15ce53e","asc":null},"docker_image":{"namespace":"debezium","name":"debezium/connect","tag":"0.8.3","registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache 2.0","url":"https://github.com/debezium/debezium/blob/master/LICENSE.txt","logo":null}],"component_types":["source"],"release_date":null,"tags":["change data capture","rdbms","cdc","postgresql","json","dbms","relational","postgres","snapshot"],"requirements":["PostgreSQL 9.6 or later"],"signatures":null,"last_modified":1544832651000},{"name":"debezium-connector-mongodb","version":"0.8.3","manifest_url":"https://api.hub.confluent.io/api/plugins/debezium/debezium-connector-mongodb/versions/0.8.3","title":"Debezium MongoDB CDC Connector","description":"Debezium’s MongoDB Connector can monitor a <a href=\"https://docs.mongodb.com/manual/tutorial/deploy-replica-set/\">MongoDB replica set</a> or a <a href=\"https://docs.mongodb.com/manual/core/sharded-cluster-components/\">MongoDB sharded cluster</a> for document changes in databases and collections, recording those changes as events in Kafka topics. 
The connector automatically handles the addition or removal of shards in a sharded cluster, changes in membership of each replica set, elections within each replica set, and awaiting the resolution of communications problems.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/debezium/debezium-connector-mongodb/versions/0.8.3/assets/color_debezium_256px.png","documentation_url":"http://debezium.io/docs/connectors/mongodb/","source_url":"https://github.com/debezium/debezium/","support":null,"owner":{"username":"debezium","type":"Organization","name":"Debezium Community","url":"https://debezium.io","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/debezium/debezium-connector-mongodb/versions/0.8.3/assets/color_debezium_256px.png"},"archive":{"name":"debezium-debezium-connector-mongodb-0.8.3.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/debezium/debezium-connector-mongodb/versions/0.8.3/debezium-debezium-connector-mongodb-0.8.3.zip","mime_type":"application/zip","md5":"336667c5469ba03e97471418f785e5bb","sha1":"014b282a5f8b6526aa2b66abc337663be0914ede","asc":null},"docker_image":{"namespace":"debezium","name":"debezium/connect","tag":"0.8.3","registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache 2.0","url":"https://github.com/debezium/debezium/blob/master/LICENSE.txt","logo":null}],"component_types":["source"],"release_date":null,"tags":["change data capture","cdc","document database","json","mongodb","snapshot"],"requirements":["MongoDB 3.6.x and later"],"signatures":null,"last_modified":1544832644000},{"name":"kafka-connect-ibmmq","version":"5.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-ibmmq/versions/5.1.0","title":"Kafka Connect IBM MQ","description":"The IBM MQ Source Connector is used to read messages from an IBM MQ cluster and write them to a Kafka topic.\n\nIt is included in <a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a>, or can be downloaded and installed separately. It can be used for free for 30 days, but after that does require an Enterprise license. 
<a href=\"https://www.confluent.io/contact/\">Contact Confluent</a> for more details.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-ibmmq/versions/5.1.0/assets/ibm-mq.jpg","documentation_url":"https://docs.confluent.io/5.1.0/connect/connect-jms/kafka-connect-ibmmq/docs/index.html","source_url":null,"support":{"provider_name":"Confluent, Inc.","summary":"This connector is <a href=\"http://confluent.io/subscription/\">fully supported by\nConfluent</a> as part of a\n<a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a> subscription.","url":"http://confluent.io/subscription/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-ibmmq/versions/5.1.0/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"http://confluent.io","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-ibmmq/versions/5.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-ibmmq-5.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-ibmmq/versions/5.1.0/confluentinc-kafka-connect-ibmmq-5.1.0.zip","mime_type":"application/zip","md5":"a2de5898c3e732fc6778c92a87877f6b","sha1":"ee3239ca3ac7510737f637703b0969a2d1697187","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["source"],"release_date":"2018-12-15","tags":["JMS","IBM MQ","Message Broker"],"requirements":null,"signatures":null,"last_modified":1544899191000},{"name":"kafka-connect-activemq","version":"5.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-activemq/versions/5.1.0","title":"Kafka Connect ActiveMQ","description":"The ActiveMQ Source Connector is used to read messages from an ActiveMQ cluster and write them to a Kafka topic.\n\nIt is included in <a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a>, or can be downloaded and installed separately. It can be used for free for 30 days, but after that does require an Enterprise license. 
<a href=\"https://www.confluent.io/contact/\">Contact Confluent</a> for more details.","logo":null,"documentation_url":"https://docs.confluent.io/5.1.0/connect/connect-jms/kafka-connect-activemq/docs/index.html","source_url":null,"support":{"provider_name":"Confluent, Inc.","summary":"This connector is <a href=\"http://confluent.io/subscription/\">fully supported by\nConfluent</a> as part of a\n<a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a> subscription.","url":"http://confluent.io/subscription/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-activemq/versions/5.1.0/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"http://confluent.io","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-activemq/versions/5.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-activemq-5.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-activemq/versions/5.1.0/confluentinc-kafka-connect-activemq-5.1.0.zip","mime_type":"application/zip","md5":"a7698b5be35af036bbcd7d444d0a6fb4","sha1":"0cdbf9082e26bada4943f253d3697dd2d7a8cca5","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["source"],"release_date":"2018-12-15","tags":["JMS","AMQ","Message Broker","ActiveMQ"],"requirements":null,"signatures":null,"last_modified":1544899166000},{"name":"kafka-connect-mqtt","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-mqtt/versions/1.0.0-preview","title":"Kafka Connect MQTT","description":"A Kafka Connect plugin for sending and receiving data from a Mqtt broker.","logo":null,"documentation_url":"https://docs.confluent.io/current/connect/kafka-connect-mqtt/","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":null,"name":"Confluent, Inc.","url":null,"logo":null},"archive":{"name":"confluentinc-kafka-connect-mqtt-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-mqtt/versions/1.0.0-preview/confluentinc-kafka-connect-mqtt-1.0.0-preview.zip","mime_type":"application/zip","md5":"be4ba14518f5c33e19c7ced272db8158","sha1":"b56241d93962d6c8ca65ecaed01014a49c616fa1","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["sink","source"],"release_date":"2018-10-18","tags":["MQTT","Internet of 
Things","IOT"],"requirements":null,"signatures":null,"last_modified":1544550369000},{"name":"kafka-connect-syslog","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-syslog/versions/1.0.0-preview","title":"Kafka Connect Syslog","description":"The Confluent Syslog Connector is used to move messages from Network devices into Kafka.","logo":null,"documentation_url":"https://docs.confluent.io/connect/kafka-connect-syslog/","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"https://confluent.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-syslog/versions/1.0.0-preview/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-syslog-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-syslog/versions/1.0.0-preview/confluentinc-kafka-connect-syslog-1.0.0-preview.zip","mime_type":"application/zip","md5":"8427bded9738eab907eb4a773ef457c4","sha1":"cbc7ec14c4126ddf5f0738ffec9828f8141703f7","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["source"],"release_date":"2018-11-23","tags":["Logging","Syslog","Network"],"requirements":null,"signatures":null,"last_modified":1544549739000},{"name":"kafka-connect-maprdb","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-maprdb/versions/1.0.0-preview","title":"Kafka Connect MapRDB","description":"A Kafka Connect plugin for writing data from Kafka to MapR DB.","logo":null,"documentation_url":"https://github.com/jcustenborder/kafka-connect-maprdb","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":null,"name":"Confluent, Inc.","url":null,"logo":null},"archive":{"name":"confluentinc-kafka-connect-maprdb-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-maprdb/versions/1.0.0-preview/confluentinc-kafka-connect-maprdb-1.0.0-preview.zip","mime_type":"application/zip","md5":"9802525e8d62179534324178fdfbd020","sha1":"91364a5e4bc7ddb8e53d43d39c75d984caa80710","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation 
License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["sink"],"release_date":"2018-10-18","tags":["MapR","MapRDB"],"requirements":null,"signatures":null,"last_modified":1544550373000},{"name":"kafka-connect-jdbc","version":"5.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-jdbc/versions/5.1.0","title":"Kafka Connect JDBC","description":"The JDBC source connector allows you to import data from any relational database with a JDBC driver into Kafka topics. By using JDBC, this connector can support a wide variety of databases without requiring custom code for each one.\n\nData is loaded by periodically executing a SQL query and creating an output record for each row in the result set. By default, all tables in a database are copied, each to its own output topic. The database is monitored for new or deleted tables and adapts automatically. When copying data from a table, the connector can load only new or modified rows by specifying which columns should be used to detect new or modified data.\n\nThe JDBC sink connector allows you to export data from Kafka topics to any relational database with a JDBC driver. By using JDBC, this connector can support a wide variety of databases without requiring a dedicated connector for each one. The connector polls data from Kafka to write to the database based on the topics subscription. It is possible to achieve idempotent writes with upserts. Auto-creation of tables, and limited auto-evolution is also supported.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-jdbc/versions/5.1.0/assets/jdbc.jpg","documentation_url":"https://docs.confluent.io/5.1.0/connect/connect-jdbc/docs/index.html","source_url":"https://github.com/confluentinc/kafka-connect-jdbc","support":{"provider_name":"Confluent, Inc.","summary":"Confluent supports the JDBC sink and source connectors alongside community members as part of its Confluent Platform offering.","url":"https://docs.confluent.io/current/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-jdbc/versions/5.1.0/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"https://confluent.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-jdbc/versions/5.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-jdbc-5.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-jdbc/versions/5.1.0/confluentinc-kafka-connect-jdbc-5.1.0.zip","mime_type":"application/zip","md5":"3b54b8778808af2a4f773d7de196e552","sha1":"1da0e39fe0c9b9e2d845dfc25bb2b0600165e0ad","asc":null},"docker_image":{"namespace":"confluentinc","name":"cp-kafka-connect","tag":"5.1.0","registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Community License","url":"http://www.confluent.io/confluent-community-license","logo":null}],"component_types":["sink","source"],"release_date":"2018-12-15","tags":["rdbms","oracle","sybase","vertica","sqlite","jdbc","dbms","sql server","sql","database","postgresql","db2","derby","mysql","sap 
hana"],"requirements":null,"signatures":null,"last_modified":1544900310000},{"name":"kafka-connect-s3","version":"5.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-s3/versions/5.1.0","title":"Kafka Connect S3","description":"The S3 connector, currently available as a sink, allows you to export data from Kafka topics to S3 objects in either Avro or JSON formats. In addition, for certain data layouts, S3 connector exports data by guaranteeing exactly-once delivery semantics to consumers of the S3 objects it produces.\n\nBeing a sink, the S3 connector periodically polls data from Kafka and in turn uploads it to S3. A partitioner is used to split the data of every Kafka partition into chunks. Each chunk of data is represented as an S3 object, whose key name encodes the topic, the Kafka partition and the start offset of this data chunk. If no partitioner is specified in the configuration, the default partitioner which preserves Kafka partitioning is used. The size of each data chunk is determined by the number of records written to S3 and by schema compatibility.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-s3/versions/5.1.0/assets/s3.jpg","documentation_url":"https://docs.confluent.io/5.1.0/connect/connect-storage-cloud/kafka-connect-s3/docs/index.html","source_url":"https://github.com/confluentinc/kafka-connect-storage-cloud","support":{"provider_name":"Confluent, Inc.","summary":"Confluent supports the S3 sink connector alongside community members as part of its Confluent Platform offering.","url":"https://docs.confluent.io/current/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-s3/versions/5.1.0/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"https://confluent.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-s3/versions/5.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-s3-5.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-s3/versions/5.1.0/confluentinc-kafka-connect-s3-5.1.0.zip","mime_type":"application/zip","md5":"9fc9e88970f48bd540df61d3226946c8","sha1":"9dcdabcd2226ad03d47b8cfb39760e295e43a268","asc":null},"docker_image":{"namespace":"confluentinc","name":"cp-kafka-connect","tag":"5.1.0","registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Community License","url":"http://www.confluent.io/confluent-community-license","logo":null}],"component_types":["sink"],"release_date":"2018-12-15","tags":["s3","aws"],"requirements":["AWS S3 bucket with write permissions"],"signatures":null,"last_modified":1544900663000},{"name":"kafka-connect-cdc-mssql","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-cdc-mssql/versions/1.0.0-preview","title":"Kafka Connect CDC Microsoft SQL","description":"Kafka Connect plugin for reading changes from Microsoft SQL Server utilizing the change tracking feature.","logo":null,"documentation_url":"https://docs.confluent.io/current/connect/kafka-connect-cdc-mssql/","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a 
href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":null,"name":"Confluent, Inc.","url":null,"logo":null},"archive":{"name":"confluentinc-kafka-connect-cdc-mssql-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-cdc-mssql/versions/1.0.0-preview/confluentinc-kafka-connect-cdc-mssql-1.0.0-preview.zip","mime_type":"application/zip","md5":"0e767c2262b2d1becbdae873080de646","sha1":"ad1a1cf3d4080a331520da90728c1ad05148af55","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["source"],"release_date":"2018-10-18","tags":["CDC","Microsoft SQL Server","Change Tracking"],"requirements":null,"signatures":null,"last_modified":1544550381000},{"name":"kafka-connect-salesforce","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-salesforce/versions/1.0.0-preview","title":"Kafka Connect Salesforce","description":"The Salesforce connector integrates Salesforce.com with Kafka. The Salesforce Source Connector is used to capture changes from Salesforce.com utilizing the Salesforce streaming API.","logo":null,"documentation_url":"https://docs.confluent.io/current/connect/kafka-connect-salesforce/","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":null,"name":"Confluent, Inc.","url":null,"logo":null},"archive":{"name":"confluentinc-kafka-connect-salesforce-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-salesforce/versions/1.0.0-preview/confluentinc-kafka-connect-salesforce-1.0.0-preview.zip","mime_type":"application/zip","md5":"6e221b1f16fc82871217772135113477","sha1":"c06d7b1c51ba32c99314662bb8c8a2cc8226f359","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["source"],"release_date":"2018-06-22","tags":["Salesforce"],"requirements":null,"signatures":null,"last_modified":1543256415000},{"name":"kafka-connect-avro-converter","version":"5.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-avro-converter/versions/5.1.0","title":"Kafka Connect Avro Converter","description":"The Kafka Connect Avro Converter integrates with <a href=\"https://docs.confluent.io/current/schema-registry/docs/intro.html\">Schema Registry</a> to convert data for Kafka Connect to and from Avro 
format.","logo":null,"documentation_url":"https://docs.confluent.io/current/schema-registry/docs/connect.html","source_url":"https://github.com/confluentinc/schema-registry","support":{"provider_name":"Confluent, Inc.","summary":"Confluent supports the Avro Converter alongside community members as part of its Confluent Platform offering.","url":"https://docs.confluent.io/current/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-avro-converter/versions/5.1.0/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"https://confluent.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-avro-converter/versions/5.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-avro-converter-5.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-avro-converter/versions/5.1.0/confluentinc-kafka-connect-avro-converter-5.1.0.zip","mime_type":"application/zip","md5":"e3b2060768a9e1cf5d7dd5dfdb848d17","sha1":"2e95b1ed53016f172445f31ce247514674e4f9d6","asc":null},"docker_image":{"namespace":"confluentinc","name":"cp-kafka-connect","tag":"5.1.0","registry":null},"confluent_verified":null,"features":{"supported_encodings":["avro"],"single_message_transforms":false,"confluent_control_center_integration":false,"kafka_connect_api":false,"delivery_guarantee":null},"license":[{"name":"Apache License 2.0","url":"http://www.apache.org/licenses/LICENSE-2.0.html","logo":null}],"component_types":["converter"],"release_date":"2018-12-15","tags":["schema registry","avro"],"requirements":null,"signatures":null,"last_modified":1544901654000},{"name":"kafka-connect-replicator","version":"5.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-replicator/versions/5.1.0","title":"Confluent Kafka Replicator","description":"Replicator allows you to easily and reliably replicate topics from one Kafka cluster to another. It continuously copies the messages in multiple topics, when necessary creating the topics in the destination cluster using the same topic configuration in the source cluster.\nThis includes preserving the number of partitions, the replication factor, and any configuration overrides specified for individual topics.\n\nReplicator is included in <a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a>, or can be downloaded and installed separately. It can be used for free for 30 days, but after that does require an Enterprise license. 
<a href=\"https://www.confluent.io/contact/\">Contact Confluent</a> for more details.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-replicator/versions/5.1.0/assets/apache-kafka.png","documentation_url":"https://docs.confluent.io/current/connect/connect-replicator/docs/index.html","source_url":"none","support":{"provider_name":"Confluent, Inc.","summary":"This connector is <a href=\"http://confluent.io/subscription/\">fully supported by\nConfluent</a> as part of a\n<a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a> subscription.","url":"http://confluent.io/subscription/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-replicator/versions/5.1.0/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"http://confluent.io","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-replicator/versions/5.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-replicator-5.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-replicator/versions/5.1.0/confluentinc-kafka-connect-replicator-5.1.0.zip","mime_type":"application/zip","md5":"25c6f9b53a95b12b33492daf0d7a130d","sha1":"b3ec4fb704d37d38aedefc68a7f852d91996ae8e","asc":null},"docker_image":{"namespace":"confluentinc","name":"cp-enterprise-replicator","tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["source"],"release_date":"2018-12-15","tags":["Aggregation","Multi-DC","Active","Passive","Kafka","Cluster","Replication","Multiple data center"],"requirements":null,"signatures":null,"last_modified":1544900443000},{"name":"kafka-connect-jms","version":"5.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-jms/versions/5.1.0","title":"Kafka Connect JMS","description":"The Confluent JMS Source Connector is used to move messages from any JMS-compliant broker into Kafka.\nIt supports any traditional JMS Broker, such as IBM MQ, ActiveMQ, Tibco EMS, and Solace Appliance.\nThis connector uses JNDI to connect to the JMS broker, consume messages from the specified topic or queue, and write them into the specified Kafka topic.\n\nIt is included in <a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a>, or can be downloaded and installed separately. It can be used for free for 30 days, but after that does require an Enterprise license. 
<a href=\"https://www.confluent.io/contact/\">Contact Confluent</a> for more details.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-jms/versions/5.1.0/assets/jms.jpeg","documentation_url":"https://docs.confluent.io/5.1.0/connect/connect-jms/kafka-connect-jms/docs/index.html","source_url":null,"support":{"provider_name":"Confluent, Inc.","summary":"This connector is <a href=\"http://confluent.io/subscription/\">fully supported by\nConfluent</a> as part of a\n<a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a> subscription.","url":"http://confluent.io/subscription/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-jms/versions/5.1.0/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"http://confluent.io","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-jms/versions/5.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-jms-5.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-jms/versions/5.1.0/confluentinc-kafka-connect-jms-5.1.0.zip","mime_type":"application/zip","md5":"573c307cd6246933c9c679125eecfe71","sha1":"a79e521905a33d6367ae6f5bde499e8b8d8c5ab8","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["source"],"release_date":"2018-12-15","tags":["JMS","Message Broker"],"requirements":null,"signatures":null,"last_modified":1544899093000},{"name":"kafka-connect-neo4j","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-neo4j/versions/1.0.0-preview","title":"Kafka Connect Neo4j","description":"The Neo4j connector is a Kafka Connect plugin for integration with <a href=\"https://neo4j.com/\">Neo4j</a>.","logo":null,"documentation_url":"https://docs.confluent.io/connect/kafka-connect-neo4j/","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, 
Inc.","url":"https://confluent.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-neo4j/versions/1.0.0-preview/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-neo4j-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-neo4j/versions/1.0.0-preview/confluentinc-kafka-connect-neo4j-1.0.0-preview.zip","mime_type":"application/zip","md5":"36133fb4e87edf1855579667f655450b","sha1":"54efaf3b40d275105f44f807452d775641d219f1","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["sink"],"release_date":"2018-12-07","tags":["Graph","Neo4j","Database"],"requirements":null,"signatures":null,"last_modified":1544213888000},{"name":"kafka-connect-rabbitmq","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-rabbitmq/versions/1.0.0-preview","title":"Kafka Connect RabbitMQ","description":"A Kafka Connect connector for reading data from RabbitMQ.","logo":null,"documentation_url":"https://docs.confluent.io/current/connect/kafka-connect-rabbitmq/","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":null,"name":"Confluent, Inc.","url":null,"logo":null},"archive":{"name":"confluentinc-kafka-connect-rabbitmq-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-rabbitmq/versions/1.0.0-preview/confluentinc-kafka-connect-rabbitmq-1.0.0-preview.zip","mime_type":"application/zip","md5":"ae6c27e9467e4d76f767c3985b8121a0","sha1":"9ec3ff67a40727a642c5f72631607a2be263a098","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["source","transform"],"release_date":"2018-10-10","tags":["RabbitMQ","Messaging"],"requirements":null,"signatures":null,"last_modified":1544550385000},{"name":"kafka-connect-elasticsearch","version":"5.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-elasticsearch/versions/5.1.0","title":"Kafka Connect Elasticsearch","description":"The Elasticsearch connector allows moving data from Kafka to Elasticsearch. It writes data from a topic in Kafka to an index in Elasticsearch and all data for a topic have the same type.\n\nElasticsearch is often used for text queries, analytics and as an key-value store (use cases). The connector covers both the analytics and key-value store use cases. 
For the analytics use case, each message is in Kafka is treated as an event and the connector uses topic+partition+offset as a unique identifier for events, which then converted to unique documents in Elasticsearch. For the key-value store use case, it supports using keys from Kafka messages as document ids in Elasticsearch and provides configurations ensuring that updates to a key are written to Elasticsearch in order. For both use cases, Elasticsearch’s idempotent write semantics guarantees exactly once delivery.\n\nMapping is the process of defining how a document, and the fields it contains, are stored and indexed. Users can explicitly define mappings for types in indices. When a mapping is not explicitly defined, Elasticsearch can determine field names and types from data, however, some types such as timestamp and decimal, may not be correctly inferred. To ensure that the types are correctly inferred, the connector provides a feature to infer a mapping from the schemas of Kafka messages.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-elasticsearch/versions/5.1.0/assets/elasticsearch.jpg","documentation_url":"https://docs.confluent.io/5.1.0/connect/connect-elasticsearch/docs/index.html","source_url":"https://github.com/confluentinc/kafka-connect-elasticsearch","support":{"provider_name":"Confluent, Inc.","summary":"Confluent supports the Elasticsearch sink connector alongside community members as part of its Confluent Platform offering.","url":"https://docs.confluent.io/current/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-elasticsearch/versions/5.1.0/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"https://confluent.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-elasticsearch/versions/5.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-elasticsearch-5.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-elasticsearch/versions/5.1.0/confluentinc-kafka-connect-elasticsearch-5.1.0.zip","mime_type":"application/zip","md5":"331b96745dc89d89af1df5f340939564","sha1":"e695512934e25eee8810091603d7ebdc9fca350c","asc":null},"docker_image":{"namespace":"confluentinc","name":"cp-kafka-connect","tag":"5.1.0","registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Community License","url":"http://www.confluent.io/confluent-community-license","logo":null}],"component_types":["sink"],"release_date":"2018-12-15","tags":["analytics","search","Elastic","elasticsearch","log"],"requirements":["Elasticsearch 2.x, 5.x, or 6.x"],"signatures":null,"last_modified":1544899410000},{"name":"kafka-connect-hdfs","version":"5.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-hdfs/versions/5.1.0","title":"Kafka Connect HDFS","description":"The HDFS connector allows you to export data from Kafka topics to HDFS files in a variety of formats and integrates with Hive to make data immediately available for querying with HiveQL.\n\nThe connector periodically polls data from Kafka and writes them to HDFS. The data from each Kafka topic is partitioned by the provided partitioner and divided into chunks. 
Each chunk of data is represented as an HDFS file with topic, Kafka partition, start and end offsets of this data chunk in the filename. If no partitioner is specified in the configuration, the default partitioner which preserves the Kafka partitioning is used. The size of each data chunk is determined by the number of records written to HDFS, the time written to HDFS and schema compatibility.\n\nThe HDFS connector integrates with Hive and when it is enabled, the connector automatically creates an external Hive partitioned table for each Kafka topic and updates the table according to the available data in HDFS.","logo":null,"documentation_url":"https://docs.confluent.io/5.1.0/connect/connect-hdfs/docs/index.html","source_url":"https://github.com/confluentinc/kafka-connect-hdfs","support":{"provider_name":"Confluent, Inc.","summary":"Confluent supports the HDFS sink connector alongside community members as part of its Confluent Platform offering.","url":"https://docs.confluent.io/current/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-hdfs/versions/5.1.0/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"https://confluent.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-hdfs/versions/5.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-hdfs-5.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-hdfs/versions/5.1.0/confluentinc-kafka-connect-hdfs-5.1.0.zip","mime_type":"application/zip","md5":"68c21b2a59868aca10eebc67b8f937a5","sha1":"91bda801dc8811eb5a5214a17f69a597fe08a952","asc":null},"docker_image":{"namespace":"confluentinc","name":"cp-kafka-connect","tag":"5.1.0","registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Community License","url":"http://www.confluent.io/confluent-community-license","logo":null}],"component_types":["sink"],"release_date":"2018-12-15","tags":["hive","hdfs","hadoop"],"requirements":null,"signatures":null,"last_modified":1544900065000},{"name":"kafka-connect-influxdb","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-influxdb/versions/1.0.0-preview","title":"Kafka Connect InfluxDB","description":"This plugin has both a source and a sink for writing data to InfluxDB. The Sink Connector will process the data and batch the payload based on the host. 
The Source connector will emulate an InfluxDB endpoint and allow a standard client to write data.","logo":null,"documentation_url":"https://docs.confluent.io/current/connect/kafka-connect-influxdb/","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":null,"name":"Confluent, Inc.","url":null,"logo":null},"archive":{"name":"confluentinc-kafka-connect-influxdb-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-influxdb/versions/1.0.0-preview/confluentinc-kafka-connect-influxdb-1.0.0-preview.zip","mime_type":"application/zip","md5":"7df548cc1dc05f8a82808a3099934739","sha1":"92708247394325b3604792e9c2334fe5face921a","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["sink"],"release_date":"2018-06-21","tags":["Time Series","Database"],"requirements":null,"signatures":null,"last_modified":1535074638000},{"name":"kafka-connect-kinesis","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-kinesis/versions/1.0.0-preview","title":"Kafka Connect Kinesis","description":"Kafka Connect plugin for receiving data from Amazon Kinesis","logo":null,"documentation_url":"https://docs.confluent.io/current/connect/kafka-connect-kinesis/","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":null,"name":"Confluent, Inc.","url":null,"logo":null},"archive":{"name":"confluentinc-kafka-connect-kinesis-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-kinesis/versions/1.0.0-preview/confluentinc-kafka-connect-kinesis-1.0.0-preview.zip","mime_type":"application/zip","md5":"37195aeb050a006dfbbbfe3b607cb76f","sha1":"999110f6423f847876b6ed6fca25ae88fb015e3c","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["source"],"release_date":"2018-10-18","tags":["Kinesis","AWS"],"requirements":null,"signatures":null,"last_modified":1544550388000},{"name":"kafka-connect-gcs","version":"5.0.1","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-gcs/versions/5.0.1","title":"Kafka Connect GCS","description":"The GCS connector, currently available as a sink, allows you to export data from Kafka topics to GCS objects in either 
Avro or JSON formats.\n\nBeing a sink, the GCS connector periodically polls data from Kafka and in turn uploads it to GCS. A partitioner is used to split the data of every Kafka partition into chunks. Each chunk of data is represented as an GCS object, whose key name encodes the topic,\nthe Kafka partition and the start offset of this data chunk. If no partitioner is specified in the configuration, the default partitioner which preserves Kafka partitioning is used. The size of each data chunk is determined by the number of records written to GCS and by schema compatibility.\nIt is included in <a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a>, or can be downloaded and installed separately. It can be used for free for 30 days, but after that does require an Enterprise license. <a href=\"https://www.confluent.io/contact/\">Contact Confluent</a> for more details.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-gcs/versions/5.0.1/assets/googlecloud.png","documentation_url":"https://docs.confluent.io/current/connect/kafka-connect-gcs/index.html","source_url":null,"support":{"provider_name":"Confluent, Inc.","summary":"This connector is <a href=\"https://www.confluent.io/subscription/\">fully supported by Confluent</a> as part of a\n<a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a> subscription.","url":"https://docs.confluent.io/current/","logo":null},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"https://confluent.io/","logo":null},"archive":{"name":"confluentinc-kafka-connect-gcs-5.0.1.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-gcs/versions/5.0.1/confluentinc-kafka-connect-gcs-5.0.1.zip","mime_type":"application/zip","md5":"cc046ba700bcdd86ecfbdac753ec195a","sha1":"90d5c45d925c3b98503dfac517d2574a2c3758f0","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["sink"],"release_date":"2018-08-10","tags":["cloud","gcp","gcs","google","storage","platform"],"requirements":["GCS bucket with write permissions"],"signatures":null,"last_modified":1535074635000},{"name":"kafka-connect-vertica","version":"1.0.0-preview","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-vertica/versions/1.0.0-preview","title":"Kafka Connect Vertica","description":"Kafka Connect plugin for writing data to HPE Vertica.","logo":null,"documentation_url":"https://docs.confluent.io/","source_url":null,"support":{"provider_name":null,"summary":"This connector is not currently supported and is instead provided as a preview covered by the <a href=\"https://www.confluent.io/confluent-software-evaluation-license/\">Confluent Software Evaluation License</a>.","url":null,"logo":null},"owner":{"username":"confluentinc","type":null,"name":"Confluent, 
Inc.","url":null,"logo":null},"archive":{"name":"confluentinc-kafka-connect-vertica-1.0.0-preview.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-vertica/versions/1.0.0-preview/confluentinc-kafka-connect-vertica-1.0.0-preview.zip","mime_type":"application/zip","md5":"607e477e34cdbe7b5913fd366ecc4298","sha1":"3634d9bdbcd8e8c7ab5bfeabf706497af04c0088","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["sink"],"release_date":"2018-10-18","tags":["Vertica","Database"],"requirements":null,"signatures":null,"last_modified":1544550396000},{"name":"kafka-connect-datagen","version":"0.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-datagen/versions/0.1.0","title":"Kafka Connect Datagen","description":"For demos only: A Kafka Connect connector for generating mock data, not suitable for production","logo":null,"documentation_url":"https://github.com/confluentinc/kafka-connect-datagen/blob/master/README.md","source_url":"https://github.com/confluentinc/kafka-connect-datagen","support":{"provider_name":"Community Support","summary":"This connector is open source at https://github.com/confluentinc/kafka-connect-datagen and supported by community members.","url":"https://github.com/confluentinc/kafka-connect-datagen","logo":null},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"https://confluent.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-datagen/versions/0.1.0/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-datagen-0.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-datagen/versions/0.1.0/confluentinc-kafka-connect-datagen-0.1.0.zip","mime_type":"application/zip","md5":"99d1186196719fd953014c8122891560","sha1":"674d880bc61b6f203da367f8197996bfb9e04273","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"The Apache License, Version 2.0","url":"https://www.apache.org/licenses/LICENSE-2.0","logo":null}],"component_types":["source"],"release_date":"2018-10-30","tags":["datagen","generator","demo"],"requirements":["Confluent Platform 4.x or later","Apache Kafka 1.x or later"],"signatures":null,"last_modified":1540913007000},{"name":"kafka-connect-cassandra","version":"1.0.2","manifest_url":"https://api.hub.confluent.io/api/plugins/confluentinc/kafka-connect-cassandra/versions/1.0.2","title":"Kafka Connect Cassandra","description":"The Confluent Cassandra Sink Connector is used to move messages from Kafka into <a href=\"http://cassandra.apache.org/\">Apache Cassandra</a>.\nIt can be used for free for 30 days, but after that does require a Confluent Enterprise license. 
<a href=\"https://www.confluent.io/contact/\">Contact Confluent</a> for more details.","logo":null,"documentation_url":"https://docs.confluent.io/current/connect/kafka-connect-cassandra/","source_url":null,"support":{"provider_name":"Confluent, Inc.","summary":"This connector is <a href=\"https://www.confluent.io/subscription\">fully supported by\nConfluent</a> as of Confluent Platform 5.0.0, as part of a\n<a href=\"https://www.confluent.io/product/confluent-enterprise/\">Confluent Enterprise Platform</a> subscription.","url":"https://docs.confluent.io/connect/kafka-connect-cassandra/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-cassandra/versions/1.0.2/assets/confluent.png"},"owner":{"username":"confluentinc","type":"Organization","name":"Confluent, Inc.","url":"https://confluent.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-cassandra/versions/1.0.2/assets/confluent.png"},"archive":{"name":"confluentinc-kafka-connect-cassandra-1.0.2.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/confluentinc/kafka-connect-cassandra/versions/1.0.2/confluentinc-kafka-connect-cassandra-1.0.2.zip","mime_type":"application/zip","md5":"0cb6869d11de25a2b53c776929fbe1cd","sha1":"8ae38d0530ad50941d249c7e56bc4e626e21af42","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Confluent Software Evaluation License","url":"https://www.confluent.io/software-evaluation-license","logo":null}],"component_types":["sink"],"release_date":"2018-09-21","tags":["Cassandra","Database"],"requirements":null,"signatures":null,"last_modified":1537577561000},{"name":"kafka-connect-iothub","version":"0.6","manifest_url":"https://api.hub.confluent.io/api/plugins/microsoft/kafka-connect-iothub/versions/0.6","title":"Kafka Connect IoT Hub","description":"Kafka Connect Azure IoT Hub consists of 2 connectors - a source connector and a sink connector. The source connector is used to pump data from <a href=\"https://azure.microsoft.com/en-us/services/iot-hub/\">Azure IoT Hub</a> to <a href=\"https://kafka.apache.org/\">Apache Kafka</a>, whereas the sink connector reads messages from Kafka and sends them to IoT devices via <a href=\"https://azure.microsoft.com/en-us/services/iot-hub/\">Azure IoT Hub</a>. When used in tandem, the 2 connectors allow communicating with IoT devices by simply posting and reading messages to/from Kafka topics. 
This should make it easier for open source systems and other systems that already interface with Kafka to communicate with Azure IoT devices.\nFor more information on the capabilities of the connectors and how to use them, please refer to <a href=\"https://github.com/Azure/toketi-kafka-connect-iothub/blob/master/README_Source.md\">source connector</a> and <a href=\"https://github.com/Azure/toketi-kafka-connect-iothub/blob/master/README_Sink.md\">sink connector</a>.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/microsoft/kafka-connect-iothub/versions/0.6/assets/Azure.png","documentation_url":"https://github.com/Azure/toketi-kafka-connect-iothub/blob/master/README_Sink.md","source_url":"https://github.com/Azure/toketi-kafka-connect-iothub/","support":null,"owner":{"username":"microsoft","type":"Organization","name":"Microsoft Corporation","url":"https://microsoft.com","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/microsoft/kafka-connect-iothub/versions/0.6/assets/Microsoft.jpg"},"archive":{"name":"microsoft-kafka-connect-iothub-0.6.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/microsoft/kafka-connect-iothub/versions/0.6/microsoft-kafka-connect-iothub-0.6.zip","mime_type":"application/zip","md5":"74bffa5e38470706ca33376a2f7a4821","sha1":"28a1b64facf94834f3ceda62061994c9cfee05a1","asc":null},"docker_image":null,"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"MIT","url":"https://github.com/Azure/toketi-kafka-connect-iothub/blob/master/LICENSE","logo":null}],"component_types":["sink","source"],"release_date":null,"tags":["azure","iot","messaging"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"attunity-cdc","version":"6.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/attunity/attunity-cdc/versions/6.1.0","title":"Attunity Replicate","description":"Used by enterprises around the world, Attunity Replicate is a software solution that accelerates data replication, ingest, and streaming across a wide range of databases, data warehouses and data platforms. 
For more information go to:<a style=\"color:#4597cb\" href=\"www.attunity.com/confluent\">www.attunity.com/confluent</a>.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/attunity/attunity-cdc/versions/6.1.0/assets/logo.jpg","documentation_url":"https://discover.attunity.com/contact-us","source_url":null,"support":{"provider_name":"Atunity","summary":"For more information, go <a style=\"color:#4597cb\" href=\"https://discover.attunity.com/contact-us\">here</a>.","url":null,"logo":null},"owner":{"username":"attunity","type":"Organization","name":"Attunity ","url":"https://discover.attunity.com/contact-us","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/attunity/attunity-cdc/versions/6.1.0/assets/logo.jpg"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Proprietary","url":null,"logo":null}],"component_types":["source"],"release_date":null,"tags":["CDC"],"requirements":null,"signatures":null,"last_modified":1545244431000},{"name":"kafka-connect-hbase","version":"1.0.0","manifest_url":"https://api.hub.confluent.io/api/plugins/nishutayal/kafka-connect-hbase/versions/1.0.0","title":"Kafka Connect HBase Sink","description":"It's a basic Apache Kafka Connect SinkConnector which allows moving data from Kafka topics into HBase tables.","logo":null,"documentation_url":"https://github.com/nishutayal/kafka-connect-hbase/blob/master/README.md","source_url":"https://github.com/nishutayal/kafka-connect-hbase","support":{"provider_name":"Open Source Community","summary":"Support provided through community involvement.","url":"https://github.com/nishutayal/kafka-connect-hbase/issues","logo":null},"owner":{"username":"nishutayal","type":null,"name":"Nishu Tayal","url":null,"logo":null},"archive":{"name":"nishutayal-kafka-connect-hbase-1.0.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/nishutayal/kafka-connect-hbase/versions/1.0.0/nishutayal-kafka-connect-hbase-1.0.0.zip","mime_type":"application/zip","md5":"a654d2a9965b23755fe69697e5acf132","sha1":"3b97edf1b2e89632335ee6197916145c054eaa80","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache License 2.0","url":"http://www.apache.org/licenses/LICENSE-2.0.html","logo":null}],"component_types":["sink"],"release_date":"2018-10-16","tags":["hbase"],"requirements":null,"signatures":null,"last_modified":1539645525000},{"name":"couchbase-connector","version":"3.3.0","manifest_url":"https://api.hub.confluent.io/api/plugins/couchbase/couchbase-connector/versions/3.3.0","title":"Couchbase DB Connector","description":"kafka-connect-couchbase is a Kafka Connect plugin for transferring data between Couchbase Server and Kafka. 
It includes a \"source connector\" for publishing document change notifications from Couchbase to a Kafka topic, as well as a \"sink connector\" that subscribes to one or more Kafka topics and writes the messages to Couchbase.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/couchbase/couchbase-connector/versions/3.3.0/assets/CouchDB2.jpg","documentation_url":"https://developer.couchbase.com/documentation/server/current/connectors/kafka/kafka-intro.html","source_url":"https://github.com/couchbase/kafka-connect-couchbase","support":null,"owner":{"username":"couchbase","type":"Organization","name":"Couchbase","url":"https://www.couchbase.com/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/couchbase/couchbase-connector/versions/3.3.0/assets/CouchDB2.jpg"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Apache 2.0","url":null,"logo":null}],"component_types":["sink","source"],"release_date":null,"tags":["Database"],"requirements":null,"signatures":null,"last_modified":1535074642000},{"name":"kafka-connect-mongodb","version":"1.2.0","manifest_url":"https://api.hub.confluent.io/api/plugins/hpgrahsl/kafka-connect-mongodb/versions/1.2.0","title":"Kafka Connect MongoDB Sink","description":"It's a basic Apache Kafka Connect SinkConnector which allows moving data from Kafka topics into MongoDB collections.","logo":null,"documentation_url":"https://github.com/hpgrahsl/kafka-connect-mongodb/blob/master/README.md","source_url":"https://github.com/hpgrahsl/kafka-connect-mongodb","support":{"provider_name":"Open Source Community","summary":"Support provided through community involvement.","url":"https://github.com/hpgrahsl/kafka-connect-mongodb/issues","logo":null},"owner":{"username":"hpgrahsl","type":null,"name":"Hans-Peter Grahsl","url":"https://twitter.com/hpgrahsl","logo":null},"archive":{"name":"hpgrahsl-kafka-connect-mongodb-1.2.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/hpgrahsl/kafka-connect-mongodb/versions/1.2.0/hpgrahsl-kafka-connect-mongodb-1.2.0.zip","mime_type":"application/zip","md5":"e0691b9772b6e04f4e5210929907dff7","sha1":"e571e2f6596332206a0bd9abf8d594fbc4ec400d","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"The Apache License, Version 2.0","url":"https://www.apache.org/licenses/LICENSE-2.0","logo":null}],"component_types":["sink"],"release_date":"2018-10-20","tags":["mongo","giantideas","humongous","documents","json","mongodb","bson","nosql"],"requirements":null,"signatures":null,"last_modified":1540329219000},{"name":"kafka-connect-redis","version":"0.0.2.4","manifest_url":"https://api.hub.confluent.io/api/plugins/jcustenborder/kafka-connect-redis/versions/0.0.2.4","title":"Kafka Connect Redis","description":"A Kafka Connect connector receiving data from redis.","logo":null,"documentation_url":"https://jcustenborder.github.io/kafka-connect-documentation/","source_url":"https://github.com/jcustenborder/kafka-connect-redis","support":{"provider_name":null,"summary":"Support provided through community 
involvement.","url":"https://github.com/jcustenborder/kafka-connect-redis/issues","logo":null},"owner":{"username":"jcustenborder","type":null,"name":"Jeremy Custenborder","url":null,"logo":null},"archive":{"name":"jcustenborder-kafka-connect-redis-0.0.2.4.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/jcustenborder/kafka-connect-redis/versions/0.0.2.4/jcustenborder-kafka-connect-redis-0.0.2.4.zip","mime_type":"application/zip","md5":"5075950974093284693fcb16379acd8a","sha1":"91df4df8b5081a2efcd0d390a4f53b74c1492702","asc":null},"docker_image":{"namespace":"jcustenborder","name":"kafka-connect-docker","tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"The Apache License, Version 2.0","url":"https://www.apache.org/licenses/LICENSE-2.0","logo":null}],"component_types":["sink"],"release_date":"2018-10-21","tags":["Redis"],"requirements":null,"signatures":null,"last_modified":1540080096000},{"name":"kafka-connect-simulator","version":"0.1.119","manifest_url":"https://api.hub.confluent.io/api/plugins/jcustenborder/kafka-connect-simulator/versions/0.1.119","title":"Kafka Connect Simulator","description":"A Kafka Connect connector for generating test data.","logo":null,"documentation_url":"https://jcustenborder.github.io/kafka-connect-documentation/","source_url":"https://github.com/jcustenborder/kafka-connect-simulator","support":{"provider_name":null,"summary":"Support provided through community involvement.","url":"https://github.com/jcustenborder/kafka-connect-simulator/issues","logo":null},"owner":{"username":"jcustenborder","type":null,"name":"Jeremy Custenborder","url":null,"logo":null},"archive":{"name":"jcustenborder-kafka-connect-simulator-0.1.119.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/jcustenborder/kafka-connect-simulator/versions/0.1.119/jcustenborder-kafka-connect-simulator-0.1.119.zip","mime_type":"application/zip","md5":"65625bd5d72a51335e23e9b15ac8d20e","sha1":"0230a97a9cd3884061702176b54e2e2963b817b9","asc":null},"docker_image":{"namespace":"jcustenborder","name":"kafka-connect-docker","tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache License 2.0","url":"https:/github.com/jcustenborder/kafka-connect-simulator/LICENSE","logo":null}],"component_types":["sink","source"],"release_date":"2018-10-29","tags":["Simulator"],"requirements":null,"signatures":null,"last_modified":1540841987000},{"name":"kafka-connect-memcached","version":"0.1.0.9","manifest_url":"https://api.hub.confluent.io/api/plugins/jcustenborder/kafka-connect-memcached/versions/0.1.0.9","title":"Kafka Connect Memcached","description":"A Kafka Connect plugin for writing data to Memcached.","logo":null,"documentation_url":"https://jcustenborder.github.io/kafka-connect-documentation/","source_url":"https://github.com/jcustenborder/kafka-connect-memcached","support":{"provider_name":null,"summary":"Support provided through community involvement.","url":"https://github.com/jcustenborder/kafka-connect-memcached/issues","logo":null},"owner":{"username":"jcustenborder","type":null,"name":"Jeremy 
Custenborder","url":null,"logo":null},"archive":{"name":"jcustenborder-kafka-connect-memcached-0.1.0.9.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/jcustenborder/kafka-connect-memcached/versions/0.1.0.9/jcustenborder-kafka-connect-memcached-0.1.0.9.zip","mime_type":"application/zip","md5":"d4b30fbd14b59e6210505fdacdb1714c","sha1":"aa772db10d7a07212d4b3c55eaa1951e52f395bf","asc":null},"docker_image":{"namespace":"jcustenborder","name":"kafka-connect-docker","tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache License 2.0","url":"https://www.apache.org/licenses/LICENSE-2.0.txt","logo":null}],"component_types":["sink"],"release_date":"2018-10-31","tags":["cache","Memcache"],"requirements":null,"signatures":null,"last_modified":1540996674000},{"name":"kafka-connect-twitter","version":"0.2.32","manifest_url":"https://api.hub.confluent.io/api/plugins/jcustenborder/kafka-connect-twitter/versions/0.2.32","title":"Kafka Connect Twitter","description":"Kafka Connect plugin for streaming data from Twitter to Kafka.","logo":null,"documentation_url":"https://jcustenborder.github.io/kafka-connect-documentation/","source_url":"https://github.com/jcustenborder/kafka-connect-twitter","support":{"provider_name":null,"summary":"Support provided through community involvement.","url":"https://github.com/jcustenborder/kafka-connect-twitter/issues","logo":null},"owner":{"username":"jcustenborder","type":null,"name":"Jeremy Custenborder","url":null,"logo":null},"archive":{"name":"jcustenborder-kafka-connect-twitter-0.2.32.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/jcustenborder/kafka-connect-twitter/versions/0.2.32/jcustenborder-kafka-connect-twitter-0.2.32.zip","mime_type":"application/zip","md5":"6fce53599dca598ef935cae9d0b25f3b","sha1":"93bb8d2adaebf9ce78a49d28da972c330d92adb8","asc":null},"docker_image":{"namespace":"jcustenborder","name":"kafka-connect-docker","tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache License 2.0","url":"https:/github.com/jcustenborder/kafka-connect-twitter/LICENSE","logo":null}],"component_types":["source"],"release_date":"2018-11-06","tags":["Social","Twitter"],"requirements":null,"signatures":null,"last_modified":1541543572000},{"name":"kafka-connect-solr","version":"0.1.27","manifest_url":"https://api.hub.confluent.io/api/plugins/jcustenborder/kafka-connect-solr/versions/0.1.27","title":"Kafka Connect Solr","description":"A Kafka Connect connector copying data from Kafka to Solr.","logo":null,"documentation_url":"https://jcustenborder.github.io/kafka-connect-documentation/","source_url":"https://github.com/jcustenborder/kafka-connect-solr","support":{"provider_name":null,"summary":"Support provided through community involvement.","url":"https://github.com/jcustenborder/kafka-connect-solr/issues","logo":null},"owner":{"username":"jcustenborder","type":null,"name":"Jeremy 
Custenborder","url":null,"logo":null},"archive":{"name":"jcustenborder-kafka-connect-solr-0.1.27.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/jcustenborder/kafka-connect-solr/versions/0.1.27/jcustenborder-kafka-connect-solr-0.1.27.zip","mime_type":"application/zip","md5":"b516c7c4918716a3d55629e8e41a20df","sha1":"4124b5123dbd040b849c28d80917351e76cc0d77","asc":null},"docker_image":{"namespace":"jcustenborder","name":"kafka-connect-docker","tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"The Apache License, Version 2.0","url":"https://www.apache.org/licenses/LICENSE-2.0","logo":null}],"component_types":["sink"],"release_date":"2018-10-22","tags":["search","Apache Solr"],"requirements":null,"signatures":null,"last_modified":1540234610000},{"name":"kafka-connect-transform-common","version":"0.1.0.26","manifest_url":"https://api.hub.confluent.io/api/plugins/jcustenborder/kafka-connect-transform-common/versions/0.1.0.26","title":"Kafka Connect Common Transformations","description":"Common transformations for Kafka Connect.","logo":null,"documentation_url":"https://jcustenborder.github.io/kafka-connect-documentation/","source_url":"https://github.com/jcustenborder/kafka-connect-transform-common","support":{"provider_name":null,"summary":"Support provided through community involvement.","url":"https://github.com/jcustenborder/kafka-connect-transform-common/issues","logo":null},"owner":{"username":"jcustenborder","type":null,"name":"Jeremy Custenborder","url":null,"logo":null},"archive":{"name":"jcustenborder-kafka-connect-transform-common-0.1.0.26.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/jcustenborder/kafka-connect-transform-common/versions/0.1.0.26/jcustenborder-kafka-connect-transform-common-0.1.0.26.zip","mime_type":"application/zip","md5":"ff41ed0bd9285cec6c2b373cdb261799","sha1":"4b227dbb38fa21d289539bd311eeddd136a1ab55","asc":null},"docker_image":{"namespace":"jcustenborder","name":"kafka-connect-docker","tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache License 2.0","url":"https:/github.com/jcustenborder/kafka-connect-transform-common/LICENSE","logo":null}],"component_types":["source"],"release_date":"2018-10-24","tags":["Social","Twitter"],"requirements":null,"signatures":null,"last_modified":1540403240000},{"name":"kafka-connect-transform-xml","version":"0.1.0.12","manifest_url":"https://api.hub.confluent.io/api/plugins/jcustenborder/kafka-connect-transform-xml/versions/0.1.0.12","title":"Xml Transformation","description":"Kafka Connect transformation for handling Xml data based on a XSD. 
This transformation will convert text based Xml","logo":null,"documentation_url":"https://jcustenborder.github.io/kafka-connect-documentation/","source_url":"https://github.com/jcustenborder/kafka-connect-transform-xml","support":{"provider_name":null,"summary":"Support provided through community involvement.","url":"https://github.com/jcustenborder/kafka-connect-transform-xml/issues","logo":null},"owner":{"username":"jcustenborder","type":null,"name":"Jeremy Custenborder","url":null,"logo":null},"archive":{"name":"jcustenborder-kafka-connect-transform-xml-0.1.0.12.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/jcustenborder/kafka-connect-transform-xml/versions/0.1.0.12/jcustenborder-kafka-connect-transform-xml-0.1.0.12.zip","mime_type":"application/zip","md5":"01b4bf297d75eeb2efaba6951bc43f8b","sha1":"716fa42fe62d5bab589d6595c9bac10acec2beff","asc":null},"docker_image":{"namespace":"jcustenborder","name":"kafka-connect-docker","tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache License 2.0","url":"https://www.apache.org/licenses/LICENSE-2.0.txt","logo":null}],"component_types":["transform"],"release_date":"2018-10-19","tags":["Xml","Transform"],"requirements":null,"signatures":null,"last_modified":1539978996000},{"name":"kafka-connect-aws-lambda","version":"1.0-SNAPSHOT","manifest_url":"https://api.hub.confluent.io/api/plugins/llofberg/kafka-connect-aws-lambda/versions/1.0-SNAPSHOT","title":"Kafka Connect AWS Lambda Sink","description":"The AWS Lambda sink connector calls Lambda functions based on events in Kafka topics.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/llofberg/kafka-connect-aws-lambda/versions/1.0-SNAPSHOT/assets/AWS_Lambda_logo.jpg","documentation_url":"https://github.com/llofberg/kafka-connect-aws-lambda/blob/master/README.md","source_url":"https://github.com/llofberg/kafka-connect-aws-lambda","support":null,"owner":{"username":"llofberg","type":null,"name":"Lenny Lofberg","url":"https://github.com/llofberg/","logo":null},"archive":null,"docker_image":null,"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":null,"component_types":["sink"],"release_date":null,"tags":["Cloud","AWS","Lambda"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"xenon-connector","version":"1.0.0","manifest_url":"https://api.hub.confluent.io/api/plugins/levyx/xenon-connector/versions/1.0.0","title":"Levyx Xenon Connector","description":"Levyx offers a high-performance key value storage engine that enables I/O-intensive legacy and Big Data applications to operate Faster, Simpler, and 
Cheaper.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/levyx/xenon-connector/versions/1.0.0/assets/Xenon.jpg","documentation_url":"https://github.com/levyx/kafka-xenon","source_url":"https://github.com/levyx/kafka-xenon","support":null,"owner":{"username":"levyx","type":"Organization","name":"Levyx","url":"https://www.levyx.com/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/levyx/xenon-connector/versions/1.0.0/assets/Xenon.jpg"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Apache 2.0","url":null,"logo":null}],"component_types":["source"],"release_date":null,"tags":["Database"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"kafka-connect-irc","version":"5.0.0","manifest_url":"https://api.hub.confluent.io/api/plugins/cjmatta/kafka-connect-irc/versions/5.0.0","title":"Kafka Connect IRC","description":"A Kafka Connect source connector for Internet Relay Chat","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/cjmatta/kafka-connect-irc/versions/5.0.0/assets/irc-logo.png","documentation_url":"https://github.com/cjmatta/kafka-connect-irc/blob/master/README.md","source_url":null,"support":{"provider_name":null,"summary":null,"url":null,"logo":null},"owner":{"username":"cjmatta","type":"User","name":"Christopher Matta","url":"https://github.com/cjmatta","logo":null},"archive":{"name":"cjmatta-kafka-connect-irc-5.0.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/cjmatta/kafka-connect-irc/versions/5.0.0/cjmatta-kafka-connect-irc-5.0.0.zip","mime_type":"application/zip","md5":"8528b3ce7d97b857253705d0656d1dee","sha1":"6cd32389c5cc1448bffbf748ad1de65486d1ba1d","asc":"-----BEGIN PGP SIGNATURE-----\n\niQEzBAABCAAdFiEE30UApyr11ASqDaFFOdgcJ8+naHMFAlupOsMACgkQOdgcJ8+n\naHPUaQf/ftVKn/HFC8ggOMaC92q1qGyGZ5PTe/bm8W5LRZFY6YfbgAwokBbIZusW\ns11qbQkUfMAZcWk4S8GMGpo82Kw3LegACM02k/mK+/xF/aDPUFZTv3wTJfG2wqJA\nQdvvqqbMk0cJAbNPejuQpYsxlJBQ5OlaXTO8EiakVzFl7QfBxQ9xl/+C00fk4gGh\nfzA4DhnNXvXQ4UO+AWpckrTI/p+wpyTQdA48fF+tdlc9e9QhKJFNIsVqW+aiWgtn\nGPcu73jeU63n48ru+S1qWM6onqgDOYTQs4ZT3Ub7B5D0PdN13judPOxeKnb+HmsB\nsY1lbPIJquurB/mdkh5FNcV2W1BAsg==\n=lUb8\n-----END PGP SIGNATURE-----"},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"The Apache License, Version 2.0","url":"https://www.apache.org/licenses/LICENSE-2.0","logo":null}],"component_types":["source"],"release_date":"2018-09-24","tags":["chat","Internet Relay Chat","IRC"],"requirements":null,"signatures":null,"last_modified":1537821415000},{"name":"bkatwal-kafka-connect-solr-sink","version":"1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/bkatwal/bkatwal-kafka-connect-solr-sink/versions/1.0","title":"Solr Sink Connector","description":"Consumes plain schemaless JSON data out of Apache Kafka and writes to Apache Solr. 
Solr must be in <a href=\"https://lucene.apache.org/solr/guide/6_6/solrcloud.html\">SolrCloud mode</a>.","logo":null,"documentation_url":"https://github.com/bkatwal/kafka-solr-sink-connector/wiki/Kafka-connect-solr-sink-connector","source_url":"https://github.com/bkatwal/kafka-solr-sink-connector/","support":null,"owner":{"username":"bkatwal","type":"User","name":"Bikas Katwal.","url":"https://github.com/bkatwal","logo":null},"archive":{"name":"bkatwal-bkatwal-kafka-connect-solr-sink-1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/bkatwal/bkatwal-kafka-connect-solr-sink/versions/1.0/bkatwal-bkatwal-kafka-connect-solr-sink-1.0.zip","mime_type":"application/zip","md5":"39ee81cfa8062660547b8e5624a40d93","sha1":"dd28b98d818acc254c13d882c9e39dfaf2f7f4ff","asc":null},"docker_image":null,"confluent_verified":null,"features":{"supported_encodings":["JSON"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache License, Version 2.0","url":"http://www.apache.org/licenses/LICENSE-2.0","logo":null}],"component_types":["sink"],"release_date":null,"tags":["search","solr","solrcloud"],"requirements":["Solr 6.0 or later"],"signatures":null,"last_modified":1535074635000},{"name":"sap-hana","version":"1.0.0","manifest_url":"https://api.hub.confluent.io/api/plugins/sap/sap-hana/versions/1.0.0","title":"SAP Hana Connector","description":"Kafka Connect SAP is a set of connectors, using the Apache Kafka Connect framework for reliably connecting Kafka with SAP systems","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/sap/sap-hana/versions/1.0.0/assets/sap-hana-logo.jpg","documentation_url":"https://github.com/SAP/kafka-connect-sap","source_url":"https://github.com/SAP/kafka-connect-sap","support":null,"owner":{"username":"sap","type":"Organization","name":"SAP","url":"https://www.sap.com/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/sap/sap-hana/versions/1.0.0/assets/sap-hana-logo.jpg"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Apache 2.0","url":null,"logo":null}],"component_types":["sink","source"],"release_date":null,"tags":["Database"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"goldengate","version":"12.3.0","manifest_url":"https://api.hub.confluent.io/api/plugins/oracle/goldengate/versions/12.3.0","title":"Oracle GoldenGate","description":"Oracle GoldenGate is a comprehensive software package for real-time data integration and replication in heterogeneous IT environments. The product set enables high availability solutions, real-time data integration, transactional change data capture, data replication, transformations, and verification between operational and analytical enterprise systems. 
Oracle GoldenGate 12c brings extreme performance with simplified configuration and management, tighter integration with Oracle Database, support for cloud environments, expanded heterogeneity, and enhanced security.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/oracle/goldengate/versions/12.3.0/assets/oracle.jpg","documentation_url":"http://www.oracle.com/technetwork/middleware/goldengate/documentation/index.html","source_url":null,"support":null,"owner":{"username":"oracle","type":"Organization","name":"Oracle","url":"https://www.oracle.com","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/oracle/goldengate/versions/12.3.0/assets/oracle.jpg"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Proprietary","url":null,"logo":null}],"component_types":["source"],"release_date":null,"tags":["CDC"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"druid-kafka-indexing-service","version":"2.6.1","manifest_url":"https://api.hub.confluent.io/api/plugins/imply/druid-kafka-indexing-service/versions/2.6.1","title":"Druid Kafka indexing service","description":"The Apache Druid (incubating) Kafka indexing service offers exactly-once consumption guarantees from Kafka. The Kafka indexing service is a native Kafka consumer that streams data from Kafka topics and writes to Druid. This service is a part of core Druid and runs natively  as a part of Druid’s ingestion process.  Apache Druid (incubating) is a high-performance analytics data store to store and query large volumes of real-time, complex, event-driven, operational data.  
Imply, the original developer of the Kafka indexing service, is the enterprise version of Druid with additional capabilities for data loading, management, security, and visualization.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/imply/druid-kafka-indexing-service/versions/2.6.1/assets/logo.png","documentation_url":"https://docs.imply.io/on-premise/manage-data/ingestion-kafka","source_url":"https://github.com/druid-io/druid/tree/master/extensions-core/kafka-indexing-service","support":{"provider_name":null,"summary":"Support for the Druid Kafka indexing service for Imply is available from the Imply <a style=\"color:#4597cb\" href=\"https://groups.google.com/forum/#!forum/imply-user-group\">user group</a> or from Imply <a style=\"color:#4597cb\" href=\"https://imply.io/contact\">directly</a>.","url":null,"logo":null},"owner":{"username":"imply","type":"Organization","name":"Imply","url":"https://imply.io/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/imply/druid-kafka-indexing-service/versions/2.6.1/assets/logo.png"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Apache 2.0","url":"https://github.com/druid-io/druid/blob/master/LICENSE","logo":null}],"component_types":["sink"],"release_date":null,"tags":["Druid","Log","Search","Database","Analytics","Imply"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"hvr-cdc","version":"5.3.0","manifest_url":"https://api.hub.confluent.io/api/plugins/hvr/hvr-cdc/versions/5.3.0","title":"HVR Change Data Capture","description":"HVR is the Only Real-Time Data Integration Solution that Provides You the Necessary Replication Capabilities in a Single Unified Environment.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/hvr/hvr-cdc/versions/5.3.0/assets/hvr.png","documentation_url":"https://www.hvr-software.com/wiki/Main_Page","source_url":null,"support":null,"owner":{"username":"hvr","type":"Organization","name":"HVR","url":"https://www.hvr-software.com/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/hvr/hvr-cdc/versions/5.3.0/assets/hvr.jpg"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Proprietary","url":null,"logo":null}],"component_types":["source"],"release_date":null,"tags":["CDC"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"kafka-connect-bigquery","version":"1.1.0","manifest_url":"https://api.hub.confluent.io/api/plugins/wepay/kafka-connect-bigquery/versions/1.1.0","title":"BigQuery Sink Connector","description":"A sink connector for writing to Google BigQuery, with support for automatic table creation and schema 
evolution.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/wepay/kafka-connect-bigquery/versions/1.1.0/assets/BigQuery.png","documentation_url":"https://github.com/wepay/kafka-connect-bigquery/wiki","source_url":"https://github.com/wepay/kafka-connect-bigquery","support":null,"owner":{"username":"wepay","type":"Organization","name":"WePay","url":"https://go.wepay.com/","logo":null},"archive":{"name":"wepay-kafka-connect-bigquery-1.1.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/wepay/kafka-connect-bigquery/versions/1.1.0/wepay-kafka-connect-bigquery-1.1.0.zip","mime_type":"application/zip","md5":"11f4532760461fe8cf3d145b8ab4e22b","sha1":"4c9964682cb8a3a7c2c7cf32245a5dcaec9f674e","asc":null},"docker_image":null,"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache 2.0","url":"https://github.com/wepay/kafka-connect-bigquery/blob/master/LICENSE.md","logo":null}],"component_types":["sink"],"release_date":null,"tags":["cloud","analytics","data","google","bigquery","warehouse","platform","nosql"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"kafka-connect-telegram","version":"0.2.0","manifest_url":"https://api.hub.confluent.io/api/plugins/fbascheper/kafka-connect-telegram/versions/0.2.0","title":"Kafka Connect Telegram","description":"The Telegram connector allows moving data from Kafka to a Telegram chat. It\nwrites data from a topic in Kafka to the configured chat-id.\nIt's possible to send both text, photo and video messages to chats. For complex\nmodels you must use an Avro model, preferably\nin combination with the kafka-connect-avro-converter.","logo":null,"documentation_url":"https://github.com/fbascheper/kafka-connect-telegram","source_url":"https://github.com/fbascheper/kafka-connect-telegram","support":{"provider_name":null,"summary":null,"url":null,"logo":null},"owner":{"username":"fbascheper","type":"User","name":"Erik-Berndt Scheper.","url":"https://github.com/fbascheper","logo":null},"archive":{"name":"fbascheper-kafka-connect-telegram-0.2.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/fbascheper/kafka-connect-telegram/versions/0.2.0/fbascheper-kafka-connect-telegram-0.2.0.zip","mime_type":"application/zip","md5":"3f00fbc7c32e845cd3457d72734d84db","sha1":"44a5ecb8b732afbcaa545f628e974fef7539b68b","asc":"-----BEGIN PGP SIGNATURE-----\n\niQEzBAABCAAdFiEErjt/pkmjD8Yr+C8WcfCIeuOeSDcFAlvsNWEACgkQcfCIeuOe\nSDdCeAgAjWI32Q4gq3kiO4aQB9y7aun20MLaSEZULctw1Fn+KQ/6pJ2Hx0x8Qclp\nAKe5giICtX+HemRiUPOwwcsIlrzmra13aOpjX170KaMLqEsf4wvTK43I2zVYay0F\njRuApzNtNPEIDptqV48QdL6MuCf46u9a6kvKT0OIHO9KV1Ktnozg6soZtSGlGXML\n2r3S+0DFCg/YQUprXJFc/kSnekC/NeUofD1G8TNe8XJhA37Gt9BWzUeKymsw7lB8\nKX07wj7CcunPTfXzC4UOTtBdfFlXshV9rSIhByA753B6b18WWd2MJs4AwSCRIT8U\nKwryOPCbAjS90n80G7xSjS8NrJTF1g==\n=d8Dy\n-----END PGP SIGNATURE-----"},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"The Apache License, Version 2.0","url":"http://www.apache.org/licenses/LICENSE-2.0.txt","logo":null}],"component_types":["sink"],"release_date":"2018-11-14","tags":["Telegram","chat"],"requirements":["Telegram Bot 
API"],"signatures":null,"last_modified":1542227962000},{"name":"kafka-connect-protobuf-converter","version":"2.0.0","manifest_url":"https://api.hub.confluent.io/api/plugins/blueapron/kafka-connect-protobuf-converter/versions/2.0.0","title":"Kafka Connect Protobuf Converter","description":"Proto3 converter for Kafka Connect.","logo":null,"documentation_url":"https://github.com/blueapron/kafka-connect-protobuf-converter/blob/master/README.md","source_url":null,"support":{"provider_name":null,"summary":"Support provided through community involvement.","url":"https://github.com/blueapron/kafka-connect-protobuf-converter/issues","logo":null},"owner":{"username":"blueapron","type":"Organization","name":"Blue Apron, LLC.","url":"https://www.blueapron.com/","logo":null},"archive":{"name":"blueapron-kafka-connect-protobuf-converter-2.0.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/blueapron/kafka-connect-protobuf-converter/versions/2.0.0/blueapron-kafka-connect-protobuf-converter-2.0.0.zip","mime_type":"application/zip","md5":"84f20ba3bf5e32fb2c9d2cd6957219f5","sha1":"64b5ad03b5cf967eb773f49ac8ca16443f7e9667","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":null,"single_message_transforms":false,"confluent_control_center_integration":false,"kafka_connect_api":false,"delivery_guarantee":null},"license":[{"name":"MIT License","url":"http://www.opensource.org/licenses/mit-license.php","logo":null}],"component_types":["converter"],"release_date":"2018-05-23","tags":["protocol buffers","protobuf","proto3","converter"],"requirements":null,"signatures":null,"last_modified":1535074634000},{"name":"kinetica-connector","version":"6.2.0","manifest_url":"https://api.hub.confluent.io/api/plugins/kinetica/kinetica-connector/versions/6.2.0","title":"Kinetica DB Connector","description":"Kinetica is a distributed, in-memory database accelerated by GPUs that can simultaneously ingest, analyze, and visualize streaming data for truly real-time actionable intelligence.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/kinetica/kinetica-connector/versions/6.2.0/assets/logo-whitewhite-on-indigo_27H.png","documentation_url":"https://github.com/kineticadb/kinetica-connector-kafka","source_url":"https://github.com/kineticadb/kinetica-connector-kafka","support":null,"owner":{"username":"kinetica","type":"Organization","name":"Kinetica","url":"https://www.kinetica.com/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/kinetica/kinetica-connector/versions/6.2.0/assets/logo-whitewhite-on-indigo_27H.png"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"MIT","url":null,"logo":null}],"component_types":["sink","source"],"release_date":null,"tags":["Database"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"push-connector","version":"0.0.1","manifest_url":"https://api.hub.confluent.io/api/plugins/push/push-connector/versions/0.0.1","title":"Push Technologies Connector","description":"The Diffusion Kafka Adaptor is a Kafka Connect plugin for transferring data between Diffusion and Kafka. 
It includes a source connector for publishing real-time Diffusion topic updates to Kafka topics, as well as a sink connector that broadcasts messages from one or more Kafka topics to Diffusion topics. By using Diffusion, Kafka data can be easily published to large numbers of web or mobile clients over the internet at high throughput and low latency. The Diffusion Kafka Adaptor supports primitive values, arrays, maps and structs, as well as dynamic mapping between Diffusion and Kafka topic paths.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/push/push-connector/versions/0.0.1/assets/pushlogo.png","documentation_url":"https://github.com/pushtechnology/diffusion-kafka-connect ","source_url":"https://github.com/pushtechnology/diffusion-kafka-connect","support":null,"owner":{"username":"push","type":"Organization","name":"Push Technologies","url":"https://www.pushtechnology.com/","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/push/push-connector/versions/0.0.1/assets/pushlogo.png"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Apache 2.0","url":null,"logo":null}],"component_types":["sink","source"],"release_date":null,"tags":["Consumer","Diffusion"],"requirements":null,"signatures":null,"last_modified":1543615033000},{"name":"kafka-connect-http","version":"1.0.0","manifest_url":"https://api.hub.confluent.io/api/plugins/thomaskwscott/kafka-connect-http/versions/1.0.0","title":"Kafka Connect HTTP","description":"The HTTP sink connector allows you to export data from Kafka topics to any HTTP API.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/thomaskwscott/kafka-connect-http/versions/1.0.0/assets/http.jpg","documentation_url":"https://thomaskwscott.github.io/kafka-connect-http/","source_url":"https://github.com/thomaskwscott/kafka-connect-http","support":{"provider_name":"Thomas Scott","summary":"Support is provided as best effort only. 
Please register any issues in the github project.","url":"https://github.com/thomaskwscott/kafka-connect-http/issues","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/thomaskwscott/kafka-connect-http/versions/1.0.0/assets/thomaskwscott.png"},"owner":{"username":"thomaskwscott","type":"User","name":"Thomas Scott","url":"https://github.com/thomaskwscott","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/thomaskwscott/kafka-connect-http/versions/1.0.0/assets/thomaskwscott.png"},"archive":{"name":"thomaskwscott-kafka-connect-http-1.0.0.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/thomaskwscott/kafka-connect-http/versions/1.0.0/thomaskwscott-kafka-connect-http-1.0.0.zip","mime_type":"application/zip","md5":"cf8c10d8b7c27f9cd16e918eff6f5bad","sha1":"661a0aafa8fcba3054d41904384ce62b15057bd0","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["any"],"single_message_transforms":true,"confluent_control_center_integration":true,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache License 2.0","url":"http://www.apache.org/licenses/LICENSE-2.0.html","logo":null}],"component_types":["sink"],"release_date":"2018-07-25","tags":["rest","http"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"greenplum-integration","version":"5.13","manifest_url":"https://api.hub.confluent.io/api/plugins/pivotal/greenplum-integration/versions/5.13","title":"Pivotal Greenplum Integration","description":"Pivotal Greenplum Database is a massively parallel processing database server specially designed to manage large scale analytic data warehouses and business intelligence workloads. The Pivotal Greenplum-Kafka Connector provides high speed, parallel data transfer from Apache Kafka to Greenplum Database to support a streaming ETL pipeline.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/pivotal/greenplum-integration/versions/5.13/assets/Pivotal-Greenplum-Logo-FullColor.png","documentation_url":"https://gpdb.docs.pivotal.io/5130/greenplum-kafka/intro.html","source_url":null,"support":null,"owner":{"username":"pivotal","type":"Organization","name":"Pivotal","url":"https://pivotal.io/pivotal-greenplum","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/pivotal/greenplum-integration/versions/5.13/assets/Pivotal-Greenplum-Logo-FullColor.png"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Proprietary","url":null,"logo":null}],"component_types":["sink"],"release_date":null,"tags":["Database"],"requirements":null,"signatures":null,"last_modified":1543566276000},{"name":"data-replication","version":"11.4.0","manifest_url":"https://api.hub.confluent.io/api/plugins/ibm/data-replication/versions/11.4.0","title":"IBM Data Replication","description":"Data replication is a solution that provides trusted data integration and synchronization to enable you to efficiently manage data growth. 
It empowers your event-driven business by enriching big data systems and mobile applications through the use of real-time information, even capturing data that is constantly changing.","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/ibm/data-replication/versions/11.4.0/assets/IBM.jpg","documentation_url":"https://www.ibm.com/support/knowledgecenter/en/SSTRGZ_11.4.0/com.ibm.idr.frontend.doc/pv_welcome.html","source_url":null,"support":null,"owner":{"username":"ibm","type":"Organization","name":"IBM","url":"https://www.ibm.com","logo":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/ibm/data-replication/versions/11.4.0/assets/IBM.jpg"},"archive":null,"docker_image":null,"confluent_verified":{"level":"standard"},"features":{"supported_encodings":["any"],"single_message_transforms":null,"confluent_control_center_integration":null,"kafka_connect_api":null,"delivery_guarantee":null},"license":[{"name":"Proprietary","url":null,"logo":null}],"component_types":["source"],"release_date":null,"tags":["CDC"],"requirements":null,"signatures":null,"last_modified":1535074643000},{"name":"kafka-connect-phoenix","version":"0.1","manifest_url":"https://api.hub.confluent.io/api/plugins/dhananjaypatkar/kafka-connect-phoenix/versions/0.1","title":"Kafka Connect Phoenix","description":"Kafka connect Sink Connector for Apache Phoenix [SQL layer on top of HBase]","logo":null,"documentation_url":"https://github.com/dhananjaypatkar/kafka-connect-phoenix/wiki/Kafka-Connect-Phoenix","source_url":"https://github.com/dhananjaypatkar/kafka-connect-phoenix","support":{"provider_name":null,"summary":null,"url":null,"logo":null},"owner":{"username":"dhananjaypatkar","type":"User","name":"Dhananjay Patkar","url":null,"logo":null},"archive":{"name":"dhananjaypatkar-kafka-connect-phoenix-0.1.zip","url":"https://d1i4a15mxbxib1.cloudfront.net/api/plugins/dhananjaypatkar/kafka-connect-phoenix/versions/0.1/dhananjaypatkar-kafka-connect-phoenix-0.1.zip","mime_type":"application/zip","md5":"ca227c9d044ccd7f3135376d32d3ccce","sha1":"34023c5dc2977ea769f1f9ecd9201080b3605e4e","asc":null},"docker_image":{"namespace":null,"name":null,"tag":null,"registry":null},"confluent_verified":null,"features":{"supported_encodings":["JSON"],"single_message_transforms":true,"confluent_control_center_integration":false,"kafka_connect_api":true,"delivery_guarantee":null},"license":[{"name":"Apache License, Version 2.0","url":"https://www.apache.org/licenses/LICENSE-2.0.txt","logo":null}],"component_types":["sink"],"release_date":"2018-07-24","tags":["Kafka Connect Sink","Big Data","Phoenix","Apache","HBase"],"requirements":["Apache Phoenix 4.7+"],"signatures":null,"last_modified":1535074643000}]

@ybyzek
Contributor

ybyzek commented Dec 19, 2018

@mstolin

  1. No output is shown above. Was there any?
  2. What type of machine/OS do you have?
  3. What is the output of docker version?
  4. Does this command work for you? docker-compose build --no-cache connect (caching disabled)
For reference, here is what I get on the 5.0.1-post branch:

cp-all-in-one(5.0.1-post): docker-compose build --no-cache connect
Building connect
Downloading context: https://github.com/confluentinc/kafka-connect-datagen/raw/master/Dockerfile-confluenthub     778B
Step 1/3 : FROM confluentinc/cp-kafka-connect:5.0.0
 ---> 7df8759460f7
Step 2/3 : ENV CONNECT_PLUGIN_PATH="/usr/share/java,/usr/share/confluent-hub-components"
 ---> Running in 231cd9016b9f
Removing intermediate container 231cd9016b9f
 ---> bac74ec192f8
Step 3/3 : RUN  confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.1.0
 ---> Running in b6ba9e31bb96
Running in a "--no-prompt" mode 
Implicit acceptance of the license below:  
The Apache License, Version 2.0 
https://www.apache.org/licenses/LICENSE-2.0 
Downloading component Kafka Connect Datagen 0.1.0, provided by Confluent, Inc. from Confluent Hub and installing into /usr/share/confluent-hub-components 
Adding installation directory to plugin path in the following files: 
  /etc/kafka/connect-distributed.properties 
  /etc/kafka/connect-standalone.properties 
  /etc/schema-registry/connect-avro-distributed.properties 
  /etc/schema-registry/connect-avro-standalone.properties 
 
Completed 
Removing intermediate container b6ba9e31bb96
 ---> 42306358551a
Successfully built 42306358551a
Successfully tagged confluentinc/kafka-connect-datagen:0.1.0

@mstolin
Author

mstolin commented Dec 19, 2018

@ybyzek

  1. What do you mean? There is only the JSON output I posted for the curl command.
  2. I am running on Ubuntu 18.04.1 LTS.
  3. The output of docker version:
Client:
 Version:           18.06.1-ce
 API version:       1.38
 Go version:        go1.10.4
 Git commit:        e68fc7a
 Built:             Fri Oct 19 19:43:14 2018
 OS/Arch:           linux/amd64
 Experimental:      false

Server:
 Engine:
  Version:          18.06.1-ce
  API version:      1.38 (minimum version 1.12)
  Go version:       go1.10.4
  Git commit:       e68fc7a
  Built:            Thu Sep 27 02:39:50 2018
  OS/Arch:          linux/amd64
  Experimental:     false
  4. No, same error message.

@ybyzek
Contributor

ybyzek commented Dec 20, 2018

If you're willing, here is one more suggestion to try: instead of building the connector image locally, run the CP connect image and then manually install the connector.

  1. Remove these lines at: https://github.com/confluentinc/cp-docker-images/blob/5.1.0-post/examples/cp-all-in-one/docker-compose.yml#L51-L54
    image: confluentinc/kafka-connect-datagen:0.1.0
    build:
      context: https://github.com/confluentinc/kafka-connect-datagen/raw/master/Dockerfile-confluenthub
      dockerfile: Dockerfile-confluenthub

and replace them with the following line (a fuller sketch of the resulting service definition follows these steps):

    image: confluentinc/cp-kafka-connect:5.0.0
  2. Bring up the environment: docker-compose up -d

  3. Run the install command from within the connect container: docker-compose exec connect confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.1.0
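For reference, a minimal sketch of what the edited connect service could look like afterwards (only the image line comes from the steps above; the service name and the remaining keys are assumptions based on the cp-all-in-one docker-compose.yml and may differ in your copy):

    connect:
      image: confluentinc/cp-kafka-connect:5.0.0
      hostname: connect
      depends_on:
        - broker
        - schema-registry
      ports:
        - "8083:8083"
      environment:
        # the manual confluent-hub install in step 3 lands in this directory,
        # so the worker's plugin path must include it
        CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components"

Separately, since the original failure was java.net.UnknownHostException during docker build, it may be worth checking DNS resolution from inside Docker; on Linux, setting "dns" in /etc/docker/daemon.json and restarting the daemon is a common general fix, though I haven't verified it applies here.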

@hpinsley

hpinsley commented Jan 3, 2019

@ybyzek I'm following your pointer on #646 to come here and try the workaround above. After editing the docker-compose.yml file as suggested, here are the results:

C:\dev\ChangeNotificationService\Kafka [spike/AddNewDockerContainers +0 ~1 -0 !]> docker-compose up -d
Creating network "kafka_default" with the default driver
Creating zookeeper ... done
Creating broker    ... done
Creating schema-registry ... done
Creating connect         ... done
Creating rest-proxy      ... done
Creating ksql-server     ... done
Creating control-center  ... done
Creating ksql-cli        ... done
Creating ksql-datagen    ... done
C:\dev\ChangeNotificationService\Kafka [spike/AddNewDockerContainers +0 ~1 -0 !]> docker-compose ps
     Name                    Command               State                        Ports
-----------------------------------------------------------------------------------------------------------
broker            /etc/confluent/docker/run        Up      0.0.0.0:29092->29092/tcp, 0.0.0.0:9092->9092/tcp
connect           /etc/confluent/docker/run        Up      0.0.0.0:8083->8083/tcp, 9092/tcp
control-center    /etc/confluent/docker/run        Up      0.0.0.0:9021->9021/tcp
ksql-cli          /bin/sh                          Up
ksql-datagen      bash -c echo Waiting for K ...   Up
ksql-server       /etc/confluent/docker/run        Up      0.0.0.0:8088->8088/tcp
rest-proxy        /etc/confluent/docker/run        Up      0.0.0.0:8082->8082/tcp
schema-registry   /etc/confluent/docker/run        Up      0.0.0.0:8081->8081/tcp
zookeeper         /etc/confluent/docker/run        Up      0.0.0.0:2181->2181/tcp, 2888/tcp, 3888/tcp
C:\dev\ChangeNotificationService\Kafka [spike/AddNewDockerContainers +0 ~1 -0 !]> docker-compose exec connect confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.1.0
Running in a "--no-prompt" mode
Unable to verify Confluent Hub's identity

Error: Security issues
C:\dev\ChangeNotificationService\Kafka [spike/AddNewDockerContainers +0 ~1 -0 !]>

Also, the ksql-datagen container failed to start. The tail of the log contains:

ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 could not be established. Broker may not be available.
ksql-datagen       | Waiting for Confluent Schema Registry to be ready...
ksql-datagen       | Waiting a few seconds for topic creation to finish...
ksql-datagen       | cp: cannot stat ‘/usr/share/java/monitoring-interceptors/monitoring-interceptors-5.0.1.jar’: No such file or directory

I'm not sure if it is related to the first error.

@ybyzek
Contributor

ybyzek commented Jan 3, 2019

Thanks @hpinsley. The error from the ksql-datagen container should be resolved if you use the latest branch, 5.1.0-post. Updating to that branch should make the second error go away, but I don't think it will resolve the first and primary issue of connecting to Confluent Hub.
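If it helps, switching branches is just (a sketch, assuming a local clone of this repo):

    git fetch
    git checkout 5.1.0-post
    docker-compose down
    docker-compose up -d

As for the Hub issue: "Unable to verify Confluent Hub's identity" reads like a TLS verification failure between the container and api.hub.confluent.io (a proxy or missing CA certificates, perhaps). One way to narrow it down, assuming curl is available in the image, is:

    docker-compose exec connect curl -v https://api.hub.confluent.io/api/plugins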

@hpinsley

hpinsley commented Jan 3, 2019

I'll try that. Does it matter that your solution for getting connect to come up uses an older image? My main objective is to try out KSQL and KTables.

@ybyzek
Contributor

ybyzek commented Jan 3, 2019

@hpinsley : by "older image", I believe you mean this line, correct?

image: confluentinc/cp-kafka-connect:5.0.0

It shouldn't matter, but to rule it out as a possibility, you could try

image: confluentinc/cp-kafka-connect:5.1.0
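
For reference, a sketch of how that substitution might look in docker-compose.yml; other service keys are elided, and the plugin-path variable shown is the one from the Dockerfile step above:

  connect:
    image: confluentinc/cp-kafka-connect:5.1.0
    # replaces the build: section that ran confluent-hub at image-build time
    environment:
      CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components"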

@hpinsley

hpinsley commented Jan 3, 2019

I updated to the latest version, applied your edits, and get similar results with respect to ksql-datagen...

ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 could not be established. Broker may not be available.
ksql-datagen       | Waiting for Confluent Schema Registry to be ready...
ksql-datagen       | Waiting a few seconds for topic creation to finish...
ksql-datagen       | cp: cannot stat ‘/usr/share/java/monitoring-interceptors/monitoring-interceptors-5.0.1.jar’: No such file or directory

This is what the state is:

C:\dev\ChangeNotificationService\Kafka [spike/TryVersion5.1.0-post +0 ~1 -0 !]> docker-compose ps
     Name                    Command               State                         Ports
------------------------------------------------------------------------------------------------------------
broker            /etc/confluent/docker/run        Up       0.0.0.0:29092->29092/tcp, 0.0.0.0:9092->9092/tcp
connect           /etc/confluent/docker/run        Up       0.0.0.0:8083->8083/tcp, 9092/tcp
control-center    /etc/confluent/docker/run        Up       0.0.0.0:9021->9021/tcp
ksql-cli          /bin/sh                          Up
ksql-datagen      bash -c echo Waiting for K ...   Exit 1
ksql-server       /etc/confluent/docker/run        Up       0.0.0.0:8088->8088/tcp
rest-proxy        /etc/confluent/docker/run        Up       0.0.0.0:8082->8082/tcp
schema-registry   /etc/confluent/docker/run        Up       0.0.0.0:8081->8081/tcp
zookeeper         /etc/confluent/docker/run        Up       0.0.0.0:2181->2181/tcp, 2888/tcp, 3888/tcp

Here is a docker ps so you can see the image versions I'm running, to verify I didn't make a mistake:

CONTAINER ID        IMAGE                                             COMMAND                  CREATED             STATUS              PORTS                                              NAMES
b5f9f4353234        confluentinc/cp-enterprise-control-center:5.0.1   "/etc/confluent/dock…"   5 minutes ago       Up 5 minutes        0.0.0.0:9021->9021/tcp                             control-center
a3914a772d43        confluentinc/cp-ksql-cli:5.0.1                    "/bin/sh"                5 minutes ago       Up 5 minutes                                                           ksql-cli
3926710835be        confluentinc/cp-ksql-server:5.0.1                 "/etc/confluent/dock…"   5 minutes ago       Up 5 minutes        0.0.0.0:8088->8088/tcp                             ksql-server
2b250a37323c        confluentinc/cp-kafka-rest:latest                 "/etc/confluent/dock…"   5 minutes ago       Up 5 minutes        0.0.0.0:8082->8082/tcp                             rest-proxy
efaf72570879        confluentinc/cp-kafka-connect:5.1.0               "/etc/confluent/dock…"   5 minutes ago       Up 5 minutes        0.0.0.0:8083->8083/tcp, 9092/tcp                   connect
500c33303782        confluentinc/cp-schema-registry:5.0.1             "/etc/confluent/dock…"   5 minutes ago       Up 5 minutes        0.0.0.0:8081->8081/tcp                             schema-registry
33d8255dfb68        confluentinc/cp-enterprise-kafka:5.0.1            "/etc/confluent/dock…"   5 minutes ago       Up 5 minutes        0.0.0.0:9092->9092/tcp, 0.0.0.0:29092->29092/tcp   broker
58f7cf6b82f0        confluentinc/cp-zookeeper:5.0.1                   "/etc/confluent/dock…"   5 minutes ago       Up 5 minutes        2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp         zookeeper

And the install command failed as you predicted, although now with a slightly different error:

C:\dev\ChangeNotificationService\Kafka [spike/TryVersion5.1.0-post +0 ~1 -0 !]> docker-compose exec connect confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.1.0
Running in a "--no-prompt" mode
javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake

Error: Unknown error

Is ksql-datagen a required component to try out KSQL and KTables? Thanks for your help, btw.

@ybyzek
Contributor

ybyzek commented Jan 3, 2019

  1. ksql-datagen and kafka-connect-datagen are NOT required to try out KSQL, but they are helpful to produce data into Kafka topics, so that when you use KSQL, you will see actual data streaming in. However, if you have another way to produce Kafka data, then you can ignore these errors and just generate your own data.

But, if you are relying on ksql-datagen or kafka-connect-datagen to generate data, then we need to continue troubleshooting:

  1. If you are using the branch 5.1.0-post, there should be no error about monitoring-interceptors-5.0.1.jar (note that the error is referencing version 5.0.1, while it should be using version 5.1.0). See the source here: https://github.com/confluentinc/cp-docker-images/blob/5.1.0-post/examples/cp-all-in-one/docker-compose.yml#L162. So that suggests you are not using the latest 5.1.0-post branch. Can you double check, perhaps try something like:
git fetch
git checkout 5.1.0-post
git pull
  2. Regarding the Confluent Hub connectivity error on javax.net.ssl.SSLHandshakeException, I wonder if this is a problem with Windows (which is not supported by the Confluent Hub client). If you have access, perhaps try on a Mac or Linux machine.

@hpinsley

hpinsley commented Jan 3, 2019

Argh, you're right. I transposed the version and pulled 5.0.1-post! I've pulled the right one now. For the record, the tail of the ksql-datagen log now shows:

ksql-datagen       | [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version : 2.1.0-cp1                                                                                                                                                 
ksql-datagen       | [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : f8d14150885e912e                                                                                                                                         
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | [kafka-admin-client-thread | adminclient-1] WARN org.apache.kafka.clients.NetworkClient - [AdminClient clientId=adminclient-1] Connection to node -1 (broker/172.22.0.3:9092) could not be established. Broker may not be available.
ksql-datagen       | Waiting for Confluent Schema Registry to be ready...                                                                                                                                                                                
ksql-datagen       | Waiting a few seconds for topic creation to finish...                                                                                                                                                                               
ksql-datagen       | cp: cannot stat '/usr/share/java/monitoring-interceptors/monitoring-interceptors-5.1.0.jar': No such file or directory

At least now it says 5.1.0.jar ;-)

Still have the problem with the install command you gave me. I am indeed on Windows, but I'm running everything from the Linux containers within Docker for Windows. Perhaps it is a proxy issue?

C:\dev\ChangeNotificationService\Kafka [spike/TryVersion5.1.0-post]> docker-compose exec connect confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.1.0
Running in a "--no-prompt" mode
javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake

Error: Unknown error
C:\dev\ChangeNotificationService\Kafka [spike/TryVersion5.1.0-post]>


@ybyzek
Contributor

ybyzek commented Jan 4, 2019

@hpinsley , thank you for reporting the issue regarding the ksql-datagen container. I have filed #664 to address this issue (I know what the problem is). As a temporary workaround, delete this line from your docker-compose.yml file: https://github.com/confluentinc/cp-docker-images/blob/5.1.0-post/examples/cp-all-in-one/docker-compose.yml#L162
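
If you are not sure which line that is, a quick way to locate it in your checkout (paths as in the cp-all-in-one example used in this thread):

cd cp-docker-images/examples/cp-all-in-one
grep -n monitoring-interceptors docker-compose.yml
# delete (or comment out) the matching cp line in the ksql-datagen command block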

As for the issue with connecting to Confluent Hub, my colleague has a similar environment to yours (Ubuntu 18.04 LTS on Windows 10, using Docker version 18.09.0, build 4d60db4) and it works for him.

@ybyzek
Contributor

ybyzek commented Jan 4, 2019

@hpinsley, I have resolved #664 . If you refresh your 5.1.0-post branch (e.g. git fetch ; git pull), you can grab the changes that fix the problem with ksql-datagen container not starting.

(note: this will NOT fix the issue with connecting to Confluent Hub)

@ybyzek
Contributor

ybyzek commented Jan 4, 2019

You may also download the connector directly from Confluent Hub: https://www.confluent.io/connector/kafka-connect-datagen/
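
If you take that route, a rough sketch of installing the download by hand; the archive name below is hypothetical, since the exact filename comes from the download page:

# copy the downloaded archive into the running connect container
docker cp confluentinc-kafka-connect-datagen-0.1.0.zip connect:/tmp/
# unpack it into a directory on CONNECT_PLUGIN_PATH (requires unzip in the image;
# otherwise unpack on the host and docker cp the extracted directory instead)
docker-compose exec connect unzip /tmp/confluentinc-kafka-connect-datagen-0.1.0.zip -d /usr/share/confluent-hub-components
docker-compose restart connect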

@Zannith

Zannith commented Apr 3, 2019

Has there been any change regarding the failure to connect? I ran into this issue while following the quickstart guide on Confluent. I am using version 5.2.1-post on an Ubuntu 16.04.5 VM and run into an issue when performing docker-compose up -d --build:

Step 3/3 : RUN confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest
 ---> Running in faf4d4eeb2ef
Running in a "--no-prompt" mode 
java.net.NoRouteToHostException: No route to host (Host unreachable) 
 
Error: Unknown error 
ERROR: Service 'connect' failed to build: The command '/bin/sh -c confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest' returned a non-zero code: 7

@arslanakhtar61

Hi. I am having the same problem. I am trying to run it on Docker for Windows.

$ docker-compose up -d
Building connect
Step 1/3 : FROM confluentinc/cp-kafka-connect:5.2.1
 ---> 4fbfbb11e4bf
Step 2/3 : ENV CONNECT_PLUGIN_PATH="/usr/share/java,/usr/share/confluent-hub-components"
 ---> Using cache
 ---> 9f80258fc352
Step 3/3 : RUN confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest
 ---> Running in 8bad93795133
Running in a "--no-prompt" mode
java.net.UnknownHostException: api.hub.confluent.io

Error: Unknown error
Service 'connect' failed to build: The command '/bin/sh -c confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest' returned a non-zero code: 7

@sadehart

For what it's worth, I've run into a similar issue on macOS 10.14.3.

The issue appears to be directly related to attempting to install datagen.

confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest
Running in a "--no-prompt" mode
javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

@sadehart

So a weird series of events just happened. As suggested, I manually downloaded the zip file into my downloads folder using https://www.confluent.io/connector/kafka-connect-datagen/#download. I then made a new folder in confluent-5.2.1/share called "confluent-hub-components". Then, just for fun, I tried rerunning confluent-hub install confluentinc/kafka-connect-datagen:0.1.2 and it worked fine. After that ran successfully, docker-compose up -d --build ran without error.

I don't know if it's an issue with the folder not being there, or if by manually downloading the archive I generated a valid certificate for the CLI to piggyback off of, but hopefully this provides some helpful info for resolvers (or at least a viable workaround).

@mattdajacob

I tried @sadehart's workaround above, but it did not resolve the error when trying to install kafka-connect-datagen using confluent-hub. Any other suggestions?

@ckurdekar

You may also download the connector directly from Confluent Hub: https://www.confluent.io/connector/kafka-connect-datagen/

@ybyzek Hey, I am facing the same issue:

docker-compose exec connect confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.1.0
Running in a "--no-prompt" mode
Unable to verify Confluent Hub's identity

Error: Security issues

Is it open source and allowed for use with the Community edition, or is it allowed for the Enterprise edition only?

@ybyzek
Contributor

ybyzek commented Apr 29, 2019

@ckurdekar -- what's the machine/OS version?

@bdoyle807

Hi, I'm on macOS Mojave 10.14.5 with:
openjdk version "1.8.0_212"
OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_212-b03)
OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.212-b03, mixed mode)

git clone https://github.com/confluentinc/cp-docker-images
cd cp-docker-images/
git checkout 5.2.1-post
cd examples/cp-all-in-one
docker-compose up -d --build

I get

Creating network "cp-all-in-one_default" with the default driver
Building connect
Step 1/3 : FROM confluentinc/cp-kafka-connect:5.2.1
---> 4fbfbb11e4bf
Step 2/3 : ENV CONNECT_PLUGIN_PATH="/usr/share/java,/usr/share/confluent-hub-components"
---> Using cache
---> 7210b0698451
Step 3/3 : RUN confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest
---> Running in d83da72bd41d
Running in a "--no-prompt" mode
javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

Error: Unknown error
ERROR: Service 'connect' failed to build: The command '/bin/sh -c confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest' returned a non-zero code: 7

Thoughts? I tried the modifications to docker-compose.yml for lines 51-54, but that didn't work.

Any ideas?

@ybyzek
Contributor

ybyzek commented Jun 5, 2019

@bdoyle807 to isolate the issue, one suggestion is to try running the confluent-hub client outside of Docker. To get the confluent-hub CLI, install it following instructions here: https://docs.confluent.io/current/connect/managing/confluent-hub/client.html#installing-c-hub-client

Once you download the CLI, you can try the command below from your host machine to see if it generates the same error:

confluent-hub install confluentinc/kafka-connect-datagen:latest --component-dir .

Would you be willing to try this?

@codemayq

codemayq commented Jun 6, 2019

Hi, I am on CentOS Linux release 7.4.1708, with Java openjdk version "1.8.0_181".
The docker-compose file is the latest one.

After running docker-compose up -d --build in examples/cp-all-in-one, I get the following error:

Step 3/3 : RUN confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest
 ---> Running in 61ba06725205
Running in a "--no-prompt" mode
Implicit acceptance of the license below:
Apache License 2.0
https://www.apache.org/licenses/LICENSE-2.0
Downloading component Kafka Connect Datagen 0.1.3, provided by Confluent, Inc. from Confluent Hub and installing into /usr/share/confluent-hub-components
javax.net.ssl.SSLException: Received close_notify during handshake

Error: Unknown error
ERROR: Service 'connect' failed to build: The command '/bin/sh -c confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest' returned a non-zero code: 7

I have tried many times.

I also tried confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest directly, but got the same problem.

@ybyzek
Contributor

ybyzek commented Jun 6, 2019

@codemayq can you try the suggestion in #654 (comment) to determine whether the issue is isolated to Docker?

@bdoyle807

@ybyzek The SSL issue appears to be gone, but now I get this:
confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest
Running in a "--no-prompt" mode
Unable to detect Confluent Platform installation. Specify --component-dir and --worker-configs explicitly.

Error: Invalid options or arguments

@ybyzek
Contributor

ybyzek commented Jun 6, 2019

@bdoyle807, it appears we need to specify some directories. I have updated the comment above and am copying it here; this is what should be run (note the addition of --component-dir .):

confluent-hub install confluentinc/kafka-connect-datagen:latest --component-dir .

It will still error out; however, after running it, you should see the downloaded component:

/tmp: confluent-hub install confluentinc/kafka-connect-datagen:latest --component-dir .
 
Component's license: 
Apache License 2.0 
https://www.apache.org/licenses/LICENSE-2.0 
I agree to the software license agreement (yN) y

Downloading component Kafka Connect Datagen 0.1.3, provided by Confluent, Inc. from Confluent Hub and installing into . 
Unable to detect Confluent Platform installation. Specify --component-dir and --worker-configs explicitly. 
 
Error: Invalid options or arguments 


/tmp: ls confluentinc-kafka-connect-datagen 
total 4
drwxr-xr-x  7 yeva wheel  224 Jun  6 08:32 .
drwxrwxrwt 46 root wheel 1472 Jun  6 08:32 ..
drwxr-xr-x  3 yeva wheel   96 Jun  6 08:32 assets
drwxr-xr-x  7 yeva wheel  224 Jun  6 08:32 doc
drwxr-xr-x  7 yeva wheel  224 Jun  6 08:32 etc
drwxr-xr-x 20 yeva wheel  640 Jun  6 08:32 lib
-rw-r--r--  1 yeva wheel 1380 Jun  6 08:32 manifest.json
/tmp: 

@codemayq

codemayq commented Jun 7, 2019

@codemayq can you try the suggestion in #654 (comment) to determine whether the issue is isolated to Docker?

With

confluent-hub install confluentinc/kafka-connect-datagen:latest --component-dir .

it completed successfully.

@ybyzek
Contributor

ybyzek commented Jun 7, 2019

@codemayq so it seems this is something related to the specific Docker environment.

Short-term workaround: manually mount the downloaded connector jars into the connect Docker container via volumes (see the sketch below).

Long-term fix: we'll need to investigate further
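
A sketch of that short-term workaround, assuming the connector was downloaded on the host with --component-dir as above and the service is named connect; other service keys are elided:

  connect:
    image: confluentinc/cp-kafka-connect:5.2.1
    environment:
      CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components"
    volumes:
      # mount the locally downloaded connector into the plugin path
      - ./confluentinc-kafka-connect-datagen:/usr/share/confluent-hub-components/confluentinc-kafka-connect-datagen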

@ybyzek
Contributor

ybyzek commented Jun 7, 2019

Another hypothesis is that there may be an issue related to a proxy interfering with SSL certs. If the users on this GH thread are able to, here are some additional points to check:

  1. On the host, look at the certs in the output of echo | openssl s_client -connect api.hub.confluent.io:443 -showcerts. In a working setup, it should include the cert below; in a non-working setup, it may not, if the host is substituting certs (if so, see the sketch after this list).
Server certificate
subject=/C=US/ST=CA/L=Palo Alto/O=Confluent, Inc./OU=Information Technology/CN=*.confluent.io
issuer=/C=US/O=DigiCert Inc/CN=DigiCert SHA2 Secure Server CA
  2. Capture TLS output by issuing the following command from the docker container:
curl -iv https://api.hub.confluent.io/api/plugins
  3. Reattempt the initial problem on another network, for example a guest Wi-Fi network that may not have a proxy.
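
If step 1 shows a substituted certificate (for example, one issued by a corporate proxy), one possible remedy, an assumption on my part rather than something confirmed in this thread, is to import that CA into the truststore of the Java 8 runtime that runs confluent-hub:

# alias and file name are hypothetical; changeit is the default truststore
# password, and the path shown is the one for a Java 8 JDK
keytool -import -trustcacerts -alias corp-proxy -file proxy-ca.pem \
  -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass changeit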

@bdoyle807

bdoyle807 commented Jun 10, 2019

@ybyzek here is the output from the SSL call; I am on a VPN with a proxy to get out:

echo | openssl s_client -connect api.hub.confluent.io:443 -showcerts
CONNECTED(00000006)
write:errno=54
---
no peer certificate available
---
No client certificate CA names sent
---
SSL handshake has read 0 bytes and written 0 bytes
---
New, (NONE), Cipher is (NONE)
Secure Renegotiation IS NOT supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
SSL-Session:
    Protocol  : TLSv1.2
    Cipher    : 0000
    Session-ID: 
    Session-ID-ctx: 
    Master-Key: 
    Start Time: 1560182347
    Timeout   : 7200 (sec)
    Verify return code: 0 (ok)
---

@ybyzek
Contributor

ybyzek commented Jun 10, 2019

I am on a VPN with a proxy to get out

@bdoyle807 given the above, perhaps this user's comment would be applicable to your environment? Set the proxy settings as shown here: #731 (comment)
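
For the build-time failures earlier in this thread, one related thing to try is passing Docker's predefined proxy build args so the confluent-hub step can reach out during the build; the proxy URL below is a placeholder:

docker-compose build --build-arg HTTP_PROXY=http://proxy.example.com:3128 \
  --build-arg HTTPS_PROXY=http://proxy.example.com:3128 connect
docker-compose up -d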

@wjphero

wjphero commented Aug 8, 2019

I had the same problem, and added the DNS server to resolv.conf as below:
echo "nameserver xxx.xxx.xxx.xxx" >> resolv.conf
and then everything was OK.
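
A compose-level alternative to editing resolv.conf inside the container is the dns option, sketched here with a placeholder resolver; other service keys are elided:

  connect:
    dns:
      - xxx.xxx.xxx.xxx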

@achintyaakumar

I'm getting the same Unable to verify Confluent Hub's identity / Error: Security issues problem when I try to run docker-compose exec connect confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.1.0 to manually install the connector. Did anyone figure out why this is happening?
@hpinsley @ybyzek
