Terraform code to build a Confluent Cloud Standard cluster with Schema Registry, one topic with a schema, and KSQLDB activated
Updated Apr 24, 2023 - HCL
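As a rough orientation for what such a configuration looks like, here is a minimal sketch using the confluentinc/confluent Terraform provider. The variable names (confluent_cloud_api_key, kafka_api_key, schema_registry_region_id, and so on), resource labels, region, and schema file path are placeholders rather than values taken from this repository; API-key resources and role bindings are omitted; and attribute details vary by provider version (newer provider releases, for example, enable Schema Registry per environment automatically instead of through confluent_schema_registry_cluster).

terraform {
  required_providers {
    confluent = {
      source = "confluentinc/confluent"
    }
  }
}

provider "confluent" {
  cloud_api_key    = var.confluent_cloud_api_key
  cloud_api_secret = var.confluent_cloud_api_secret
}

resource "confluent_environment" "demo" {
  display_name = "ksqldb-demo"
}

# Standard-tier Kafka cluster
resource "confluent_kafka_cluster" "standard" {
  display_name = "demo-cluster"
  availability = "SINGLE_ZONE"
  cloud        = "AWS"
  region       = "us-east-2"
  standard {}

  environment {
    id = confluent_environment.demo.id
  }
}

# Schema Registry (older provider versions manage it as an explicit resource)
resource "confluent_schema_registry_cluster" "sr" {
  package = "ESSENTIALS"

  environment {
    id = confluent_environment.demo.id
  }
  region {
    id = var.schema_registry_region_id  # placeholder Schema Registry region id
  }
}

# One topic, managed over the cluster's REST endpoint with a cluster API key
resource "confluent_kafka_topic" "orders" {
  kafka_cluster {
    id = confluent_kafka_cluster.standard.id
  }
  topic_name       = "orders"
  partitions_count = 6
  rest_endpoint    = confluent_kafka_cluster.standard.rest_endpoint

  credentials {
    key    = var.kafka_api_key
    secret = var.kafka_api_secret
  }
}

# Avro value schema registered for the topic's value subject
resource "confluent_schema" "orders_value" {
  schema_registry_cluster {
    id = confluent_schema_registry_cluster.sr.id
  }
  rest_endpoint = confluent_schema_registry_cluster.sr.rest_endpoint
  subject_name  = "orders-value"
  format        = "AVRO"
  schema        = file("${path.module}/schemas/orders.avsc")  # placeholder path

  credentials {
    key    = var.schema_registry_api_key
    secret = var.schema_registry_api_secret
  }
}

# Service account the ksqlDB cluster runs as
# (the CloudClusterAdmin role binding it needs is omitted for brevity)
resource "confluent_service_account" "ksql" {
  display_name = "ksql-sa"
  description  = "Identity for the ksqlDB cluster"
}

# ksqlDB cluster attached to the Kafka cluster
resource "confluent_ksql_cluster" "ksql" {
  display_name = "demo-ksql"
  csu          = 1

  kafka_cluster {
    id = confluent_kafka_cluster.standard.id
  }
  credential_identity {
    id = confluent_service_account.ksql.id
  }
  environment {
    id = confluent_environment.demo.id
  }
}

After terraform apply, the ksqlDB cluster's REST endpoint is exposed as an attribute of confluent_ksql_cluster and can be used from the ksql CLI or the Confluent Cloud UI.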
This demo shows how to capture data changes from relational databases (Oracle and PostgreSQL), stream them to Confluent Cloud, process them in real time with ksqlDB, and send the enriched data to cloud data warehouses (Snowflake and Amazon Redshift).
This demo shows how to stream data to cloud databases with Confluent. It uses fully managed connectors (Oracle CDC, RabbitMQ, MongoDB Atlas) and ksqlDB/Flink SQL as the stream processing engine.
Streaming data pipelines for real-time data warehousing. Includes fully managed connectors (PostgreSQL CDC, Snowflake).
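For the fully managed connectors these demos mention, the same Terraform provider exposes a confluent_connector resource. The sketch below assumes the environment and cluster from the earlier sketch and uses a PostgreSQL CDC source as an example; the connector.class value and config keys are illustrative, the database variables are placeholders, and the authoritative key names come from the Confluent Cloud connector documentation.

# Fully managed PostgreSQL CDC source connector (illustrative configuration)
resource "confluent_connector" "postgres_cdc" {
  environment {
    id = confluent_environment.demo.id
  }
  kafka_cluster {
    id = confluent_kafka_cluster.standard.id
  }

  # Secrets go into config_sensitive so the provider treats them as sensitive values
  config_sensitive = {
    "database.password" = var.postgres_password
  }

  config_nonsensitive = {
    "connector.class"          = "PostgresCdcSource"
    "name"                     = "postgres-cdc-source"
    "kafka.auth.mode"          = "SERVICE_ACCOUNT"
    "kafka.service.account.id" = var.connector_service_account_id
    "database.hostname"        = var.postgres_host
    "database.port"            = "5432"
    "database.user"            = var.postgres_user
    "database.dbname"          = var.postgres_db
    "table.include.list"       = "public.orders"
    "output.data.format"       = "AVRO"
    "tasks.max"                = "1"
  }
}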