Samples showing how to create and run an Apache Beam template with a custom Docker image on Google Cloud Dataflow.
This sample shows how to deploy an Apache Beam streaming pipeline that reads JSON-encoded messages from Pub/Sub, uses Beam SQL to transform the message data, and writes the results to a BigQuery table.
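The core of such a pipeline might look roughly like the Python sketch below. The message fields (`url`, `review`), the SQL query, and the output schema are illustrative assumptions, not the sample's actual code; note that Beam's `SqlTransform` is a cross-language transform, so running it from Python also requires a Java environment.

```python
import json
import typing

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.sql import SqlTransform


# Hypothetical message schema; the real sample's fields may differ.
class Review(typing.NamedTuple):
    url: str
    review: str


beam.coders.registry.register_coder(Review, beam.coders.RowCoder)


def run(input_subscription: str, output_table: str) -> None:
    options = PipelineOptions(streaming=True, save_main_session=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read raw bytes from the Pub/Sub subscription.
            | "Read from Pub/Sub" >> beam.io.ReadFromPubSub(subscription=input_subscription)
            # Decode each JSON message into a schema-aware row.
            | "Parse JSON" >> beam.Map(
                lambda data: Review(**json.loads(data.decode("utf-8")))
            ).with_output_types(Review)
            # Transform the rows with Beam SQL (Calcite dialect).
            | "Transform with SQL" >> SqlTransform(
                "SELECT url, review, CHAR_LENGTH(review) AS review_length FROM PCOLLECTION"
            )
            # Convert SQL output rows to dictionaries for the BigQuery sink.
            | "To dicts" >> beam.Map(
                lambda row: {
                    "url": row.url,
                    "review": row.review,
                    "review_length": row.review_length,
                }
            )
            | "Write to BigQuery" >> beam.io.WriteToBigQuery(
                output_table,
                schema="url:STRING,review:STRING,review_length:INT64",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )
```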
This sample shows how to deploy an Apache Beam streaming pipeline that reads JSON-encoded messages from Apache Kafka, decodes them, and writes them into a BigQuery table.
This requires two components to be running:
- A Kafka service container accessible through an external IP address. This service publishes messages to a topic.
- An Apache Beam streaming pipeline deployed as a Dataflow Flex Template. It subscribes to the Kafka topic, consumes the messages published to it, processes them, and writes the results to a BigQuery table (a minimal sketch follows this list).
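The heart of that pipeline might look like the following Python sketch. The broker address, topic, message fields (`url`, `review`), and table schema are illustrative assumptions rather than the sample's actual code, and `ReadFromKafka` is a cross-language transform that needs a Java environment available at pipeline construction time.

```python
import json

import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions


def run(bootstrap_servers: str, topic: str, output_table: str) -> None:
    options = PipelineOptions(streaming=True, save_main_session=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Subscribe to the Kafka topic; the broker must be reachable
            # from the Dataflow workers (e.g. via its external IP address).
            | "Read from Kafka" >> ReadFromKafka(
                consumer_config={"bootstrap.servers": bootstrap_servers},
                topics=[topic],
            )
            # ReadFromKafka emits (key, value) pairs of bytes by default;
            # decode the value as a JSON message.
            | "Decode JSON" >> beam.Map(lambda kv: json.loads(kv[1].decode("utf-8")))
            | "Write to BigQuery" >> beam.io.WriteToBigQuery(
                output_table,
                schema="url:STRING,review:STRING",  # illustrative schema
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )
```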