- On the Confluent Cloud home page, click **Add Cloud Environment** on the right side
- Give the environment a name
- Select a Stream Governance package (Essentials is fine)
- Select an AWS region
  - The selected region must have Flink enabled. You can run `confluent flink region list` to see the current list of available regions.
- Click **Create cluster on my own**
- Select either a Basic or Standard plan
- Select AWS as the cloud and pick the same region you selected in the environment configuration
- With a trial, you can choose to skip payment.
- Give the cluster a name if you want (the default is fine), and click **Launch cluster**
- On the success screen, click **Get started** under **Set up Client**
- This takes you into the **New client** dialog
- Select Python as the language
- Click **Create** for both the **Kafka API Key** and the **Schema Registry API Key**; this automatically fills them into the properties-file snippet on the right
- Copy the snippet provided and save it for later
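The saved snippet is a Java-style properties file. As a sketch (with made-up placeholder values, not real endpoints or credentials), it can be parsed into the plain dict shape that the Confluent Python clients accept:

```python
# Sketch: parse a Confluent Cloud client.properties snippet into a dict.
# The values below are placeholders, not real credentials.
SNIPPET = """\
# Required connection configs for Kafka producer, consumer, and admin
bootstrap.servers=pkc-xxxxx.us-east-1.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=<KAFKA_API_KEY>
sasl.password=<KAFKA_API_SECRET>
"""

def parse_properties(text: str) -> dict:
    """Turn key=value lines into a dict, skipping comments and blank lines."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config

config = parse_properties(SNIPPET)
print(config["bootstrap.servers"])
```

The demo's `up.sh` handles this for you when you paste the snippet; the sketch just shows what the snippet contains.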
- Return to the Environment page by clicking the name of the environment at the top left of the page.
- Click the Flink tab and then click **Create Compute Pool**
- Select the AWS region that was used in previous steps
- Give it a name and click through the rest of the dialog
- Once it's finished provisioning, click **Open SQL Workspace**
- Open the table DDLs and copy and paste them into the workspace window
- Make sure you select the environment (catalog) and cluster (database) that you created previously
- Keep this window open, you'll need it again in a later step.
- Run `./up.sh` - it will prompt you to paste the snippet you saved earlier
- This will build a container and then run a tmux session inside of it
- On the left side is the producer part of the demo, which will prompt you to select a location to pull data from
  - As of this writing, this has been tested most thoroughly with Citibike (North America > US > NY > NYC). Not all locations have data, but we've tried to remove most of the ones that don't work.
- TODO: You can set an environment variable to bypass the city selections
- On the right side is the consumer, which will wait until data is produced into the topics created in the next step
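A consumer that starts before any data exists typically combines the saved connection config with a group id and an offset reset policy. A minimal, hedged sketch of assembling that config (the group and topic names here are made up, and the demo's actual code may differ):

```python
# Sketch: build the config a confluent-kafka Consumer would take.
# Connection values are placeholders. "auto.offset.reset": "earliest"
# is what lets a consumer started before any data exists pick up
# messages once the producer begins writing to the topics.
connection = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<KAFKA_API_KEY>",
    "sasl.password": "<KAFKA_API_SECRET>",
}

consumer_config = {
    **connection,
    "group.id": "bikeshare-demo",     # hypothetical group name
    "auto.offset.reset": "earliest",  # read from the start of the topic
}

# With confluent-kafka installed, this config would be used roughly as:
#   from confluent_kafka import Consumer
#   consumer = Consumer(consumer_config)
#   consumer.subscribe(["station-status"])  # hypothetical topic name
print(sorted(consumer_config))
```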
- Once the producer is running, you can set up the Flink parts
- Copy and paste the provided transformation SQL into the **SQL Workspace** window that was opened earlier
- Once that has executed successfully, return to the tmux session, which will eventually start populating a table with status data
  - It can take a few minutes for the data to start appearing
confluentinc/demo-bikeshare