[docs] rewrite tutorial doc for mysql-postgres-tutorial #514
Conversation
@wuchong, looking forward to your review; after that, I will also write a Chinese version.
Force-pushed from 9656e0f to c98cc36 (compare)
Thanks @luoyuxia , I left some comments.
1. Download [Flink 1.13.2](https://downloads.apache.org/flink/flink-1.13.2/flink-1.13.2-bin-scala_2.11.tgz) and unzip it to the directory `flink-1.13.2`
2. Download following JAR package required and put them under `flink-1.13.2/lib/`:

   ```Download links are available only for stable releases.```
Suggested change:
- ```Download links are available only for stable releases.```
+ **Download links are available only for stable releases.**
We should see the welcome screen of the CLI client.

![Flink SQL Client](/_static/fig/mysql-postgress-tutorial/flink-sql-client.png "Flink SQL Client")
This image is not very clear; you can use this one: https://flink.apache.org/img/blog/2020-07-28-flink-sql-demo/image3.png
![Find enriched Orders](/_static/fig/mysql-postgress-tutorial/kibana-detailed-orders.png "Find enriched Orders")

Next, do some change in the databases, and then the enriched orders shown in Kibana will be updated after each step in reel time.
Suggested change:
- Next, do some change in the databases, and then the enriched orders shown in Kibana will be updated after each step in reel time.
+ Next, do some change in the databases, and then the enriched orders shown in Kibana will be updated after each step in real time.
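For example, the "change in the databases" step could be SQL statements like the following sketch. The `shipments` columns follow the `(id, order_id, origin, destination, is_arrived)` shape used elsewhere in this diff; the `orders` column layout and the specific ids are assumptions for illustration, not taken from the tutorial.

```sql
-- MySQL: insert a new order (column layout assumed for illustration)
INSERT INTO orders
VALUES (default, '2020-07-30 15:22:00', 'Jark', 29.71, 104, false);

-- Postgres: insert a shipment for that order (id is auto-generated)
INSERT INTO shipments
VALUES (default, 10004, 'Shanghai', 'Beijing', false);

-- Postgres: mark the shipment as arrived
UPDATE shipments SET is_arrived = true WHERE order_id = 10004;

-- MySQL: delete the order again
DELETE FROM orders WHERE order_id = 10004;
```

After each statement, the corresponding enriched order shown in Kibana should change accordingly.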
### Enriching the orders using the products and shipments tables and write to ElasticSearch
The title is a bit long; maybe we can simplify it to "Enriching orders and load to ElasticSearch".
@@ -1,7 +1,22 @@
- # Streaming ETL from MySQL and Postgres to Elasticsearch
+ # Building a Streaming Application with Flink Mysql/Postgres CDC
Suggested change:
- # Building a Streaming Application with Flink Mysql/Postgres CDC
+ # Streaming ETL for MySQL and Postgres with Flink CDC
1. Create `docker-compose.yml` file using following contents:

This tutorial is to show how to quickly build streaming applications with Flink Mysql/Postgres CDC.
Suggested change:
- This tutorial is to show how to quickly build streaming applications with Flink Mysql/Postgres CDC.
+ This tutorial is to show how to quickly build streaming ETL for MySQL and Postgres with Flink CDC.
Assuming we are running an e-commerce business. The product and order data stored in MySQL, the shipment data related to the order is stored in Postgres.
We need to build a streaming application to meet the following requirements:
1. Enrich the orders using the product and shipment table and write enriched orders to ElasticSearch in real time
2. Calculate the GMV(Gross Merchandise Volume) by daily and write to Kafka in real time
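Requirement 1 above is typically expressed as a streaming join in Flink SQL. A minimal sketch, assuming the source tables `orders`, `products`, and `shipments` are declared with the mysql-cdc / postgres-cdc connectors and `enriched_orders` with an Elasticsearch connector; the column names here are assumptions based on the schemas in this diff, not the tutorial's exact DDL:

```sql
-- Continuously enrich each order with its product and shipment,
-- emitting updates to the Elasticsearch-backed sink table.
INSERT INTO enriched_orders
SELECT o.order_id, o.order_date, o.customer_name, o.price, o.product_id, o.order_status,
       p.name, p.description,
       s.shipment_id, s.origin, s.destination, s.is_arrived
FROM orders AS o
LEFT JOIN products AS p ON o.product_id = p.id
LEFT JOIN shipments AS s ON o.order_id = s.order_id;
```

LEFT joins keep orders visible even before a matching shipment row arrives; the CDC sources then retract and re-emit the enriched row as the upstream databases change.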
I think we can remove the GMV demo to make the blog focus on streaming join.
@@ -174,7 +152,7 @@ docker-compose down
(default,10002,'Hangzhou','Shanghai',false),
(default,10003,'Shanghai','Hangzhou',false);
- ## Launching the Streaming Application
+ ## Launching the Streaming ETL
We can remove this 2nd-level header and promote all the 3rd-level headers to 2nd level. This sub-title doesn't provide much information.
LGTM
The current mysql-postgres-tutorial is not very intuitive, so I rewrote it to make it clearer.