[FLINK-21440][doc] translate content.zh/docs/try-flink/table_api.md and correct spelling mistake content/docs/try-flink/table_api.md #14985
Conversation
PrivateLi commented on Feb 22, 2021
- Translate the Real Time Reporting with the Table API doc to Chinese
- Correct the spelling mistake "allong with" to "along with" in the Real Time Reporting with the Table API doc
Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community.
Automated Checks: last check on commit 8dc4576 (Mon Feb 22 13:59:39 UTC 2021). Warnings:
Mention the bot in a comment to re-run the automated checks.
Review Progress: please see the Pull Request Review Guide for a full explanation of the review process. The bot is tracking the review progress through labels, which are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.
Bot commands: the @flinkbot bot supports the following commands:
zhuxiaoshang left a comment
Thanks for your contribution @PrivateLi, I left some comments.
> Apache Flink offers a Table API as a unified, relational API for batch and stream processing, i.e., queries are executed with the same semantics on unbounded, real-time streams or bounded, batch data sets and produce the same results.
> The Table API in Flink is commonly used to ease the definition of data analytics, data pipelining, and ETL applications.
> Apache Flink 提供了 Table API 作为统一的相关 API,用于批处理和流处理,即:对无边界的实时流或有约束的批处理数据集以相同的语义执行查询,并产生相同的结果。Flink 中的 Table API 通常用于简化数据分析,数据管道和ETL应用程序的定义。
Leave a space before and after "ETL".
> ## What Will You Be Building?
> ## 您要搭建一个什么系统
Per item 6 of https://cwiki.apache.org/confluence/display/FLINK/Flink+Translation+Specifications, use "你" rather than "您"; the same applies below.
> If you get stuck, check out the [community support resources](https://flink.apache.org/community.html).
> In particular, Apache Flink's [user mailing list](https://flink.apache.org/community.html#mailing-lists) consistently ranks as one of the most active of any Apache project and a great way to get help quickly.
> 如果遇到困难,可以参考[社区支持资源](https://flink.apache.org/community.html)。 当然也可以在邮件列表提问,Flink 的[用户邮件列表](https://flink.apache.org/community.html#mailing-lists)一直被评为所有Apache项目中最活跃的一个,这也是快速获得帮助的好方法。
Mind the spaces before and after English words, and use Chinese parentheses.
> In this tutorial, you will learn how to build a real-time dashboard to track financial transactions by account.
> The pipeline will read data from Kafka and write the results to MySQL visualized via Grafana.
> 在本教程中,您将学习如何构建实时仪表板以按帐户跟踪财务交易。流程将从Kafka读取数据,并将结果写入通过 Grafana 可视化的 MySQL。
"将结果写入 MySQL 并通过 Grafana 提供可视化。" — wouldn't this read more smoothly?
> One of Flink's unique properties is that it provides consistent semantics across batch and streaming.
> This means you can develop and test applications in batch mode on static datasets, and deploy to production as streaming applications.
> Flink 的独特属性之一是,它在批处理和流传输之间提供一致的语义。这意味着您可以在静态数据集上以批处理模式开发和测试应用程序,并作为流应用程序部署到生产中。
Suggestion: "Flink 的特色之一是,它在批处理和流处理之间提供一致性的语义。"
> 现在,有了 Job 设置的框架,您就可以添加一些业务逻辑。目的是建立一个报告,显示每个帐户在一天中每个小时的总支出。这意味着时间戳列需要从毫秒舍入到小时粒度。
> Flink supports developing relational applications in pure [SQL]({{< ref "docs/dev/table/sql/overview" >}}) or using the [Table API]({{< ref "docs/dev/table/tableApi" >}}).
Delete this English paragraph.
> The Table API is a fluent DSL inspired by SQL, that can be written in Python, Java, or Scala and supports strong IDE integration.
> Just like a SQL query, Table programs can select the required fields and group by your keys.
> These features, allong with [built-in functions]({{< ref "docs/dev/table/functions/systemFunctions" >}}) like `floor` and `sum`, you can write this report.
> Flink支持使用纯[SQL]({{< ref "docs/dev/table/sql/overview" >}})或使用[Table API]({{< ref "docs/dev/table/tableApi" >}})。Table API 是受 SQL 启发的流畅 DSL,可以用 Python、Java或 Scala 编写,并支持强大的 IDE 集成。就像 SQL 查询一样,Table 程序可以选择必填字段并按键进行分组。这些特点,结合[内置函数] ({{< ref "docs/dev/table/functions/systemFunctions" >}}) ,如floor和sum,可以编写此报告。
Suggestion: "Flink支持使用纯[SQL]({{< ref "docs/dev/table/sql/overview" >}})或使用[Table API]({{< ref "docs/dev/table/tableApi" >}})来开发纯关系型应用。"
Suggestion: "利用这些特性,并结合[内置函数] ({{< ref "docs/dev/table/functions/systemFunctions" >}}) ,如 floor 和 sum,就可以编写 report 函数了。"
> In a batch context, windows offer a convenient API for grouping records by a timestamp attribute.
> 基于时间的聚合是唯一的,因为与其它属性相反,时间通常在连续流应用程序中向前移动。与用户自定义函数 `floor` 不同,窗口函数是[内部函数](https://en.wikipedia.org/wiki/Intrinsic_function),它允许运行时应用额外的优化。在批处理环境中,窗口函数提供了一种用于按 timestamp 属性对记录进行分组方便的API。
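The windowing idea the quoted passage describes can be modeled without Flink at all. The sketch below is plain JDK Java, not Flink API code, and every class and method name in it is illustrative: it assigns each `(timestamp, amount)` record to the one-hour tumbling window containing its timestamp and sums the amounts per window, which is the grouping the tutorial's hourly-report window expresses declaratively.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class HourlyTumbleSketch {
    static final long HOUR_MS = 3_600_000L;

    // Assign each (timestampMillis, amount) pair to the hour bucket that
    // contains it and sum the amounts per bucket -- a plain-Java model of a
    // one-hour tumbling window over the transaction records.
    static Map<Long, Long> sumPerHour(List<long[]> txns) {
        Map<Long, Long> totals = new LinkedHashMap<>();
        for (long[] t : txns) {
            long bucketStart = (t[0] / HOUR_MS) * HOUR_MS; // floor to hour
            totals.merge(bucketStart, t[1], Long::sum);
        }
        return totals;
    }

    public static void main(String[] args) {
        List<long[]> txns = List.of(
                new long[]{1_000L, 5L},       // falls in hour starting at 0
                new long[]{3_599_999L, 7L},   // still the same hour
                new long[]{3_600_000L, 11L}); // first record of the next hour
        System.out.println(sumPerHour(txns)); // {0=12, 3600000=11}
    }
}
```

The bucket computation is exactly the "floor the timestamp to the hour, then group" pattern; the window API differs only in that the runtime knows the grouping is time-based and can optimize accordingly.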
> Running the test with this implementation will also pass.
Not translated.
> Now with the skeleton of a Job set-up, you are ready to add some business logic.
> The goal is to build a report that shows the total spend for each account across each hour of the day.
> This means the timestamp column needs be be rounded down from millisecond to hour granularity.
There is a typo here: it should be "to be". Please fix the English source as well.
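The rounding that sentence describes can be sketched with the JDK's `java.time` alone. This is not the tutorial's code, and the class and method names are illustrative; it only shows what "rounded down from millisecond to hour granularity" means for an epoch-millisecond timestamp.

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class FloorToHour {
    // Round an epoch-millisecond timestamp down to the start of its hour,
    // i.e. drop the minutes, seconds, and milliseconds components.
    static long floorToHour(long epochMillis) {
        return Instant.ofEpochMilli(epochMillis)
                .truncatedTo(ChronoUnit.HOURS)
                .toEpochMilli();
    }

    public static void main(String[] args) {
        long ms = 7_199_999L;                // 01:59:59.999 after the epoch
        System.out.println(floorToHour(ms)); // 3600000, i.e. 01:00:00.000
    }
}
```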
update the pull request based on feedback |
This PR is being marked as stale since it has not had any activity in the last 180 days. If you are having difficulty finding a reviewer, please reach out to the [community](https://flink.apache.org/what-is-flink/community/). If this PR is no longer valid or desired, please feel free to close it. If no activity occurs in the next 90 days, it will be automatically closed. |
This PR has been closed since it has not had any activity in 120 days. |