[FLINK-16098] [chinese-translation, Documentation] Translate "Overview" page of "Hive Integration" into Chinese #11391
Conversation
@flinkbot:
Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community review your pull request.

Automated Checks
Last check on commit 89406cb (Thu Mar 12 08:37:49 UTC 2020): ✅ no warnings
Mention the bot in a comment to re-run the automated checks.

Review Progress
Please see the Pull Request Review Guide for a full explanation of the review process. The bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands
The @flinkbot bot supports the following commands:
Could you help review this? @lirui-apache
Is there a problem with this commit?
Thanks @liuzhixing1006 for the contribution, review is on the way.
Sorry I overlooked this one. Will take a look.
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> 它不仅仅是一个用于大数据分析和ETL场景的SQL引擎,同样它也是一个数据管理平台,可用于发现,定义,和演化数据。
> Flink offers a two-fold integration with Hive.
> Flink提供了两种和Hive集成的方式。

Suggested change:
- Flink提供了两种和Hive集成的方式。
+ Flink 与 Hive 的集成包含两个层面。
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> The first is to leverage Hive's Metastore as a persistent catalog with Flink's `HiveCatalog` for storing Flink specific metadata across sessions.
> For example, users can store their Kafka or ElasticSearch tables in Hive Metastore by using `HiveCatalog`, and reuse them later on in SQL queries.
> 第一种方式是利用了Hive的MetaStore作为持久化的目录,和Flink在`HiveCatalog`存储的特定元数据进行跨系统会话。

Suggested change:
- 第一种方式是利用了Hive的MetaStore作为持久化的目录,和Flink在`HiveCatalog`存储的特定元数据进行跨系统会话。
+ 一是利用了Hive的MetaStore作为持久化的Catalog,用户可通过`HiveCatalog`将不同会话中的Flink元数据存储到Hive Metastore中。
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> The first is to leverage Hive's Metastore as a persistent catalog with Flink's `HiveCatalog` for storing Flink specific metadata across sessions.
> For example, users can store their Kafka or ElasticSearch tables in Hive Metastore by using `HiveCatalog`, and reuse them later on in SQL queries.
> 第一种方式是利用了Hive的MetaStore作为持久化的目录,和Flink在`HiveCatalog`存储的特定元数据进行跨系统会话。
> 例如,用户可以使用`HiveCatalog`将其Kafka信息或ElasticSearch表存储在Hive Metastore中,并后续在SQL查询中重新使用它们。

Suggested change:
- 例如,用户可以使用`HiveCatalog`将其Kafka信息或ElasticSearch表存储在Hive Metastore中,并后续在SQL查询中重新使用它们。
+ 例如,用户可以使用`HiveCatalog`将其Kafka表或ElasticSearch表存储在Hive Metastore中,并后续在SQL查询中重新使用它们。
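For reviewers who want the concrete shape of the feature this sentence describes: persisting, say, a Kafka table in Hive Metastore via `HiveCatalog` is done with an ordinary DDL statement. A minimal sketch in Flink SQL, assuming the Flink 1.10-era connector property keys; the table name, topic, and broker address are hypothetical placeholders:

```sql
-- Hypothetical Kafka table; once created while a HiveCatalog is the
-- current catalog, its metadata is persisted in Hive Metastore and
-- can be reused in later SQL sessions.
CREATE TABLE mykafka (name STRING, age INT) WITH (
  'connector.type' = 'kafka',
  'connector.version' = 'universal',
  'connector.topic' = 'test',
  'connector.properties.bootstrap.servers' = 'localhost:9092',
  'format.type' = 'csv'
);
```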
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> 例如,用户可以使用`HiveCatalog`将其Kafka信息或ElasticSearch表存储在Hive Metastore中,并后续在SQL查询中重新使用它们。
> The second is to offer Flink as an alternative engine for reading and writing Hive tables.
> 第二种方式是将Flink作为读写Hive表的替代引擎。

Suggested change:
- 第二种方式是将Flink作为读写Hive表的替代引擎。
+ 二是利用Flink来读写Hive的表。
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> The `HiveCatalog` is designed to be “out of the box” compatible with existing Hive installations.
> You do not need to modify your existing Hive Metastore or change the data placement or partitioning of your tables.
> `HiveCatalog`被设计成与现有的Hive安装"开箱即用"兼容。

Suggested change:
- `HiveCatalog`被设计成与现有的Hive安装"开箱即用"兼容。
+ `HiveCatalog`的设计提供了与Hive良好的兼容性,用户可以"开箱即用"的访问其已有的Hive数仓。
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> #### Using bundled hive jar
> #### 使用捆绑的Hive jar
> 下表列出了所有可捆绑的hive jars。您可以在Flink发行版的`/lib/` 目录中去选择一个。

Suggested change:
- 下表列出了所有可捆绑的hive jars。您可以在Flink发行版的`/lib/` 目录中去选择一个。
+ 下表列出了所有可用的Hive jar。您可以选择一个并放在Flink发行版的`/lib/` 目录中。
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> </div>
> If you use the hive version of HDP or CDH, you need to refer to the dependency in the previous section and select a similar version.
> 如果使用hive的HDP或CDH版本,则需要参考上一节中的依赖项并选择一个类似的版本。

Suggested change:
- 如果使用hive的HDP或CDH版本,则需要参考上一节中的依赖项并选择一个类似的版本。
+ 如果使用Hive的HDP或CDH版本,则需要参考上一节中的依赖项并选择一个类似的版本。
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> 如果使用hive的HDP或CDH版本,则需要参考上一节中的依赖项并选择一个类似的版本。
> And you need to specify selected and supported "hive-version" in yaml, HiveCatalog and HiveModule.
> 并且您需要在yaml,HiveCatalog和HiveModule中指定选择的和受支持的“ hive-version”。

Suggested change:
- 并且您需要在yaml,HiveCatalog和HiveModule中指定选择的和受支持的“ hive-version”。
+ 并且您需要在定义yaml文件,或者创建HiveCatalog和HiveModule时,指定一个支持的“ hive-version”。
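As background for this comment thread: in the YAML case, the "hive-version" being discussed is a key on the catalog entry of the SQL Client configuration file. A minimal sketch, assuming the Flink 1.10-era `sql-client-defaults.yaml` layout; the catalog name, conf directory, and version number are hypothetical placeholders:

```yaml
# Hypothetical sql-client-defaults.yaml fragment;
# `hive-version` selects which supported Hive release Flink should target.
catalogs:
  - name: myhive
    type: hive
    hive-conf-dir: /opt/hive-conf   # directory containing hive-site.xml
    hive-version: 2.3.4
```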
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> 并且您需要在yaml,HiveCatalog和HiveModule中指定选择的和受支持的“ hive-version”。
> #### Program maven
> ### 程序maven依赖

Suggested change:
- ### 程序maven依赖
+ ### Maven依赖
docs/dev/table/hive/index.zh.md (Outdated)
Context:
> Connect to an existing Hive installation using the [catalog interface]({{ site.baseurl }}/dev/table/catalogs.html)
> and [HiveCatalog]({{ site.baseurl }}/dev/table/hive/hive_catalog.html) through the table environment or YAML configuration.
> 通过表环境或者YAML配置,使用 [catalog interface]({{ site.baseurl }}/dev/table/catalogs.html) 和[HiveCatalog]({{ site.baseurl }}/dev/table/hive/hive_catalog.html)连接到现有的Hive集群。

Suggested change:
- 通过表环境或者YAML配置,使用 [catalog interface]({{ site.baseurl }}/dev/table/catalogs.html) 和[HiveCatalog]({{ site.baseurl }}/dev/table/hive/hive_catalog.html)连接到现有的Hive集群。
+ 通过TableEnvironment或者YAML配置,使用 [catalog interface]({{ site.baseurl }}/dev/table/catalogs.html) 和[HiveCatalog]({{ site.baseurl }}/dev/table/hive/hive_catalog.html)连接到现有的Hive集群。
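For context, the TableEnvironment route this comment refers to looks roughly like the following Java sketch. It mirrors the Flink 1.10-era Table API; the catalog name, default database, conf directory, and version string are hypothetical placeholders, and the snippet needs Flink's table and Hive connector dependencies on the classpath to run:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveCatalogExample {
    public static void main(String[] args) {
        // Hive integration requires the Blink planner in this Flink version.
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();
        TableEnvironment tableEnv = TableEnvironment.create(settings);

        // All four constructor arguments are placeholders for this sketch.
        HiveCatalog hive = new HiveCatalog(
                "myhive",         // catalog name
                "default",        // default database
                "/opt/hive-conf", // directory containing hive-site.xml
                "2.3.4");         // the selected "hive-version"

        tableEnv.registerCatalog("myhive", hive);
        tableEnv.useCatalog("myhive");
    }
}
```

The same connection can be declared in the SQL Client YAML configuration instead of code, as the suggested translation notes.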
@liuzhixing1006 Thanks for the PR! I left some comments.
Commits subsequently shown on this PR (many unrelated ones were pulled in by a merge; messages are truncated as displayed):
- …expression references a keyword column name. This is a followup fix of FLINK-16018. This closes apache#11380
- …sConfigOptions.REST_SERVICE_EXPOSED_TYPE. This closes apache#11346
- …s OutOfMemoryError. This closes apache#11300
- This commit explicitly enables Sax' NAMESPACES_FEATURE (default true) in JAXP (default false), such that multi part uploads to S3 actually work.
- …ding an unbounded source in batch mode. This closes apache#11387
- …sql cli. This closes apache#11392
- + GetOperatorUniqueIDTest + RemoteStreamEnvironmentTest + ReinterpretDataStreamAsKeyedStreamITCase + LocalStreamEnvironmentITCase
- …ies to a flink-runtime utils
- …o flink-clients. This closes apache#10526
- …-metrics-influxdb. With FLINK-12147 we bumped the influxdb-java version from 2.14 to 2.16. At the same time we still have okio and okhttp fixed to an incompatible version. This commit removes the dependency management entries for these dependencies so that the influxdb reporter bundles the correct dependencies.
- This commit bumps influxdb-java version from 2.16 to 2.17. This resolves a dependency convergence problem within the influxdb-java dependency. This closes apache#11428
- …oncepts" into Chinese. This closes apache#11423
- …after re-open. This closes apache#11434
- …into Chinese. This closes apache#11401
- …ge into Chinese. This closes apache#11401
- …o Chinese. This closes apache#11401
- …Chinese. This closes apache#11401
- Introduces unresolved data types for class-based extraction, name-based resolution, and configuration-based RAW types. Unresolved types behave like regular data types but only after they have been resolved. Using unresolved data types in nested structures leads to further unresolved types. Thus, the usage of unresolved types is type-safe in API. This closes apache#11153
- Avoids errors due to incompatible types during planning by preserving the nullability attributes of nested types. This closes apache#11260
- Fixes a classloading bug that was introduced in the last release while updating the code for the new Pipeline abstractions. This closes apache#11438
- …lism(). This closes apache#11446
- …rammar. This closes apache#11409
- …w" page of "Hive Integration" into Chinese
@lirui-apache @JingsongLi Thanks for your advice, I have fixed the problems. 1. After `git add -A`, the pull request now contains a lot of unrelated commits.
Hi @liuzhixing1006, you cannot use merge here. You should use rebase only.
Thank you @JingsongLi for bringing this problem to my attention. There's something wrong with this branch, so I created a new PR: https://github.com//pull/11455
Translate "Overview" page of "Hive Integration" into Chinese
JIRA: https://issues.apache.org/jira/projects/FLINK/issues/FLINK-16098?filter=myopenissues