.. Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements. See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License. You may obtain a copy of the License at

..    http://www.apache.org/licenses/LICENSE-2.0

.. Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.

`Flink Table Store`_
====================

Flink Table Store is a unified storage layer for building dynamic tables for both streaming and batch processing in Flink,
supporting high-speed data ingestion and timely data queries.

.. tip::
   This article assumes that you are familiar with the basics and operations of `Flink Table Store`_.
   For anything about Flink Table Store not covered here, refer to its `Official Documentation`_.

By using Kyuubi, we can run SQL queries against Flink Table Store in a way that is more
convenient, easier to understand, and easier to extend than manipulating
Flink Table Store with Trino directly.

Flink Table Store Integration
-----------------------------

To enable the integration of the Kyuubi Trino SQL engine and Flink Table Store, you need to:

- Reference the Flink Table Store :ref:`dependencies<trino-flink-table-store-deps>`
- Set the Trino extension and catalog :ref:`configurations<trino-flink-table-store-conf>`

.. _trino-flink-table-store-deps:

Dependencies
************

The **classpath** of the Kyuubi Trino SQL engine with Flink Table Store support consists of:

1. kyuubi-trino-sql-engine-|release|.jar, the engine jar deployed with Kyuubi distributions
2. a copy of the Trino distribution
3. flink-table-store-trino-<version>.jar (for example, flink-table-store-trino-0.2.jar), whose source can be found in the `Source Code`_ repository
4. flink-shaded-hadoop-2-uber-2.8.3-10.0.jar, which can be downloaded from `Pre-bundled Hadoop 2.8.3`_

To make the Flink Table Store packages visible to the runtime classpath of engines, we can:

1. Build flink-table-store-trino-<version>.jar by following the `Flink Table Store Trino README`_
2. Put flink-table-store-trino-<version>.jar and flink-shaded-hadoop-2-uber-2.8.3-10.0.jar into ``$TRINO_SERVER_HOME/plugin/tablestore`` directly

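The placement step above can be sketched as a small shell script. Note that the ``TRINO_SERVER_HOME`` location and the exact jar versions below are illustrative assumptions; adjust them to your installation:

```shell
# Sketch of placing the Table Store jars on Trino's plugin path.
# TRINO_SERVER_HOME and the jar names are illustrative assumptions.
TRINO_SERVER_HOME="${TRINO_SERVER_HOME:-/tmp/trino-server}"
mkdir -p "$TRINO_SERVER_HOME/plugin/tablestore"

# Copy the jars built/downloaded in the previous steps (commented out here
# because the files only exist once you have built/downloaded them):
# cp flink-table-store-trino-0.2.jar "$TRINO_SERVER_HOME/plugin/tablestore/"
# cp flink-shaded-hadoop-2-uber-2.8.3-10.0.jar "$TRINO_SERVER_HOME/plugin/tablestore/"

ls -d "$TRINO_SERVER_HOME/plugin/tablestore"
```

Trino discovers connectors by scanning subdirectories of ``plugin/``, which is why both jars go into a single ``tablestore`` subdirectory.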
.. warning::
   Please mind the compatibility of different Flink Table Store and Trino versions, which can be confirmed on the `Flink Table Store multi engine support`_ page.

.. _trino-flink-table-store-conf:

Configurations
**************

To activate the Flink Table Store functionality, set the following configurations.

Catalogs are registered by creating a catalog properties file in the ``$TRINO_SERVER_HOME/etc/catalog`` directory.
For example, create ``$TRINO_SERVER_HOME/etc/catalog/tablestore.properties`` with the following contents to mount the tablestore connector as the ``tablestore`` catalog:

.. code-block:: properties

   connector.name=tablestore
   warehouse=file:///tmp/warehouse

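Creating that file can be scripted as below. This is only a sketch: the ``TRINO_SERVER_HOME`` location is an assumption, and the local-filesystem warehouse path is just the example value from above:

```shell
# Sketch of registering the tablestore catalog with Trino.
# TRINO_SERVER_HOME below is an illustrative assumption.
TRINO_SERVER_HOME="${TRINO_SERVER_HOME:-/tmp/trino-server}"
mkdir -p "$TRINO_SERVER_HOME/etc/catalog"
cat > "$TRINO_SERVER_HOME/etc/catalog/tablestore.properties" <<'EOF'
connector.name=tablestore
warehouse=file:///tmp/warehouse
EOF
cat "$TRINO_SERVER_HOME/etc/catalog/tablestore.properties"
```

The file name (minus the ``.properties`` suffix) becomes the catalog name, so this file is queried as ``tablestore.<schema>.<table>``.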
Flink Table Store Operations
----------------------------

Flink Table Store supports reading table store tables through Trino.
A common scenario is to write data with Flink and read it with Trino.
You can follow the `Flink Table Store Quick Start`_ guide to write data to a table store table,
and then use the Kyuubi Trino SQL engine to query the table with the following SQL ``SELECT`` statement:

.. code-block:: sql

   SELECT * FROM tablestore.default.t1

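One way to submit that statement through Kyuubi is with a Hive JDBC client such as Beeline. The endpoint below uses Kyuubi's default frontend port 10009, but the host, port, and client path are assumptions for illustration; replace them with your deployment's values:

```shell
# Save the query to a file and submit it through Beeline.
# The JDBC URL and client path are placeholder assumptions.
cat > /tmp/tablestore-query.sql <<'EOF'
SELECT * FROM tablestore.default.t1;
EOF

# The submission itself is commented out, since it needs a running Kyuubi server:
# $KYUUBI_HOME/bin/beeline -u 'jdbc:hive2://localhost:10009/' -f /tmp/tablestore-query.sql

cat /tmp/tablestore-query.sql
```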
.. _Flink Table Store: https://nightlies.apache.org/flink/flink-table-store-docs-stable/
.. _Flink Table Store Quick Start: https://nightlies.apache.org/flink/flink-table-store-docs-stable/docs/try-table-store/quick-start/
.. _Official Documentation: https://nightlies.apache.org/flink/flink-table-store-docs-stable/
.. _Source Code: https://github.com/JingsongLi/flink-table-store-trino
.. _Flink Table Store multi engine support: https://nightlies.apache.org/flink/flink-table-store-docs-stable/docs/engines/overview/
.. _Pre-bundled Hadoop 2.8.3: https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
.. _Flink Table Store Trino README: https://github.com/JingsongLi/flink-table-store-trino#readme