1. Upgrade FlinkX to 1.11
2. Remove the kafka09 plugin
kanata163 committed Jan 13, 2021
1 parent 9cfa7ed commit e2035bf
Showing 43 changed files with 548 additions and 1,461 deletions.
36 changes: 0 additions & 36 deletions docs/example/kafka09_stream.json

This file was deleted.

9 changes: 6 additions & 3 deletions docs/questions.md
@@ -12,9 +12,12 @@
./install_jars.sh
```

-### 2. The FlinkX version must match the Flink version
-The 1.8_release branch corresponds to Flink 1.8
-The 1.10_release branch corresponds to Flink 1.10
+### 2. The FlinkX version must match the Flink version, ideally down to the patch release
+| FlinkX branch | Flink version |
+| --- | --- |
+| 1.8_release | Flink 1.8.3 |
+| 1.10_release | Flink 1.10.1 |
+| 1.11_release | Flink 1.11.3 |
If the versions do not match, submitting a job in standalone or yarn-session mode fails with:
Caused by: java.io.InvalidClassException: org.apache.flink.api.common.operators.ResourceSpec; incompatible types for field cpuCores
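A quick way to catch a mismatch before submitting is to check which Flink build the cluster binaries come from; one sketch, relying only on the fact that the flink-dist jar name encodes the version (the path is a placeholder for your environment):

```
# The flink-dist jar under $FLINK_HOME/lib encodes the Scala and Flink versions;
# compare the result against the FlinkX branch table above.
ls $FLINK_HOME/lib/flink-dist_*.jar
# a name like flink-dist_2.11-1.11.3.jar means the 1.11_release branch of FlinkX
```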

10 changes: 5 additions & 5 deletions docs/quickstart.md
@@ -10,14 +10,14 @@ cd flinkx
2. Download the source directly

```
-wget https://github.com/DTStack/flinkx/archive/1.10_release.zip
-unzip 1.10_release.zip
-cd 1.10_release
+wget https://github.com/DTStack/flinkx/archive/1.11_release.zip
+unzip 1.11_release.zip
+cd 1.11_release
```

3. Download the source together with the pre-built plugin packages (recommended)
```
-wget https://github.com/DTStack/flinkx/releases/download/1.10.4/flinkx.7z
+wget https://github.com/DTStack/flinkx/releases/download/1.11.0/flinkx.7z
7za x flinkx.7z
cd flinkx
```
@@ -253,7 +253,7 @@ bin/flinkx \
| **jobid** | Job name ||| Flink Job |
| **pluginRoot** | Plugin root directory, i.e. the pluginRoot directory produced by the build ||| $FLINKX_HOME/syncplugins |
| **flinkconf** | Directory containing the Flink configuration files (not needed in local mode) | $FLINK_HOME/conf || $FLINK_HOME/conf |
-| **flinkLibJar** | Directory containing the Flink lib jars (not needed in local mode), e.g. /opt/dtstack/flink-1.10.1/lib | $FLINK_HOME/lib || $FLINK_HOME/lib |
+| **flinkLibJar** | Directory containing the Flink lib jars (not needed in local mode), e.g. /opt/dtstack/flink-1.11.3/lib | $FLINK_HOME/lib || $FLINK_HOME/lib |
| **yarnconf** | Directory containing the Hadoop configuration files (both hdfs and yarn) | $HADOOP_HOME/etc/hadoop || $HADOOP_HOME/etc/hadoop |
| **queue** | Yarn queue, e.g. default ||| default |
| **pluginLoadMode** | How plugins are loaded in yarn-session mode | 1.**classpath**: plugin packages are not uploaded at submit time; they must already be deployed under the pluginRoot directory on every yarn node, but jobs start faster<br />2.**shipfile**: the plugin packages under pluginRoot are uploaded at submit time; yarn nodes need no local copy, and startup time depends on package size and network conditions || shipfile |
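Put together, the options in the table combine into a yarn-session submit command along these lines. This is only a sketch: the `-mode` and `-job` flags are assumed from the surrounding quickstart, and every path is a placeholder for your environment.

```
# Hedged sketch of a yarn-session submission; plugins are shipped at submit
# time (shipfile), so yarn nodes need no locally deployed plugin packages.
bin/flinkx \
    -mode yarn \
    -job /path/to/job.json \
    -pluginRoot $FLINKX_HOME/syncplugins \
    -flinkconf $FLINK_HOME/conf \
    -yarnconf $HADOOP_HOME/etc/hadoop \
    -queue default \
    -pluginLoadMode shipfile
```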
53 changes: 7 additions & 46 deletions docs/realTime/reader/kafkareader.md
@@ -1,14 +1,14 @@
# Kafka Reader

## 1. Plugin name
-The kafka plugin comes in four versions; the plugin name varies slightly with the kafka version, as shown in the table below:
+The kafka plugin comes in three versions; the plugin name varies slightly with the kafka version, as shown in the table below:

| kafka version | plugin name |
| --- | --- |
-| kafka 0.9 | kafka09reader |
| kafka 0.10 | kafka10reader |
| kafka 0.11 | kafka11reader |
| kafka 1.0 and later | kafkareader |
+Note: kafka 0.9 is no longer supported as of FlinkX 1.11



@@ -125,9 +125,7 @@
- Required: yes
- Field type: Map
- Default: none
-- Note:
-  - kafka09 reader plugin: consumerSettings must contain at least the `zookeeper.connect` parameter
-  - other kafka reader plugins: consumerSettings must contain at least the `bootstrap.servers` parameter
+- Note: consumerSettings must contain at least the `bootstrap.servers` parameter
- Example:
```json
{
@@ -139,44 +137,7 @@


## 3. Configuration examples
-#### 1. kafka09
-```json
-{
-  "job" : {
-    "content" : [ {
-      "reader" : {
-        "parameter" : {
-          "topic" : "kafka09",
-          "groupId" : "default",
-          "codec" : "text",
-          "encoding": "UTF-8",
-          "blankIgnore": false,
-          "consumerSettings" : {
-            "zookeeper.connect" : "localhost:2181/kafka09"
-          }
-        },
-        "name" : "kafka09reader"
-      },
-      "writer" : {
-        "parameter" : {
-          "print" : true
-        },
-        "name" : "streamwriter"
-      }
-    } ],
-    "setting" : {
-      "restore" : {
-        "isRestore" : false,
-        "isStream" : true
-      },
-      "speed" : {
-        "channel" : 1
-      }
-    }
-  }
-}
-```
-#### 2. kafka10
+#### 1. kafka10
```json
{
"job": {
@@ -215,7 +176,7 @@
}
}
```
-#### 3. kafka11
+#### 2. kafka11
```json
{
"job" : {
@@ -252,7 +213,7 @@
}
}
```
-#### 4. kafka
+#### 3. kafka
```json
{
"job" : {
@@ -291,7 +252,7 @@
}
}
```
-#### 5. kafka->Hive
+#### 4. kafka->Hive
```json
{
"job": {
81 changes: 8 additions & 73 deletions docs/realTime/writer/kafkawriter.md
@@ -1,15 +1,14 @@
# Kafka Writer

## 1. Plugin name
-The kafka plugin comes in four versions; the plugin name varies slightly with the kafka version, as shown in the table below:
+The kafka plugin comes in three versions; the plugin name varies slightly with the kafka version, as shown in the table below:

| kafka version | plugin name |
| --- | --- |
-| kafka 0.9 | kafka09writer |
| kafka 0.10 | kafka10writer |
| kafka 0.11 | kafka11writer |
| kafka 1.0 and later | kafkawriter |

+Note: kafka 0.9 is no longer supported as of FlinkX 1.11


## 2. Parameters
@@ -30,29 +29,12 @@

<br />

-- **encoding**
-  - Description: encoding
-  - Note: this parameter only applies to the kafka09 reader plugin
-  - Required: no
-  - Field type: String
-  - Default: UTF-8
-
-<br />
-
-- **brokerList**
-  - Description: list of kafka broker addresses
-  - Note: this parameter only applies to the kafka09writer plugin
-  - Required: required for kafka09writer; other kafka writer plugins do not use it
-  - Field type: String
-  - Default: none
-
-<br />
-
- **producerSettings**
  - Description: kafka connection settings; every option defined in `org.apache.kafka.clients.producer.ProducerConfig` is supported
-  - Required: required for all non-kafka09 writer plugins, and producerSettings must contain at least the `bootstrap.servers` parameter
+  - Required: yes
  - Field type: Map
  - Default: none
+  - Note: producerSettings must contain at least the `bootstrap.servers` parameter

<br />
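As with the reader's consumerSettings, a minimal producerSettings map might look like the following sketch (the broker address is a placeholder):

```json
{
  "producerSettings": {
    "bootstrap.servers": "localhost:9092"
  }
}
```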

@@ -68,54 +50,7 @@


## 3. Configuration examples
-#### 1. kafka09
-```json
-{
-  "job": {
-    "content": [{
-      "reader": {
-        "name": "streamreader",
-        "parameter": {
-          "column": [
-            {
-              "name": "id",
-              "type": "id"
-            },
-            {
-              "name": "user_id",
-              "type": "int"
-            },
-            {
-              "name": "name",
-              "type": "string"
-            }
-          ],
-          "sliceRecordCount" : ["100"]
-        }
-      },
-      "writer" : {
-        "parameter": {
-          "timezone": "UTC",
-          "topic": "kafka09",
-          "encoding": "UTF-8",
-          "brokerList": "0.0.0.1:9092",
-          "tableFields": ["id","user_id","name"]
-        },
-        "name": "kafka09writer"
-      }
-    } ],
-    "setting": {
-      "restore" : {
-        "isStream" : true
-      },
-      "speed" : {
-        "channel" : 1
-      }
-    }
-  }
-}
-```
-#### 2. kafka10
+#### 1. kafka10
```json
{
"job": {
@@ -163,7 +98,7 @@
}
}
```
-#### 3. kafka11
+#### 2. kafka11
```json
{
"job": {
@@ -212,7 +147,7 @@
}
}
```
-#### 4. kafka
+#### 3. kafka
```json
{
"job": {
@@ -260,7 +195,7 @@
}
}
```
-#### 5. MySQL->kafka
+#### 4. MySQL->kafka
```json
{
"job" : {
33 changes: 8 additions & 25 deletions flinkx-core/pom.xml
@@ -27,7 +27,7 @@
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
-<version>1.7.10</version>
+<version>1.7.30</version>
</dependency>

<dependency>
@@ -51,32 +51,21 @@

<dependency>
<groupId>org.apache.flink</groupId>
-<artifactId>flink-runtime-web_2.11</artifactId>
+<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>

-<!--Use this dependency if you are using the DataStream API-->
-<dependency>
-<groupId>org.apache.flink</groupId>
-<artifactId>flink-streaming-java_2.11</artifactId>
-<version>${flink.version}</version>
-<exclusions>
-<exclusion>
-<groupId>org.xerial.snappy</groupId>
-<artifactId>snappy-java</artifactId>
-</exclusion>
-</exclusions>
-</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
-<artifactId>flink-clients_2.11</artifactId>
+<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<!-- Flink / Hadoop MapReduce compatibility package -->
<!--https://ci.apache.org/projects/flink/flink-docs-stable/dev/batch/hadoop_compatibility.html-->
<dependency>
<groupId>org.apache.flink</groupId>
-<artifactId>flink-hadoop-compatibility_2.11</artifactId>
+<artifactId>flink-hadoop-compatibility_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<exclusions>
<exclusion>
@@ -86,15 +75,9 @@
</exclusions>
</dependency>

-<dependency>
-<groupId>commons-cli</groupId>
-<artifactId>commons-cli</artifactId>
-<version>1.2</version>
-</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
-<artifactId>flink-yarn_2.11</artifactId>
+<artifactId>flink-yarn_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<exclusions>
<exclusion>
@@ -106,7 +89,7 @@

<dependency>
<groupId>org.apache.flink</groupId>
-<artifactId>flink-queryable-state-runtime_2.11</artifactId>
+<artifactId>flink-queryable-state-runtime_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
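The `${scala.binary.version}` and `${flink.version}` placeholders introduced above rely on Maven properties defined elsewhere, typically in the parent pom. A hypothetical sketch of what that definition could look like, with values taken from the version table in this commit (check the project's actual parent pom for the real names and values):

```xml
<properties>
    <!-- Assumed property names/values for illustration only. -->
    <scala.binary.version>2.11</scala.binary.version>
    <flink.version>1.11.3</flink.version>
</properties>
```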

