
Commit cb8721f

pan3793 and yanghua authored and committed
[KYUUBI #1629] Flink backend implementation
### _Why are the changes needed?_

This PR covers #1619. Overall, it contains the following changes:

1. change the `build/dist` script to support the Flink SQL engine
2. enhance `externals/kyuubi-flink-sql-engine/pom.xml` to support creating a shaded jar
3. simplify `externals/kyuubi-flink-sql-engine/bin/flink-sql-engine.sh`
4. introduce `FlinkSQLEngine` (the Flink SQL engine entrypoint) and `FlinkProcessBuilder` (the Kyuubi server launcher)
5. add unit tests on the Kyuubi server side

After this PR, we can run a basic query, e.g. `select now()`, from beeline and get a result, and the Kyuubi server can automatically launch a Flink engine if no suitable one is available. The Flink engine also supports the other engine share levels defined in Kyuubi. The implementation is based on the Flink 1.14 codebase.

### _How was this patch tested?_

- [ ] Add some test cases that check the changes thoroughly, including negative and positive cases if possible
- [ ] Add screenshots for manual tests if appropriate
- [x] [Run tests](https://kyuubi.readthedocs.io/en/latest/develop_tools/testing.html#running-tests) locally before making a pull request

Closes #1629 from pan3793/flink-backend.

Closes #1629

b7e5f0e [Cheng Pan] revert tgz name change
a4496c3 [Cheng Pan] Fix reflection
3b4e86a [Cheng Pan] deps
68efa42 [Cheng Pan] log
8a9e37f [Cheng Pan] nit
10fb2bc [Cheng Pan] CI
c1560fd [Cheng Pan] nit
303e2f1 [Cheng Pan] Restore log conf
d84720b [Cheng Pan] SessionContext
b258d81 [Cheng Pan] cleanup
16edd52 [Cheng Pan] Cleanup
9ae5455 [Cheng Pan] Fix CI
25b6b57 [Cheng Pan] hadoop-client-api
c12b5ca [Cheng Pan] Server UT pass
502d3f0 [Cheng Pan] pass
dac4323 [yanghua] Launch local flink engine container successfully

Lead-authored-by: Cheng Pan <chengpan@apache.org>
Co-authored-by: yanghua <yanghua1127@gmail.com>
Signed-off-by: Kent Yao <yao@apache.org>
1 parent e1587ee commit cb8721f

File tree

32 files changed: +1249 −238 lines changed


.github/workflows/master.yml

Lines changed: 1 addition & 0 deletions
@@ -110,6 +110,7 @@ jobs:
           name: unit-tests-log
           path: |
             **/target/unit-tests.log
+            **/kyuubi-flink-sql-engine.log*
             **/kyuubi-spark-sql-engine.log*
             **/target/scalastyle-output.xml

.github/workflows/nightly.yml

Lines changed: 1 addition & 0 deletions
@@ -54,4 +54,5 @@ jobs:
           name: unit-tests-log
           path: |
             **/target/unit-tests.log
+            **/kyuubi-flink-sql-engine.log*
             **/kyuubi-spark-sql-engine.log*

build/dist

Lines changed: 25 additions & 3 deletions
@@ -30,7 +30,7 @@ set -e
 KYUUBI_HOME="$(cd "`dirname "$0"`/.."; pwd)"
 DISTDIR="$KYUUBI_HOME/dist"
 MAKE_TGZ=false
-# TODO: add FLINK_PROVIDED option
+FLINK_PROVIDED=false
 SPARK_PROVIDED=false
 NAME=none
 MVN="$KYUUBI_HOME/build/mvn"
@@ -62,6 +62,9 @@ while (( "$#" )); do
     --tgz)
       MAKE_TGZ=true
       ;;
+    --flink-provided)
+      FLINK_PROVIDED=true
+      ;;
     --spark-provided)
       SPARK_PROVIDED=true
       ;;
@@ -124,6 +127,11 @@ SCALA_VERSION=$("$MVN" help:evaluate -Dexpression=scala.binary.version $@ 2>/dev
   | grep -v "WARNING"\
   | tail -n 1)

+FLINK_VERSION=$("$MVN" help:evaluate -Dexpression=flink.version $@ 2>/dev/null\
+  | grep -v "INFO"\
+  | grep -v "WARNING"\
+  | tail -n 1)
+
 SPARK_VERSION=$("$MVN" help:evaluate -Dexpression=spark.version $@ 2>/dev/null\
   | grep -v "INFO"\
   | grep -v "WARNING"\
@@ -144,7 +152,7 @@ HIVE_VERSION=$("$MVN" help:evaluate -Dexpression=hive.version $@ 2>/dev/null\
   | grep -v "WARNING"\
   | tail -n 1)

-echo "Building Kyuubi package of version $VERSION against Spark version - $SPARK_VERSION"
+echo "Building Kyuubi package of version $VERSION against Flink $FLINK_VERSION, Spark $SPARK_VERSION"

 SUFFIX="-$NAME"
 if [[ "$NAME" == "none" ]]; then
@@ -163,7 +171,7 @@ fi

 MVN_DIST_OPT="-DskipTests"
 if [[ "$SPARK_PROVIDED" == "true" ]]; then
-  MVN_DIST_OPT="$MVN_DIST_OPT -Pspark-provided"
+  MVN_DIST_OPT="$MVN_DIST_OPT -Pflink-provided,spark-provided"
 fi

 BUILD_COMMAND=("$MVN" clean install $MVN_DIST_OPT $@)
@@ -178,6 +186,8 @@ rm -rf "$DISTDIR"
 mkdir -p "$DISTDIR/pid"
 mkdir -p "$DISTDIR/logs"
 mkdir -p "$DISTDIR/work"
+mkdir -p "$DISTDIR/externals/engines/flink"
+mkdir -p "$DISTDIR/externals/engines/flink/lib"
 mkdir -p "$DISTDIR/externals/engines/spark"
 mkdir -p "$DISTDIR/beeline-jars"
 echo "Kyuubi $VERSION $GITREVSTRING built for" > "$DISTDIR/RELEASE"
@@ -205,6 +215,12 @@ for jar in $(ls "$DISTDIR/jars/"); do
 done
 cd -

+# Copy flink engines
+cp -r "$KYUUBI_HOME/externals/kyuubi-flink-sql-engine/bin/" "$DISTDIR/externals/engines/flink/bin/"
+chmod a+x "$DISTDIR/externals/engines/flink/bin/flink-sql-engine.sh"
+cp -r "$KYUUBI_HOME/externals/kyuubi-flink-sql-engine/conf/" "$DISTDIR/externals/engines/flink/conf/"
+cp "$KYUUBI_HOME/externals/kyuubi-flink-sql-engine/target/kyuubi-flink-sql-engine_${SCALA_VERSION}-${VERSION}.jar" "$DISTDIR/externals/engines/flink/lib"
+
 # Copy spark engines
 cp "$KYUUBI_HOME/externals/kyuubi-spark-sql-engine/target/kyuubi-spark-sql-engine_${SCALA_VERSION}-${VERSION}.jar" "$DISTDIR/externals/engines/spark"

@@ -225,6 +241,12 @@ for SPARK_EXTENSION_VERSION in ${SPARK_EXTENSION_VERSIONS[@]}; do
   fi
 done

+if [[ "$FLINK_PROVIDED" != "true" ]]; then
+  # Copy flink binary dist
+  cp -r "$KYUUBI_HOME/externals/kyuubi-download/target/flink-$FLINK_VERSION/" \
+    "$DISTDIR/externals/flink-$FLINK_VERSION/"
+fi
+
 if [[ "$SPARK_PROVIDED" != "true" ]]; then
   # Copy spark binary dist
   cp -r "$KYUUBI_HOME/externals/kyuubi-download/target/spark-$SPARK_VERSION-bin-hadoop${SPARK_HADOOP_VERSION}$HIVE_VERSION_SUFFIX/" \
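The new `--flink-provided` flag mirrors the existing `--spark-provided` flag in the argument loop above. A minimal, self-contained sketch of that loop, assuming the same flag names (the `parse_dist_flags` function and the final `echo` are invented for illustration, not the script's actual structure):

```shell
#!/usr/bin/env bash
# Sketch of build/dist flag parsing after this change (simplified).
parse_dist_flags() {
  MAKE_TGZ=false
  FLINK_PROVIDED=false
  SPARK_PROVIDED=false
  while (( "$#" )); do
    case "$1" in
      --tgz) MAKE_TGZ=true ;;
      --flink-provided) FLINK_PROVIDED=true ;;
      --spark-provided) SPARK_PROVIDED=true ;;
    esac
    shift
  done
}

# e.g. building a tarball that excludes the bundled Flink binary dist:
parse_dist_flags --tgz --flink-provided
echo "tgz=$MAKE_TGZ flink_provided=$FLINK_PROVIDED spark_provided=$SPARK_PROVIDED"
```

Note that in the committed diff the `-Pflink-provided` Maven profile is still only activated inside the `SPARK_PROVIDED` branch, so the two flags are not yet fully independent at the Maven level.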

docs/deployment/settings.md

Lines changed: 1 addition & 0 deletions
@@ -300,6 +300,7 @@
 kyuubi\.session\.conf<br>\.ignore\.list|<div style='width: 65pt;word-wrap: break-word;white-space: normal'></div>|<div style='width: 170pt;word-wrap: break-word;white-space: normal'>A comma separated list of ignored keys. If the client connection contains any of them, the key and the corresponding value will be removed silently during engine bootstrap and connection setup. Note that this rule is for server-side protection defined via administrators to prevent some essential configs from tampering but will not forbid users to set dynamic configurations via SET syntax.</div>|<div style='width: 30pt'>seq</div>|<div style='width: 20pt'>1.2.0</div>
 kyuubi\.session\.conf<br>\.restrict\.list|<div style='width: 65pt;word-wrap: break-word;white-space: normal'></div>|<div style='width: 170pt;word-wrap: break-word;white-space: normal'>A comma separated list of restricted keys. If the client connection contains any of them, the connection will be rejected explicitly during engine bootstrap and connection setup. Note that this rule is for server-side protection defined via administrators to prevent some essential configs from tampering but will not forbid users to set dynamic configurations via SET syntax.</div>|<div style='width: 30pt'>seq</div>|<div style='width: 20pt'>1.2.0</div>
 kyuubi\.session\.engine<br>\.check\.interval|<div style='width: 65pt;word-wrap: break-word;white-space: normal'>PT1M</div>|<div style='width: 170pt;word-wrap: break-word;white-space: normal'>The check interval for engine timeout</div>|<div style='width: 30pt'>duration</div>|<div style='width: 20pt'>1.0.0</div>
+kyuubi\.session\.engine<br>\.flink\.main\.resource|<div style='width: 65pt;word-wrap: break-word;white-space: normal'>&lt;undefined&gt;</div>|<div style='width: 170pt;word-wrap: break-word;white-space: normal'>The package used to create Flink SQL engine remote job. If it is undefined, Kyuubi will use the default</div>|<div style='width: 30pt'>string</div>|<div style='width: 20pt'>1.4.0</div>
 kyuubi\.session\.engine<br>\.idle\.timeout|<div style='width: 65pt;word-wrap: break-word;white-space: normal'>PT30M</div>|<div style='width: 170pt;word-wrap: break-word;white-space: normal'>engine timeout, the engine will self-terminate when it's not accessed for this duration. 0 or negative means not to self-terminate.</div>|<div style='width: 30pt'>duration</div>|<div style='width: 20pt'>1.0.0</div>
 kyuubi\.session\.engine<br>\.initialize\.timeout|<div style='width: 65pt;word-wrap: break-word;white-space: normal'>PT3M</div>|<div style='width: 170pt;word-wrap: break-word;white-space: normal'>Timeout for starting the background engine, e.g. SparkSQLEngine.</div>|<div style='width: 30pt'>duration</div>|<div style='width: 20pt'>1.0.0</div>
 kyuubi\.session\.engine<br>\.launch\.async|<div style='width: 65pt;word-wrap: break-word;white-space: normal'>true</div>|<div style='width: 170pt;word-wrap: break-word;white-space: normal'>When opening kyuubi session, whether to launch backend engine asynchronously. When true, the Kyuubi server will set up the connection with the client without delay as the backend engine will be created asynchronously.</div>|<div style='width: 30pt'>boolean</div>|<div style='width: 20pt'>1.4.0</div>
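The new `kyuubi.session.engine.flink.main.resource` entry documented above can be set like any other Kyuubi option; a minimal sketch of an override in `kyuubi-defaults.conf` (the jar path is a made-up example):

```properties
# Hypothetical override: point the Flink engine at a custom engine jar.
# If left undefined, Kyuubi falls back to its default (per the table above).
kyuubi.session.engine.flink.main.resource=/opt/kyuubi/custom/kyuubi-flink-sql-engine.jar
```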
externals/kyuubi-flink-sql-engine/bin/flink-sql-engine.sh

Lines changed: 69 additions & 0 deletions
@@ -0,0 +1,69 @@
+#!/usr/bin/env bash
+################################################################################
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+################################################################################
+# Adopted from "flink" bash script
+################################################################################
+
+if [[ -z "$FLINK_HOME" || ! -d "$FLINK_HOME" ]]; then
+  (>&2 echo "Invalid FLINK_HOME: ${FLINK_HOME:-unset}")
+  exit 1
+fi
+
+FLINK_SQL_ENGINE_HOME="$(cd `dirname $0`/..; pwd)"
+if [[ "$FLINK_SQL_ENGINE_HOME" == "$KYUUBI_HOME/externals/engines/flink" ]]; then
+  FLINK_SQL_ENGINE_CONF_DIR="$FLINK_SQL_ENGINE_HOME/conf"
+  FLINK_SQL_ENGINE_LIB_DIR="$FLINK_SQL_ENGINE_HOME/lib"
+  FLINK_SQL_ENGINE_LOG_DIR="$KYUUBI_LOG_DIR"
+  FLINK_SQL_ENGINE_JAR=$(find "$FLINK_SQL_ENGINE_LIB_DIR" -regex ".*/kyuubi-flink-sql-engine_.*\.jar")
+  FLINK_HADOOP_CLASSPATH="$INTERNAL_HADOOP_CLASSPATHS"
+else
+  echo -e "\nFLINK_SQL_ENGINE_HOME $FLINK_SQL_ENGINE_HOME doesn't match production directory, assuming in development environment..."
+  FLINK_SQL_ENGINE_CONF_DIR="$FLINK_SQL_ENGINE_HOME/conf"
+  FLINK_SQL_ENGINE_LIB_DIR="$FLINK_SQL_ENGINE_HOME/target"
+  FLINK_SQL_ENGINE_LOG_DIR="$FLINK_SQL_ENGINE_HOME/target"
+  FLINK_SQL_ENGINE_JAR=$(find "$FLINK_SQL_ENGINE_LIB_DIR" -regex '.*/kyuubi-flink-sql-engine_.*\.jar$' | grep -v '\-javadoc.jar$' | grep -v '\-tests.jar$')
+  _FLINK_SQL_ENGINE_HADOOP_CLIENT_JARS=$(find $FLINK_SQL_ENGINE_LIB_DIR -regex '.*/hadoop-client-.*\.jar$' | tr '\n' ':')
+  FLINK_HADOOP_CLASSPATH="${_FLINK_SQL_ENGINE_HADOOP_CLIENT_JARS%:}"
+fi
+
+# do NOT let config.sh detect FLINK_HOME
+_FLINK_HOME_DETERMINED=1 . "$FLINK_HOME/bin/config.sh"
+
+FLINK_IDENT_STRING=${FLINK_IDENT_STRING:-"$USER"}
+FLINK_SQL_CLIENT_JAR=$(find "$FLINK_OPT_DIR" -regex ".*flink-sql-client.*.jar")
+CC_CLASSPATH=`constructFlinkClassPath`
+
+FULL_CLASSPATH="$FLINK_SQL_ENGINE_JAR:$FLINK_SQL_CLIENT_JAR:$CC_CLASSPATH:$FLINK_HADOOP_CLASSPATH"
+
+log_file="$FLINK_SQL_ENGINE_LOG_DIR/kyuubi-flink-sql-engine-$FLINK_IDENT_STRING-$HOSTNAME.log"
+log_setting=(
+  -Dlog.file="$log_file"
+  -Dlog4j.configurationFile=file:"$FLINK_SQL_ENGINE_CONF_DIR/log4j.properties"
+  -Dlog4j.configuration=file:"$FLINK_SQL_ENGINE_CONF_DIR/log4j.properties"
+  -Dlogback.configurationFile=file:"$FLINK_SQL_ENGINE_CONF_DIR/logback.xml"
+)
+
+if [ -n "$FLINK_SQL_ENGINE_JAR" ]; then
+  exec $JAVA_RUN ${FLINK_SQL_ENGINE_DYNAMIC_ARGS} "${log_setting[@]}" -cp ${FULL_CLASSPATH} \
+    org.apache.kyuubi.engine.flink.FlinkSQLEngine "$@"
+else
+  (>&2 echo "[ERROR] Flink SQL Engine JAR file 'kyuubi-flink-sql-engine*.jar' should be located in $FLINK_SQL_ENGINE_LIB_DIR.")
+  exit 1
+fi
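In the development branch of the script above, the discovered `hadoop-client-*` jars are joined into a classpath with `tr '\n' ':'` and the trailing separator is stripped with `${VAR%:}` parameter expansion. The idiom in isolation (the jar names here are invented for illustration):

```shell
#!/usr/bin/env bash
# Join lines with ':' and strip the trailing separator via ${VAR%:},
# mirroring how the engine script builds FLINK_HADOOP_CLASSPATH.
jars=$(printf '%s\n' 'hadoop-client-api-3.3.1.jar' 'hadoop-client-runtime-3.3.1.jar' | tr '\n' ':')
classpath="${jars%:}"
echo "$classpath"
```

`${jars%:}` removes the shortest match of the pattern `:` from the end of the value, leaving a clean colon-separated classpath.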
Lines changed: 43 additions & 0 deletions
@@ -0,0 +1,43 @@
+################################################################################
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+# This affects logging for both kyuubi-flink-sql-engine and Flink
+log4j.rootLogger=INFO, CA
+
+#Console Appender
+log4j.appender.CA=org.apache.log4j.ConsoleAppender
+log4j.appender.CA.layout=org.apache.log4j.PatternLayout
+log4j.appender.CA.layout.ConversionPattern=%d{HH:mm:ss.SSS} %p %c: %m%n
+log4j.appender.CA.Threshold = FATAL
+
+# Log all infos in the given file
+log4j.appender.file=org.apache.log4j.FileAppender
+log4j.appender.file.file=${log.file}
+log4j.appender.file.append=false
+log4j.appender.file.layout=org.apache.log4j.PatternLayout
+log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
+
+#File Appender
+log4j.appender.FA=org.apache.log4j.FileAppender
+log4j.appender.FA.append=false
+log4j.appender.FA.file=target/unit-tests.log
+log4j.appender.FA.layout=org.apache.log4j.PatternLayout
+log4j.appender.FA.layout.ConversionPattern=%d{HH:mm:ss.SSS} %t %p %c{2}: %m%n
+
+# Set the logger level of File Appender to DEBUG
+log4j.appender.FA.Threshold = DEBUG
Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one
+  ~ or more contributor license agreements.  See the NOTICE file
+  ~ distributed with this work for additional information
+  ~ regarding copyright ownership.  The ASF licenses this file
+  ~ to you under the Apache License, Version 2.0 (the
+  ~ "License"); you may not use this file except in compliance
+  ~ with the License.  You may obtain a copy of the License at
+  ~
+  ~     http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+
+<configuration>
+  <appender name="file" class="ch.qos.logback.core.FileAppender">
+    <file>${log.file}</file>
+    <append>false</append>
+    <encoder>
+      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{60} %X{sourceThread} - %msg%n</pattern>
+    </encoder>
+  </appender>
+
+  <!-- # This affects logging for both kyuubi-flink-sql-engine and Flink -->
+  <root level="INFO">
+    <appender-ref ref="file"/>
+  </root>
+</configuration>

externals/kyuubi-flink-sql-engine/pom.xml

Lines changed: 112 additions & 0 deletions
@@ -147,4 +147,116 @@
     </dependency>
   </dependencies>

+  <build>
+    <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
+    <testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
+    <plugins>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-shade-plugin</artifactId>
+        <configuration>
+          <shadedArtifactAttached>false</shadedArtifactAttached>
+          <artifactSet>
+            <includes>
+              <include>org.apache.kyuubi:kyuubi-common_${scala.binary.version}</include>
+              <include>org.apache.kyuubi:kyuubi-ha_${scala.binary.version}</include>
+              <include>com.fasterxml.jackson.core:*</include>
+              <include>com.fasterxml.jackson.module:*</include>
+              <include>com.google.guava:failureaccess</include>
+              <include>com.google.guava:guava</include>
+              <include>commons-codec:commons-codec</include>
+              <include>io.netty:netty-all</include>
+              <include>org.apache.commons:commons-lang3</include>
+              <include>org.apache.curator:curator-client</include>
+              <include>org.apache.curator:curator-framework</include>
+              <include>org.apache.curator:curator-recipes</include>
+              <include>org.apache.hive:hive-service-rpc</include>
+              <include>org.apache.thrift:*</include>
+              <include>org.apache.zookeeper:*</include>
+            </includes>
+          </artifactSet>
+          <relocations>
+            <relocation>
+              <pattern>com.fasterxml.jackson</pattern>
+              <shadedPattern>${kyuubi.shade.packageName}.com.fasterxml.jackson</shadedPattern>
+              <includes>
+                <include>com.fasterxml.jackson.**</include>
+              </includes>
+            </relocation>
+            <relocation>
+              <pattern>org.apache.curator</pattern>
+              <shadedPattern>${kyuubi.shade.packageName}.org.apache.curator</shadedPattern>
+              <includes>
+                <include>org.apache.curator.**</include>
+              </includes>
+            </relocation>
+            <relocation>
+              <pattern>com.google.common</pattern>
+              <shadedPattern>${kyuubi.shade.packageName}.com.google.common</shadedPattern>
+              <includes>
+                <include>com.google.common.**</include>
+              </includes>
+            </relocation>
+            <relocation>
+              <pattern>org.apache.commons</pattern>
+              <shadedPattern>${kyuubi.shade.packageName}.org.apache.commons</shadedPattern>
+              <includes>
+                <include>org.apache.commons.**</include>
+              </includes>
+            </relocation>
+            <relocation>
+              <pattern>io.netty</pattern>
+              <shadedPattern>${kyuubi.shade.packageName}.io.netty</shadedPattern>
+              <includes>
+                <include>io.netty.**</include>
+              </includes>
+            </relocation>
+            <relocation>
+              <pattern>org.apache.hive.service.rpc.thrift</pattern>
+              <shadedPattern>${kyuubi.shade.packageName}.org.apache.hive.service.rpc.thrift</shadedPattern>
+              <includes>
+                <include>org.apache.hive.service.rpc.thrift.**</include>
+              </includes>
+            </relocation>
+            <relocation>
+              <pattern>org.apache.thrift</pattern>
+              <shadedPattern>${kyuubi.shade.packageName}.org.apache.thrift</shadedPattern>
+              <includes>
+                <include>org.apache.thrift.**</include>
+              </includes>
+            </relocation>
+            <relocation>
+              <pattern>org.apache.zookeeper</pattern>
+              <shadedPattern>${kyuubi.shade.packageName}.org.apache.zookeeper</shadedPattern>
+              <includes>
+                <include>org.apache.zookeeper.**</include>
+              </includes>
+            </relocation>
+          </relocations>
+        </configuration>
+        <executions>
+          <execution>
+            <phase>package</phase>
+            <goals>
+              <goal>shade</goal>
+            </goals>
+          </execution>
+        </executions>
+      </plugin>
+
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-jar-plugin</artifactId>
+        <executions>
+          <execution>
+            <id>prepare-test-jar</id>
+            <phase>test-compile</phase>
+            <goals>
+              <goal>test-jar</goal>
+            </goals>
+          </execution>
+        </executions>
+      </plugin>
+    </plugins>
+  </build>
 </project>
