[SPARK-34110][BUILD] Upgrade Zookeeper to 3.6.2
### What changes were proposed in this pull request?

This PR upgrades ZooKeeper to 3.6.2.

### Why are the changes needed?

To make Spark run on JDK 14. Without the upgrade, the ZooKeeper 3.4.14 client fails with:
```
21/01/13 20:25:32,533 WARN [Driver-SendThread(apache-spark-zk-3.vip.hadoop.com:2181)] zookeeper.ClientCnxn:1164 : Session 0x0 for server apache-spark-zk-3.vip.hadoop.com/<unresolved>:2181, unexpected error, closing socket connection and attempting reconnect
java.lang.IllegalArgumentException: Unable to canonicalize address apache-spark-zk-3.vip.hadoop.com/<unresolved>:2181 because it's not resolvable
	at org.apache.zookeeper.SaslServerPrincipal.getServerPrincipal(SaslServerPrincipal.java:65)
	at org.apache.zookeeper.SaslServerPrincipal.getServerPrincipal(SaslServerPrincipal.java:41)
	at org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1001)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1060)
```

Please see [ZOOKEEPER-3779](https://issues.apache.org/jira/browse/ZOOKEEPER-3779) for more details.
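
The `<unresolved>` marker in the log is how JDK 14+ renders an unresolved `InetSocketAddress` (earlier JDKs print plain `host:port`); ZOOKEEPER-3779 tracks the client-side fix that 3.6.2 ships with. A minimal sketch of the JDK difference, using the made-up host name `zk.example.invalid`:

```java
import java.net.InetSocketAddress;

// Illustrates the JDK behaviour behind the "<unresolved>" marker seen in the
// WARN message above. "zk.example.invalid" is a placeholder, non-resolvable host.
public class UnresolvedAddressDemo {
    public static void main(String[] args) {
        InetSocketAddress addr =
            InetSocketAddress.createUnresolved("zk.example.invalid", 2181);

        // JDK 8/11 print roughly "zk.example.invalid:2181"; JDK 14+ print
        // "zk.example.invalid/<unresolved>:2181", the same form that shows
        // up in the ClientCnxn log and exception message.
        System.out.println(addr);

        // For an unresolved address, getAddress() is null, which corresponds
        // to the "because it's not resolvable" condition in the exception text.
        System.out.println("resolved InetAddress: " + addr.getAddress());
    }
}
```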

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manual test:
1. Replace zookeeper-3.4.14.jar with zookeeper-3.6.2.jar and zookeeper-jute-3.6.2.jar (a quick classpath check is sketched after this list).
2. Run Spark on JDK 14 with Hadoop 2.7 (including HADOOP-12760), Hive 1.2.1, and a ZooKeeper 3.4.6 server.
    Some key configurations:
    ```
    # spark-defaults.conf
    spark.yarn.appMasterEnv.JAVA_HOME              /apache/releases/jdk-14.0.2
    spark.executorEnv.JAVA_HOME                    /apache/releases/jdk-14.0.2
    # spark-env.sh
    export JAVA_HOME=/apache/releases/jdk-14.0.2
    ```
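
As a quick sanity check that the swapped jars are the ones actually loaded, a small probe like the one below can be run against the same classpath. This is only a sketch: the jar-location lookup uses plain JDK APIs, and `org.apache.zookeeper.Version.getFullVersion()` is assumed to be available in the 3.6.x client.

```java
import org.apache.zookeeper.Version;
import org.apache.zookeeper.ZooKeeper;

// Prints which jar the ZooKeeper client classes were loaded from and the
// version they report. getFullVersion() is an assumed 3.6.x client API.
public class ZkClientCheck {
    public static void main(String[] args) {
        System.out.println("zookeeper jar: "
            + ZooKeeper.class.getProtectionDomain().getCodeSource().getLocation());
        System.out.println("client version: " + Version.getFullVersion());
    }
}
```

If the replacement in step 1 took effect, the output should point at zookeeper-3.6.2.jar and report a 3.6.2 version string.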

Jenkins Tests
- Hadoop 3.2: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/134048/testReport
- Hadoop 2.7: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/134063/testReport

Closes #31177 from wangyum/SPARK-34110.

Authored-by: Yuming Wang <yumwang@ebay.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
wangyum authored and dongjoon-hyun committed Jan 16, 2021
1 parent f354883 commit d6906b3
Showing 3 changed files with 10 additions and 4 deletions.
dev/deps/spark-deps-hadoop-2.7-hive-2.3 (2 additions, 1 deletion)

```diff
@@ -240,5 +240,6 @@ xml-apis/1.4.01//xml-apis-1.4.01.jar
 xmlenc/0.52//xmlenc-0.52.jar
 xz/1.5//xz-1.5.jar
 zjsonpatch/0.3.0//zjsonpatch-0.3.0.jar
-zookeeper/3.4.14//zookeeper-3.4.14.jar
+zookeeper-jute/3.6.2//zookeeper-jute-3.6.2.jar
+zookeeper/3.6.2//zookeeper-3.6.2.jar
 zstd-jni/1.4.8-1//zstd-jni-1.4.8-1.jar
```
dev/deps/spark-deps-hadoop-3.2-hive-2.3 (2 additions, 1 deletion)

```diff
@@ -207,5 +207,6 @@ velocity/1.5//velocity-1.5.jar
 xbean-asm7-shaded/4.15//xbean-asm7-shaded-4.15.jar
 xz/1.5//xz-1.5.jar
 zjsonpatch/0.3.0//zjsonpatch-0.3.0.jar
-zookeeper/3.4.14//zookeeper-3.4.14.jar
+zookeeper-jute/3.6.2//zookeeper-jute-3.6.2.jar
+zookeeper/3.6.2//zookeeper-3.6.2.jar
 zstd-jni/1.4.8-1//zstd-jni-1.4.8-1.jar
```
pom.xml (6 additions, 2 deletions)

```diff
@@ -123,7 +123,7 @@
     <hadoop.version>3.2.2</hadoop.version>
     <protobuf.version>2.5.0</protobuf.version>
     <yarn.version>${hadoop.version}</yarn.version>
-    <zookeeper.version>3.4.14</zookeeper.version>
+    <zookeeper.version>3.6.2</zookeeper.version>
     <curator.version>2.13.0</curator.version>
     <hive.group>org.apache.hive</hive.group>
     <hive.classifier>core</hive.classifier>
@@ -1534,7 +1534,11 @@
         </exclusion>
         <exclusion>
           <groupId>io.netty</groupId>
-          <artifactId>netty</artifactId>
+          <artifactId>netty-handler</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>io.netty</groupId>
+          <artifactId>netty-transport-native-epoll</artifactId>
         </exclusion>
         <exclusion>
           <groupId>com.github.spotbugs</groupId>
```
