[feat]Support spark3.3+ and spark2.2- compile #4301
Conversation
```diff
@@ -187,12 +187,21 @@
     <artifactId>linkis-rpc</artifactId>
     <version>${project.version}</version>
   </dependency>
+  <dependency>
+    <groupId>net.sf.py4j</groupId>
+    <artifactId>py4j</artifactId>
```
py4j should not be introduced separately here: the Spark jars directory already provides this dependency, and bundling it again will cause lower-version conflicts.
I have set the dependency scope to `provided`.
```diff
@@ -58,7 +60,7 @@ class JdbcSink extends DataCalcSink[JdbcSinkConfig] with Logging {
       .repartition(1)
       .foreachPartition((_: Iterator[Row]) => {
         val jdbcOptions = new JDBCOptions(options)
-        val conn: Connection = JdbcUtils.createConnectionFactory(jdbcOptions)()
+        val conn: Connection = createConnectionFactory(jdbcOptions)()
```
Replace with `DriverManager.getConnection(config.getUrl, config.getUser, config.getPassword)` here.
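A minimal sketch of the suggested replacement, assuming `config` exposes `getUrl`, `getUser`, and `getPassword` as in the review comment. Using plain JDBC sidesteps the Spark-internal `JdbcUtils.createConnectionFactory`, which was relocated in Spark 3.3 (SPARK-38361):

```scala
import java.sql.{Connection, DriverManager}

// Open the connection with plain JDBC instead of Spark's internal factory,
// so the same code compiles against both Spark < 3.3 and >= 3.3.
val conn: Connection =
  DriverManager.getConnection(config.getUrl, config.getUser, config.getPassword)
try {
  // ... write the partition's rows over `conn` ...
} finally {
  conn.close() // always release the connection, even on failure
}
```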
```scala
          break()
        }
      })
  }
```
Is this right?
LGTM.
This reverts commit f5a9bfb.
What is the purpose of the change
1. `JdbcUtils#createConnectionFactory` was moved after Spark 3.3.0, so we can improve the code to support Spark 3.3+.
2. `OrcFileFormat` may be absent before Spark 2.2.1, so we can improve the code to support lower Spark versions.
3. `HiveTableRelation` may be absent before Spark 2.2.1, so we can improve the code to support lower Spark versions.
4. `queryTimeout` may be absent before Spark 2.4.0, so we can improve the code to support lower Spark versions.
5. `authToken` may be absent before Spark 2.4.0, so we can improve the code to support lower Spark versions.
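One way to keep a reference like `OrcFileFormat` compiling across Spark versions where the class is absent or relocated is to resolve it reflectively. A hedged sketch; the candidate class names are assumptions for illustration, not taken from this PR:

```scala
// Resolve OrcFileFormat at runtime instead of referencing it statically,
// so compilation does not break on Spark versions lacking the class.
def loadOrcFileFormat(): Option[Any] =
  Seq(
    "org.apache.spark.sql.execution.datasources.orc.OrcFileFormat", // newer Spark
    "org.apache.spark.sql.hive.orc.OrcFileFormat" // older builds (assumed location)
  ).view
    .flatMap { name =>
      try Some(Class.forName(name).getDeclaredConstructor().newInstance())
      catch { case _: ReflectiveOperationException => None } // class missing here
    }
    .headOption
```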
Related issues/PRs
Related issues: #4298
Related pr: #4301
Brief change log
1. Rewrite `JdbcUtils#createConnectionFactory` in Linkis to support higher Spark versions (the method `createConnectionFactory` was moved; see https://issues.apache.org/jira/browse/SPARK-38361).
2. Use reflection to keep the `OrcFileFormat` code compatible with higher Spark versions.
3. Comment out `HiveTableRelation` to keep that code compatible with lower Spark versions (for details see
https://github.com/apache/spark/blob/v2.2.1/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/interface.scala and
https://github.com/apache/spark/blob/v2.2.0/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/interface.scala).
4. Check the Spark version at runtime to keep the `queryTimeout` code compatible with lower Spark versions.
5. Use a higher py4j version to keep the `authToken` code compatible with lower PySpark versions.

Please note: for lower versions, if compilation fails because of netty, you may try compiling the source code with `-Dnetty.version=4.1.51.Final`.
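The version gate for `queryTimeout` could look like the following sketch. The `versionAtLeast` helper and the surrounding statement are illustrative assumptions, not code from this PR:

```scala
import org.apache.spark.SPARK_VERSION

// queryTimeout support was added to Spark's JDBC options in 2.4.0,
// so only apply it when the runtime Spark version is new enough.
def versionAtLeast(version: String, major: Int, minor: Int): Boolean = {
  val parts = version.split("\\.").take(2).map(_.filter(_.isDigit).toInt)
  parts(0) > major || (parts(0) == major && parts(1) >= minor)
}

if (versionAtLeast(SPARK_VERSION, 2, 4)) {
  statement.setQueryTimeout(queryTimeoutSeconds) // skipped on Spark < 2.4
}
```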
Checklist