Add Kryo registration for two classes required for Spark 2.3.0. #1897

Merged (1 commit) on Feb 7, 2018
@@ -283,9 +283,11 @@ class ADAMKryoRegistrator extends KryoRegistrator with Logging {
     try {
       kryo.register(Class.forName("org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage"))
       kryo.register(Class.forName("org.apache.spark.sql.execution.datasources.FileFormatWriter$WriteTaskResult"))
+      kryo.register(Class.forName("org.apache.spark.sql.execution.datasources.BasicWriteTaskStats"))
+      kryo.register(Class.forName("org.apache.spark.sql.execution.datasources.ExecutedWriteSummary"))
     } catch {
       case cnfe: java.lang.ClassNotFoundException => {
-        log.info("Did not find Spark internal class. This is expected for Spark 1.")
+        log.info("Did not find Spark internal class. This is expected for earlier Spark versions.")
Contributor:
Can you say Spark < 2.3.0, or is that incorrect?

Member Author:
So the first two kryo.register calls in this try were already there for a Spark 1 to Spark 2 incompatibility, and the two I add in this PR are for a pre-2.3 to 2.3 incompatibility.
But I think just saying < 2.3 as you suggest is probably fine; will do unless @heuermh comments otherwise.
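For readers outside the thread: the pattern at issue registers Spark-internal classes reflectively and tolerates their absence, because those classes only exist in some Spark versions. A minimal standalone sketch of that idea, in the same spirit as the diff above (the class name `VersionTolerantRegistrator` is illustrative, not ADAM's actual code; the version notes on the 2.3.0 classes follow this PR's title):

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator

// Illustrative only: registers version-dependent Spark-internal classes
// by name, skipping any that this Spark build does not ship.
class VersionTolerantRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    val versionDependent = Seq(
      // absent in older Spark releases
      "org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage",
      // added in Spark 2.3.0
      "org.apache.spark.sql.execution.datasources.BasicWriteTaskStats",
      "org.apache.spark.sql.execution.datasources.ExecutedWriteSummary"
    )
    versionDependent.foreach { name =>
      try {
        kryo.register(Class.forName(name))
      } catch {
        case _: ClassNotFoundException => () // expected on earlier Spark versions
      }
    }
  }
}
```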

Member:

TBH the errors with Spark 1 won't happen any more, since we don't support it. There are a few other places in this file that could be updated as well, but that is a low priority.

       }
     }
     kryo.register(classOf[org.apache.spark.sql.catalyst.expressions.UnsafeRow])
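For anyone wiring this up downstream: a registrator like this only takes effect when the job is configured to use Kryo serialization and is pointed at the registrator class. A minimal sketch, assuming Spark 2.x and ADAM's registrator at `org.bdgenomics.adam.serialization.ADAMKryoRegistrator` (verify the fully qualified name against your ADAM version):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Enable Kryo and point it at the registrator modified in this diff.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "org.bdgenomics.adam.serialization.ADAMKryoRegistrator")

val spark = SparkSession.builder()
  .appName("adam-kryo-example")
  .master("local[*]")
  .config(conf)
  .getOrCreate()
```

With this in place, the try/catch above makes registration degrade gracefully: on a pre-2.3 cluster the two new classes are simply skipped and the info message is logged instead of failing the job.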