
Spark: Fix NPEs in spark value converter #4663

Merged
6 commits merged into apache:master on Apr 29, 2022

Conversation

@edgarRd (Contributor) commented Apr 28, 2022

This PR adds tests for SparkValueConverter across Spark versions and fixes several NPEs that occur when a Spark row containing null values is converted to an Iceberg record, across different schemas, including:

  • Spark 2.4 with Scala 2.11 fails to unwrap null values for maps and lists
  • Spark 3.1 is affected by SPARK-37654; this should be fixed in 3.1.3 but that requires a version upgrade
  • All versions fail to handle null structs within a row
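The failures above are all variations of the same pattern: conversion helpers dereference a Spark value without first checking it for null. A minimal sketch of the null-propagating guard the fix applies, where `convertNullable` is a hypothetical helper for illustration, not an Iceberg API:

```java
import java.util.function.Function;

public class NullSafeConvert {
    // Hypothetical helper illustrating the pattern of the fix: propagate null
    // instead of dereferencing a null Spark value during conversion.
    static <S, T> T convertNullable(S value, Function<S, T> converter) {
        return value == null ? null : converter.apply(value);
    }

    public static void main(String[] args) {
        // A null struct/map/list converts to null rather than throwing an NPE.
        System.out.println(convertNullable((String) null, String::length)); // prints "null"
        System.out.println(convertNullable("abc", String::length));         // prints "3"
    }
}
```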

@github-actions github-actions bot added the spark label Apr 28, 2022
return convertedMap;
} catch (NullPointerException npe) {
// Scala 2.11 fix: catch NPE because the internal value could be null and the
// Scala wrapper does not evaluate it until iteration.
Contributor
Can you explain this a bit more? I don't understand what is happening in Scala 2.11. Is it that the object returned by map.entrySet() is non-null, but calling iterator on it will result in NPE?

Is there an alternative to this? If the value returned by Iterable.iterator() is null, can we catch that directly rather than using a catch block?

@edgarRd (Contributor, Author) Apr 29, 2022

Unfortunately, I don't think we can access the underlying map that is null. The line throwing the NPE is https://github.com/scala/scala/blob/2.11.x/src/library/scala/collection/convert/Wrappers.scala#L184: here underlying.iterator throws the NPE because underlying is null, as would any other Map method that touches underlying, such as Map#size().
In fact, Map#entrySet returns a valid anonymous set, but the NPE is raised only when its iterator() method is called.

I wonder whether reflection could be used to check underlying for null, but I don't think that's a better approach. Open to suggestions. Thanks!
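To make the failure mode concrete, here is a self-contained Java stand-in that mimics the lazy behavior of Scala 2.11's Wrappers.MapWrapper described above. LazyMapView is hypothetical and purely illustrative: constructing the view over a null map succeeds, and the NPE only surfaces when iterator() is finally called.

```java
import java.util.Iterator;
import java.util.Map;

public class LazyNpeDemo {
    // Hypothetical stand-in for Scala 2.11's Wrappers.MapWrapper: it holds a
    // possibly-null underlying map and defers dereferencing it until iteration.
    static class LazyMapView<K, V> implements Iterable<Map.Entry<K, V>> {
        private final Map<K, V> underlying; // may be null

        LazyMapView(Map<K, V> underlying) {
            this.underlying = underlying;
        }

        @Override
        public Iterator<Map.Entry<K, V>> iterator() {
            // Analogous to underlying.iterator in Wrappers.scala#L184: the NPE
            // is thrown here, not when the view was created.
            return underlying.entrySet().iterator();
        }
    }

    public static void main(String[] args) {
        LazyMapView<String, Integer> view = new LazyMapView<>(null); // no NPE yet
        try {
            view.iterator(); // NPE surfaces only now
            System.out.println("no NPE");
        } catch (NullPointerException npe) {
            System.out.println("NPE deferred to iterator()");
        }
    }
}
```

This is why the PR's catch block is hard to avoid without reflection: by the time the converter holds the wrapper, every access path to the null underlying map goes through a method that throws.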

@@ -95,6 +107,10 @@ public static Object convert(Type type, Object object) {
}

  private static Record convert(Types.StructType struct, Row row) {
    if (row == null) {
      return null;
    }
Contributor
This looks reasonable to me.

@rdblue (Contributor) left a comment

Overall this looks fine, but I'd like to avoid catching NPE if possible.

@rdblue rdblue merged commit d496ca1 into apache:master Apr 29, 2022
@rdblue (Contributor) commented Apr 29, 2022

Thanks, @edgarRd! I'll mark this for inclusion in 0.13.2 as well.

@rdblue rdblue added this to the Iceberg 0.13.2 Release milestone Apr 29, 2022
@edgarRd edgarRd deleted the erod--npe-spark2-value-converter branch May 3, 2022 02:09
nastra pushed a commit to nastra/iceberg that referenced this pull request May 16, 2022
rdblue pushed a commit that referenced this pull request May 17, 2022
Co-authored-by: Edgar Rodriguez <edgar.rodriguez@airbnb.com>