
[SPARK-21445] Make IntWrapper and LongWrapper in UTF8String Serializable #18660

Closed
wants to merge 1 commit into apache:master from brkyvz:serializableutf8

Conversation

brkyvz (Contributor) commented Jul 17, 2017

What changes were proposed in this pull request?

Making those two classes Serializable will avoid serialization issues like the one below:

Caused by: java.io.NotSerializableException: org.apache.spark.unsafe.types.UTF8String$IntWrapper
Serialization stack:
    - object not serializable (class: org.apache.spark.unsafe.types.UTF8String$IntWrapper, value: org.apache.spark.unsafe.types.UTF8String$IntWrapper@326450e)
    - field (class: org.apache.spark.sql.catalyst.expressions.Cast$$anonfun$castToInt$1, name: result$2, type: class org.apache.spark.unsafe.types.UTF8String$IntWrapper)
    - object (class org.apache.spark.sql.catalyst.expressions.Cast$$anonfun$castToInt$1, <function1>)
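
For clarity, a minimal sketch of what the fix looks like (class bodies are abbreviated; marking the value fields transient follows the "ignore them" discussion further down and is part of the sketch, not a quote of the diff):

```java
// Sketch: nested result holders in org.apache.spark.unsafe.types.UTF8String.
// Implementing java.io.Serializable lets closures that reference them pass
// the ClosureCleaner's serializability check; the held value itself is never
// meant to be shipped, hence transient.
public static class IntWrapper implements java.io.Serializable {
  public transient int value = 0;
}

public static class LongWrapper implements java.io.Serializable {
  public transient long value = 0;
}
```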

How was this patch tested?

  • [x] Manual testing
  • [ ] Unit test

zsxwing (Member) commented Jul 17, 2017

Is it safe to just ignore them? Maybe we should recover them in readExternal/read method?
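
For reference, the alternative raised here would look roughly like this; a hypothetical sketch only, not what this PR does (the `value` field mirrors the existing wrapper, everything else is illustrative):

```java
// Hypothetical alternative, sketched as a nested class of UTF8String:
// write and restore the wrapped value explicitly instead of ignoring it.
public static class IntWrapper implements java.io.Externalizable {
  public int value = 0;

  @Override
  public void writeExternal(java.io.ObjectOutput out) throws java.io.IOException {
    out.writeInt(value);
  }

  @Override
  public void readExternal(java.io.ObjectInput in) throws java.io.IOException {
    value = in.readInt();
  }
}
```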

brkyvz (Contributor, Author) commented Jul 18, 2017

I don't think we're actually trying to ship these values anywhere. We're failing to serialize the task during WholeStageCodegenExec:

at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2310)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1.apply(RDD.scala:841)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1.apply(RDD.scala:840)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
    at org.apache.spark.rdd.RDD.mapPartitionsWithIndex(RDD.scala:840)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:389)
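
To illustrate what the closure check is tripping over (a standalone sketch with made-up names, not Spark code): serializing a function object fails as soon as it captures a reference to a non-serializable helper, even though the helper's contents never need to travel.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ClosureCaptureDemo {
  // Stand-in for the pre-fix IntWrapper: a mutable holder with no
  // Serializable marker.
  static class ResultHolder {
    int value = 0;
  }

  public static void main(String[] args) throws IOException {
    ResultHolder result = new ResultHolder();

    // Stand-in for Cast$$anonfun$castToInt$1: a serializable function that
    // captures the holder.
    Runnable castClosure = (Runnable & Serializable) () -> result.value++;

    // Stand-in for ClosureCleaner.ensureSerializable: attempt a round trip.
    // This throws java.io.NotSerializableException for the holder class.
    try (ObjectOutputStream out =
             new ObjectOutputStream(new ByteArrayOutputStream())) {
      out.writeObject(castClosure);
    }
  }
}
```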

brkyvz (Contributor, Author) commented Jul 18, 2017

cc @cloud-fan and @tejasapatil who have more info based on:
c96d14a

zsxwing (Member) commented Jul 18, 2017

> I don't think we're actually trying to ship these values anywhere.

I see. They are static classes.

cloud-fan (Contributor) commented

good catch! LGTM

SparkQA commented Jul 18, 2017

Test build #79689 has finished for PR 18660 at commit d220290.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • public static class LongWrapper implements Serializable
  • public static class IntWrapper implements Serializable

asfgit pushed a commit that referenced this pull request Jul 18, 2017
## What changes were proposed in this pull request?

Making those two classes Serializable will avoid serialization issues like the one below:
```
Caused by: java.io.NotSerializableException: org.apache.spark.unsafe.types.UTF8String$IntWrapper
Serialization stack:
    - object not serializable (class: org.apache.spark.unsafe.types.UTF8String$IntWrapper, value: org.apache.spark.unsafe.types.UTF8String$IntWrapper@326450e)
    - field (class: org.apache.spark.sql.catalyst.expressions.Cast$$anonfun$castToInt$1, name: result$2, type: class org.apache.spark.unsafe.types.UTF8String$IntWrapper)
    - object (class org.apache.spark.sql.catalyst.expressions.Cast$$anonfun$castToInt$1, <function1>)
```

## How was this patch tested?

- [x] Manual testing
- [ ] Unit test

Author: Burak Yavuz <brkyvz@gmail.com>

Closes #18660 from brkyvz/serializableutf8.

(cherry picked from commit 26cd2ca)
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
brkyvz (Contributor, Author) commented Jul 18, 2017

I couldn't write an easy reproduction for the bug :(

cloud-fan (Contributor) commented

thanks, merging to master!

@brkyvz I think it's fine, this bug is very obvious.

asfgit closed this in 26cd2ca Jul 18, 2017
brkyvz (Contributor, Author) commented Jul 18, 2017

Also merged to branch-2.2

brkyvz (Contributor, Author) commented Jul 18, 2017

thanks @cloud-fan

MatthewRBruce pushed a commit to Shopify/spark that referenced this pull request Jul 31, 2018
brkyvz deleted the serializableutf8 branch February 3, 2019 20:54