
feat: Support Spark Expression Encode#4315

Open
YutaLin wants to merge 13 commits into apache:main from YutaLin:3183_support_spark_expression_encode

Conversation

@YutaLin commented May 13, 2026

Which issue does this PR close?

Closes #3183

Rationale for this change

Support expression Encode

What changes are included in this PR?

  • Add StringEncode in string serde
  • Update shims in spark3.4/3.5/4.0/4.1/4.2 to catch Encode

How are these changes tested?

Add encode.sql and run it in spark 3.4/3.5/4.0

Resolved comment threads (outdated):
  • spark/src/main/spark-4.0/org/apache/comet/shims/CometExprShim.scala
  • spark/src/main/scala/org/apache/comet/serde/strings.scala
@andygrove (Member) commented

Thanks @YutaLin. LGTM overall. Could you address the feedback? Then I'll kick off CI.

@YutaLin (Author) commented May 13, 2026

Hi @andygrove, thanks for the review!
I've extracted the encode method and added a null check.

About "Spark accepts utf8 as an alias for UTF-8": Spark only accepts such aliases through 3.5, because it delegates to the JDK's Charset.forName. Since 4.0 there is a whitelist check, so aliases are no longer supported. I'd suggest we support only utf-8 for now, WDYT?

https://spark.apache.org/docs/4.0.0/sql-migration-guide.html#upgrading-from-spark-sql-35-to-40

Since Spark 4.0, the encode() and decode() functions support only the following charsets ‘US-ASCII’, ‘ISO-8859-1’, ‘UTF-8’, ‘UTF-16BE’, ‘UTF-16LE’, ‘UTF-16’, ‘UTF-32’. To restore the previous behavior when the function accepts charsets of the current JDK used by Spark, set spark.sql.legacy.javaCharsets to true.
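The behavioral difference described above can be sketched in plain Java. This is a minimal illustration, not Spark's actual code: the `SPARK4_WHITELIST` set is a hypothetical stand-in mirroring the charset list from the migration guide, while the `Charset.forName` call shows the real JDK alias resolution that pre-4.0 Spark relied on.

```java
import java.nio.charset.Charset;
import java.util.Set;

public class CharsetAliasDemo {
    // Hypothetical stand-in for Spark 4.0's allowed charsets, per the migration guide
    static final Set<String> SPARK4_WHITELIST = Set.of(
            "US-ASCII", "ISO-8859-1", "UTF-8", "UTF-16BE", "UTF-16LE", "UTF-16", "UTF-32");

    public static void main(String[] args) {
        // Pre-4.0 behavior: the JDK resolves "utf8" as an alias, case-insensitively
        System.out.println(Charset.forName("utf8").name()); // canonical name: UTF-8

        // 4.0 behavior (sketch): a literal name check rejects the alias
        System.out.println(SPARK4_WHITELIST.contains("utf8"));  // false
        System.out.println(SPARK4_WHITELIST.contains("UTF-8")); // true
    }
}
```

This is why matching only the canonical name utf-8 on the Comet side stays consistent with Spark 4.0's stricter check (unless spark.sql.legacy.javaCharsets is enabled).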

@YutaLin YutaLin requested a review from andygrove May 13, 2026 22:17

Development

Successfully merging this pull request may close these issues.

[Feature] Support Spark expression: encode

2 participants