Release for Scala 2.12.14 #282

Closed · dongjoon-hyun opened this issue May 29, 2021 · 4 comments

Comments
@dongjoon-hyun

No description provided.

dongjoon-hyun (Author) commented May 29, 2021

Hi, @SethTisue.
I saw the repo is ready for the 2.12.15 nightly, but I cannot find a Maven artifact for the 2.12.14 release. Could you publish it?
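
For context, genjavadoc is a Scala compiler plugin that is cross-published against the full Scala version, so each Scala patch release needs a matching artifact before downstream builds can move. A minimal sbt sketch of how such a build consumes it (the coordinates and plugin version shown here are illustrative, not taken from this thread):

```
// build.sbt -- minimal sketch; the genjavadoc version below is illustrative.
scalaVersion := "2.12.14"

// CrossVersion.full requests genjavadoc-plugin_2.12.14 (full Scala version suffix),
// which is why a fresh genjavadoc release is needed for every Scala patch release.
addCompilerPlugin(
  ("com.typesafe.genjavadoc" % "genjavadoc-plugin" % "0.17").cross(CrossVersion.full)
)

// Tell the plugin where to emit the generated Java sources for javadoc tooling.
Compile / scalacOptions += s"-P:genjavadoc:out=${target.value / "java"}"
```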

SethTisue (Member)

@raboof ?

raboof (Contributor) commented May 30, 2021

On its way to central!

raboof closed this as completed May 30, 2021

dongjoon-hyun (Author)

Thank you, @raboof and @SethTisue!

dongjoon-hyun added a commit to apache/spark that referenced this issue May 30, 2021
### What changes were proposed in this pull request?

This PR is the fourth attempt to upgrade Scala 2.12.x, to assess its feasibility.
- #27929 (Upgrade Scala to 2.12.11, wangyum)
- #30940 (Upgrade Scala to 2.12.12, viirya)
- #31223 (Upgrade Scala to 2.12.13, dongjoon-hyun)

Note that Scala 2.12.14 has the following fix for the Apache Spark community.
- Fix cyclic error in runtime reflection (protobuf), a regression that prevented Spark from upgrading to 2.12.13

REQUIREMENTS:
- [x] `silencer` library is released via ghik/silencer#66
- [x] `genjavadoc` library is released via lightbend/genjavadoc#282

### Why are the changes needed?

Apache Spark was stuck at 2.12.10 due to regressions in Scala 2.12.11/2.12.12/2.12.13. This upgrade brings all the bug fixes from:
- https://github.com/scala/scala/releases/tag/v2.12.14
- https://github.com/scala/scala/releases/tag/v2.12.13
- https://github.com/scala/scala/releases/tag/v2.12.12
- https://github.com/scala/scala/releases/tag/v2.12.11

### Does this PR introduce _any_ user-facing change?

Yes, but this is a bug-fix version upgrade.

### How was this patch tested?

Pass the CIs.

Closes #32697 from dongjoon-hyun/SPARK-31168.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>

wangyum added a commit to apache/spark that referenced this issue May 26, 2023
* [SPARK-31168][BUILD] Upgrade Scala to 2.12.14

(cherry picked from commit 6c4b60f)

* [SPARK-36759][BUILD] Upgrade Scala to 2.12.15

### What changes were proposed in this pull request?

This PR aims to upgrade Scala to 2.12.15 to support Java 17/18 better.

### Why are the changes needed?

Scala 2.12.15 improves compatibility with JDK 17 and 18:

https://github.com/scala/scala/releases/tag/v2.12.15

- Avoids IllegalArgumentException in JDK 17+ for lambda deserialization
- Upgrades to ASM 9.2, for JDK 18 support in optimizer

### Does this PR introduce _any_ user-facing change?

Yes, this is a Scala version change.

### How was this patch tested?

Pass the CIs

Closes #33999 from dongjoon-hyun/SPARK-36759.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>

(cherry picked from commit 16f1f71)

* [SPARK-36759][BUILD][FOLLOWUP] Update version in scala-2.12 profile and doc

### What changes were proposed in this pull request?

This is a follow-up to fix leftovers from switching the Scala version.

### Why are the changes needed?

This should be consistent.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

This is not covered by unit tests; it needs to be checked manually. There are no more `2.12.14` references:
```
$ git grep 2.12.14
R/pkg/tests/fulltests/test_sparkSQL.R:               c(as.Date("2012-12-14"), as.Date("2013-12-15"), as.Date("2014-12-16")))
data/mllib/ridge-data/lpsa.data:3.5307626,0.987291634724086 -0.36279314978779 -0.922212414640967 0.232904453212813 -0.522940888712441 1.79270085261407 0.342627053981254 1.26288870310799
sql/hive/src/test/resources/data/files/over10k:-3|454|65705|4294967468|62.12|14.32|true|mike white|2013-03-01 09:11:58.703087|40.18|joggying
```

Closes #34020 from dongjoon-hyun/SPARK-36759-2.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>

(cherry picked from commit adbea25)

* [SPARK-39414][BUILD] Upgrade Scala to 2.12.16

### What changes were proposed in this pull request?
This PR aims to upgrade Scala to 2.12.16.

### Why are the changes needed?
This version brings several bug fixes and begins work toward Java 19 support:

scala/scala@v2.12.15...v2.12.16

- [Upgrade to asm 9.3, for JDK19 support](scala/scala#10000)
- [Fix codegen for MH.invoke etc under JDK 17 -release](scala/scala#9930)
- [Deprecations related to SecurityManager on JDK 17](scala/scala#9775)

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
Pass GitHub Actions

Closes #36807 from LuciferYang/SPARK-39414.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>

(cherry picked from commit ed875a8)

* fix

* fix

* fix

* fix
```
[error] [warn] /home/jenkins/workspace/spark-sql-catalyst-3.0/core/src/main/scala/org/apache/spark/scheduler/SpillableTaskResultGetter.scala:36: non-variable type argument org.apache.spark.scheduler.DirectTaskResult[_] in type pattern scala.runtime.NonLocalReturnControl[org.apache.spark.scheduler.DirectTaskResult[_]] is unchecked since it is eliminated by erasure
[error] [warn] private[spark] class SpillableTaskResultGetter(sparkEnv: SparkEnv, scheduler: TaskSchedulerImpl)
[error] [warn]
```
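
The warning above is the usual type-erasure limitation: a type pattern with a concrete type argument (here `NonLocalReturnControl[DirectTaskResult[_]]`) cannot be verified at runtime. A self-contained sketch with hypothetical names (`ErasureSketch`, `SomeResult`, `handle`, `handleSafe`; not Spark's actual fix) that reproduces the warning and one common way to avoid it:

```
import scala.runtime.NonLocalReturnControl

object ErasureSketch {
  // Hypothetical stand-in for org.apache.spark.scheduler.DirectTaskResult[_].
  final case class SomeResult(value: Int)

  // Warned form: the [SomeResult] type argument is erased, so the pattern is unchecked.
  def handle(t: Throwable): Option[SomeResult] = t match {
    case nlrc: NonLocalReturnControl[SomeResult] => Some(nlrc.value)
    case _                                       => None
  }

  // Warning-free form: match on the erased type and cast the payload explicitly.
  def handleSafe(t: Throwable): Option[SomeResult] = t match {
    case nlrc: NonLocalReturnControl[_] => Some(nlrc.value.asInstanceOf[SomeResult])
    case _                              => None
  }
}
```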

* fix
```
[error] [warn] /home/jenkins/workspace/spark-sql-catalyst-3.0/mllib/src/main/scala/org/apache/spark/ml/feature/RFormulaParser.scala:287: match may not be exhaustive.
[error] It would fail on the following input: ~(~(_, (x: String forSome x not in "^")), _)
[error] [warn]   private val pow: Parser[Term] = term ~ "^" ~ "^[1-9]\\d*".r ^^ {
[error] [warn]
[error] [warn] /home/jenkins/workspace/spark-sql-catalyst-3.0/mllib/src/main/scala/org/apache/spark/ml/feature/RFormulaParser.scala:301: match may not be exhaustive.
[error] It would fail on the following input: ~(~(_, (x: String forSome x not in "~")), _)
[error] [warn]     (label ~ "~" ~ expr) ^^ { case r ~ "~" ~ t => ParsedRFormula(r, t.asTerms.terms) }
[error] [warn]
```
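
These non-exhaustive-match warnings come from pattern-matching on the string literal separators (`"^"`, `"~"`) inside parser-combinator results, which the stricter exhaustiveness checking in newer 2.12.x releases flags. A simplified sketch with hypothetical names (`PowSketch`, `num`, `pow`, `powSafe`; it assumes the separate scala-parser-combinators module is on the classpath and is not Spark's actual fix):

```
import scala.util.parsing.combinator.RegexParsers

object PowSketch extends RegexParsers {
  def num: Parser[Int] = "[1-9]\\d*".r ^^ (_.toInt)

  // Warned form: matching the literal "^" leaves the String slot non-exhaustive,
  // which is what the checker reports above for RFormulaParser.
  def pow: Parser[(Int, Int)] =
    num ~ "^" ~ num ^^ { case base ~ "^" ~ exp => (base, exp) }

  // One way to silence it: bind the separator with a wildcard instead of a literal.
  def powSafe: Parser[(Int, Int)] =
    num ~ "^" ~ num ^^ { case base ~ _ ~ exp => (base, exp) }
}
```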

* fix

Co-authored-by: Dongjoon Hyun <dhyun@apple.com>
Co-authored-by: Dongjoon Hyun <dongjoon@apache.org>
Co-authored-by: yangjie01 <yangjie01@baidu.com>