
[FLINK-10911][scala-shell] Enable flink-scala-shell with Scala 2.12 #11895

Closed
wants to merge 2 commits

Conversation

zjffdu
Contributor

@zjffdu zjffdu commented Apr 24, 2020

What is the purpose of the change

This PR enables flink-scala-shell with Scala 2.12. We previously disabled it because the Flink Scala shell did not work with Scala 2.12, but that no longer appears to be true: CI passes, and I have also verified on my local machine that the Flink Scala shell works well with Scala 2.12.

Brief change log

Enable flink-scala-shell with Scala 2.12 in the pom files.

Verifying this change

This change is already covered by existing tests.

Does this pull request potentially affect one of the following parts:

  • Dependencies (does it add or upgrade a dependency): (no)
  • The public API, i.e., is any changed class annotated with @Public(Evolving): (no)
  • The serializers: (no)
  • The runtime per-record code paths (performance sensitive): (no)
  • Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn/Mesos, ZooKeeper: (no)
  • The S3 file system connector: (no)

Documentation

  • Does this pull request introduce a new feature? (no)
  • If yes, how is the feature documented? (not applicable)

@flinkbot
Collaborator

flinkbot commented Apr 24, 2020

Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
to review your pull request. We will use this comment to track the progress of the review.

Automated Checks

Last check on commit f112d10 (Fri May 28 09:13:12 UTC 2021)

Warnings:

  • 2 pom.xml files were touched: Check for build and licensing issues.
  • No documentation files were touched! Remember to keep the Flink docs up to date!
  • This pull request references an unassigned Jira ticket. According to the code contribution guide, tickets need to be assigned before starting with the implementation work.

Mention the bot in a comment to re-run the automated checks.

Review Progress

  • ❓ 1. The [description] looks good.
  • ❓ 2. There is [consensus] that the contribution should go into Flink.
  • ❓ 3. Needs [attention] from.
  • ❓ 4. The change fits into the overall [architecture].
  • ❓ 5. Overall code [quality] is good.

Please see the Pull Request Review Guide for a full explanation of the review process.


The bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands
The @flinkbot bot supports the following commands:

  • @flinkbot approve description to approve one or more aspects (aspects: description, consensus, architecture and quality)
  • @flinkbot approve all to approve all aspects
  • @flinkbot approve-until architecture to approve everything until architecture
  • @flinkbot attention @username1 [@username2 ..] to require somebody's attention
  • @flinkbot disapprove architecture to remove an approval you gave earlier

@flinkbot
Collaborator

flinkbot commented Apr 24, 2020

CI report:

Bot commands
The @flinkbot bot supports the following commands:
  • @flinkbot run travis re-run the last Travis build
  • @flinkbot run azure re-run the last Azure build

@zjffdu zjffdu changed the title [FLINK-10911] Flink's flink-scala-shell is not working with Scala 2.12 [WIP] [FLINK-10911] Flink's flink-scala-shell is not working with Scala 2.12 Apr 24, 2020
@zjffdu
Contributor Author

zjffdu commented Apr 24, 2020

@flinkbot run azure

@zentol
Contributor

zentol commented Apr 24, 2020

Can you give a quick note on what the goal of this PR is?

@zjffdu zjffdu changed the title [FLINK-10911] Flink's flink-scala-shell is not working with Scala 2.12 [FLINK-10911] Flink's flink-scala-shell is not working with Scala 2.12 [WIP] Apr 24, 2020
@zjffdu
Contributor Author

zjffdu commented Apr 24, 2020

@zentol It is a WIP PR. I can run the Scala shell with 2.12 successfully on my local machine; I just want to run it on CI to check whether it passes.

@zjffdu
Contributor Author

zjffdu commented Apr 25, 2020

@flinkbot run azure

@zjffdu zjffdu changed the title [FLINK-10911] Flink's flink-scala-shell is not working with Scala 2.12 [WIP] [FLINK-10911] Flink's flink-scala-shell is not working with Scala 2.12 Apr 25, 2020
@zjffdu zjffdu changed the title [FLINK-10911] Flink's flink-scala-shell is not working with Scala 2.12 [FLINK-10911] Enable flink-scala-shell with Scala 2.12 Apr 25, 2020
@zjffdu zjffdu changed the title [FLINK-10911] Enable flink-scala-shell with Scala 2.12 [FLINK-10911][scala-shell] Enable flink-scala-shell with Scala 2.12 Apr 25, 2020
@zjffdu
Contributor Author

zjffdu commented Apr 26, 2020

@tillrohrmann @aljoscha @zentol Could you help review this PR? Thanks

@aljoscha aljoscha self-assigned this Apr 27, 2020
Contributor

@tillrohrmann tillrohrmann left a comment


Thanks for creating this PR @zjffdu. I think CI is not conclusive at this point since it did not run the Scala 2.12 profiles. Please make sure that these builds have run. Otherwise we cannot move forward with this PR.

@aljoscha
Contributor

Ah sorry @tillrohrmann! While you were writing your message I was merging the PR. I was assuming that Azure and Travis run the Scala 2.12 profiles.

What should we do?

@aljoscha
Contributor

I'm running the test locally now.

@aljoscha
Contributor

The tests don't pass, so I reverted the change on master. Sorry about this, I was too eager in merging.

@zjffdu
Contributor Author

zjffdu commented Apr 27, 2020

@aljoscha What kind of error do you see?

@aljoscha
Contributor

The assertions in ScalaShellITCase fail because the invoked programs produce null pointer exceptions. I didn't look further, so I don't know why the exceptions are thrown.

@zjffdu
Contributor Author

zjffdu commented Apr 27, 2020

It is weird; I remember running the tests successfully before. Let me try again.
BTW, do you mean that none of the Scala 2.12 tests are enabled in CI? I thought only the tests for the Flink Scala shell were disabled, and that I enabled them in this PR.

@zentol
Contributor

zentol commented Apr 27, 2020

We only use Scala 2.11 on CI; this is fully independent of whether the scala-shell tests run or not.

@zjffdu
Contributor Author

zjffdu commented Apr 28, 2020

@aljoscha I found the reason why I ran it successfully last time: I only ran mvn test, but I should have run mvn verify, otherwise the tests under the scala folder are not executed.

Regarding this issue, I found that after switching to repl-class-based, CI passes with Scala 2.12. Maybe it is related to this (https://users.scala-lang.org/t/is-this-a-known-repl-bug/3271). But it means case classes can no longer be used in the Scala shell, which I discussed with you before. How about we fix the case class issue later? I think it is fixable; the Spark Scala shell supports case classes, so we can borrow some ideas from Spark.
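
For context, the effect of the two wrapping modes on a case class can be sketched in plain Scala as follows (an illustration only, with made-up wrapper names; the real REPL-generated wrappers look different):

// Illustration only: the actual REPL-generated wrapper names differ.

// Object-based wrapping (the pre-2.13 REPL default): a case class defined in
// the shell ends up as a member of a nested object, so no enclosing instance
// is needed to construct it.
object LineWrapperObject {
  case class Pair(x: Int, y: Int)
}

// Class-based wrapping (-Yrepl-class-based, also used by Spark): the case
// class becomes an inner class of a wrapper that the REPL instantiates, so
// every Pair needs that wrapper instance.
class LineWrapperClass {
  case class Pair(x: Int, y: Int)
}

object WrappingDemo extends App {
  println(LineWrapperObject.Pair(1, 2)) // fine: no enclosing instance required

  val wrapper = new LineWrapperClass
  println(wrapper.Pair(1, 2)) // fine only because we hold `wrapper`; a
                              // serializer that only sees the Class[_] does not
}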

@aljoscha
Contributor

@zentol How do you mean? When building with Scala 2.12, flink-scala-shell is excluded, so the tests for the shell are not run on a Scala 2.12 build.

@zentol
Contributor

zentol commented Apr 28, 2020

@aljoscha yes? In any case we only run 2.11 on CI, that's all I was saying. This was a response to the previous comment from Jeff:

BTW, do you mean that none of the Scala 2.12 tests are enabled in CI? I thought only the tests for the Flink Scala shell were disabled, and that I enabled them in this PR.

@aljoscha
Contributor

@zjffdu Is Spark also using repl-class-based? If so, do you know how they manage to support case classes?

@zentol Yes, flink-scala-shell is excluded when the scala-2.12 profile is active, which is what this PR changes.

@zjffdu
Contributor Author

zjffdu commented Apr 29, 2020

@aljoscha Right, Spark uses repl-class-based. As for how Spark supports case classes, I need to investigate further.

@zjffdu
Contributor Author

zjffdu commented May 11, 2020

@aljoscha I found that the case class issue is caused by FLINK-10493, which introduced ScalaCaseClassSerializer. Our internal Flink branch is based on 1.5 and has no ScalaCaseClassSerializer; there, the shell works even when I use repl-class-based.

@aljoscha
Contributor

@zjffdu Should we still try and get this in for Flink 1.12? Sorry for the very long delay on this one! 😱

@zjffdu
Contributor Author

zjffdu commented Oct 30, 2020

Sorry @aljoscha, I'm afraid I don't have time to get it into 1.12; let's try for the next release.

@aljoscha
Contributor

aljoscha commented Nov 2, 2020

No worries. 😃

@aljoscha aljoscha removed their assignment Mar 16, 2021
@aalexandrov
Contributor

aalexandrov commented Mar 26, 2021

Just to clarify: after building the scala-shell with the changes from this PR, I tested as follows:

  1. Start the scala-shell:
     bin/start-scala-shell.sh local
  2. Try the following Scala code:
scala> case class Pair(x: Int, y: Int)
defined class Pair

scala> benv.fromElements(1,2,3).map(x => Pair(x,x)).collect()
java.lang.IllegalArgumentException: requirement failed:
The class Pair is an instance class, meaning it is not a member of a
toplevel object, or of an object contained in a toplevel object,
therefore it requires an outer instance to be instantiated, but we don't have a
reference to the outer instance. Please consider changing the outer class to an object.

  at scala.Predef$.require(Predef.scala:277)
  at org.apache.flink.api.scala.typeutils.ScalaCaseClassSerializer$.lookupConstructor(ScalaCaseClassSerializer.scala:97)
  at org.apache.flink.api.scala.typeutils.ScalaCaseClassSerializer.<init>(ScalaCaseClassSerializer.scala:46)
  ... 70 elided

The problem is that with -Yrepl-class-based enabled, UDTs (such as case classes) are defined within a nested class, and as such cannot be easily serialized (the ScalaCaseClassSerializer invoked by org.apache.flink.api.scala.createTypeInformation throws a very detailed error explaining this).
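
To make the failing requirement concrete, here is a minimal plain-Scala sketch (not Flink's actual ScalaCaseClassSerializer code; the helper name constructibleFromFields is made up) showing that the class-wrapped case class carries a hidden outer-instance constructor parameter:

// A simplified, hypothetical version of the check behind the error above
// (NOT Flink's actual lookupConstructor): a case class can only be rebuilt
// from its field values if its constructor takes exactly those fields and no
// hidden outer-instance parameter.
object OuterInstanceDemo extends App {

  object ObjectWrapped {            // like object-based REPL wrapping
    case class Pair(x: Int, y: Int)
  }

  class ClassWrapped {              // like -Yrepl-class-based wrapping
    case class Pair(x: Int, y: Int)
  }

  def constructibleFromFields(clazz: Class[_], fieldCount: Int): Boolean =
    clazz.getDeclaredConstructors.head.getParameterCount == fieldCount

  // true: the constructor is Pair(int, int)
  println(constructibleFromFields(ObjectWrapped.Pair(1, 2).getClass, fieldCount = 2))

  // false: the constructor is Pair(ClassWrapped outer, int, int); the extra
  // outer parameter is what the "requirement failed" message above is about.
  val wrapper = new ClassWrapped
  println(constructibleFromFields(wrapper.Pair(1, 2).getClass, fieldCount = 2))
}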

I am not sure what use cases led the Scala team (and Spark) to adopt -Yrepl-class-based over the traditional object-based wrapping; nesting objects in objects seems more robust w.r.t. serialization.

FYI in Scala 2.13 the default is class-based and it needs to be explicitly disabled with -Yrepl-class-based:false.

@aljoscha / @zjffdu: do you know any particular scenarios where object-based REPL is problematic?

@zjffdu
Contributor Author

zjffdu commented Apr 25, 2021

@aalexandrov Thanks for looking into this issue; here are two related tickets:
https://issues.apache.org/jira/browse/SPARK-1199
https://issues.apache.org/jira/browse/SPARK-17103

@AHeise
Contributor

AHeise commented Sep 28, 2021

Closing this PR due to https://issues.apache.org/jira/browse/FLINK-24360 / #17340

@AHeise AHeise closed this Sep 28, 2021