support spark 3.0 #59
Conversation
Feel free to tinker around with it if anyone wants.
With the `CREATE TEMPORARY TABLE` -> `CREATE TEMPORARY VIEW` change, all tests pass.
Thanks for the improvement.
src/test/scala/com/github/saurfang/sas/spark/DefaultSourceSuite.scala
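The DDL change discussed above can be sketched as follows. This is a hedged illustration, not the project's actual test code: the view name and file path are made up, and the key point is that Spark 3.0 rejects `CREATE TEMPORARY TABLE ... USING`, so the tests switch to `CREATE TEMPORARY VIEW`:

```scala
// Illustrative sketch: registering a SAS file as a queryable relation
// via Spark SQL DDL. Names and paths here are assumptions, not the
// actual test fixtures from DefaultSourceSuite.
import org.apache.spark.sql.SparkSession

object TempViewExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("sas7bdat-ddl-example")
      .getOrCreate()

    // Spark 2.x tests could use:
    //   CREATE TEMPORARY TABLE cars USING com.github.saurfang.sas.spark
    //   OPTIONS (path "cars.sas7bdat")
    // Spark 3.0 rejects CREATE TEMPORARY TABLE ... USING, so use a view:
    spark.sql(
      """CREATE TEMPORARY VIEW cars
        |USING com.github.saurfang.sas.spark
        |OPTIONS (path "cars.sas7bdat")""".stripMargin)

    spark.sql("SELECT * FROM cars").show()
    spark.stop()
  }
}
```

The view behaves the same way in queries, so only the DDL in the test setup needs to change.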
Can someone check if the JAR compiled for Spark 3.0 works?
Actually that will fail, because Maven Central has no Scala 2.11 artifacts for Spark 3.0.0. I think we should consider dropping support for Spark 2.x with version 3.x of this library (and just tell people to use 2.x if they use Spark 2.x).
@saurfang thoughts about dropping Spark 2.x support?
My +1 cent on dropping Spark 2.x: Scala 2.12 support in Spark 2.4 was experimental as per the release notes, https://spark.apache.org/releases/spark-release-2-4-0.html. @srowen can comment here better. If somebody needs to use spark-sas7bdat with Spark 2.x, they can just reference the current version; the new version would then be for Spark 3.x. Thanks!
So, I think ideally you simply set up the build matrix to test Scala 2.11 + Spark 2.4, and Scala 2.12 + Spark 3. That is trivial to declare with Travis; I can show you examples. If this is a library that doesn't change much, it's coherent to say users of Spark 2 can use current versions and users of Spark 3 can use future versions, and just deal with it. However, I think the code cross-compiles readily. What I would suggest dropping now is support for Scala 2.10 and Spark 2.3 or earlier. No real point going forward.
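The cross-build idea above can be sketched in sbt terms. This is an illustrative `build.sbt` fragment, not the project's actual build file; the exact Spark and Scala patch versions here are assumptions:

```scala
// build.sbt fragment (illustrative): cross-build Scala 2.11 against
// Spark 2.4 and Scala 2.12 against Spark 3.0, dropping Scala 2.10.
scalaVersion := "2.12.10"
crossScalaVersions := Seq("2.11.12", "2.12.10")

// Choose the Spark version from the Scala binary version being built,
// so `sbt +test` exercises both rows of the matrix.
val sparkVersion = Def.setting {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 11)) => "2.4.5"
    case _             => "3.0.0"
  }
}

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % sparkVersion.value % Provided
)
```

A Travis (or other CI) matrix then only needs one job per Scala version, each running `sbt ++<scalaVersion> test`.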
+1 to what Sean said - this looks like a much better option |
+1 on setting up a build matrix for 2.4 and 3.0 if we can fix the tests to work for both. Otherwise I'm fine with dropping support for 2.x, since I don't think we have any plans for major features anyway.
@saurfang Could you please build the JAR? I don't have permission to build it.
@pkolli-caredx what do you mean you don't have permission? You can pull this branch and build it. If it's hard I could do it for you, but it should be that simple.
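Pulling the branch and building it yourself can be sketched as below. This assumes an sbt-based build with an assembly task; the local branch name and the exact output path are illustrative:

```shell
# Illustrative: fetch PR #59 and build an assembly JAR locally.
# Assumes git and sbt are installed; paths/names are examples.
git clone https://github.com/saurfang/spark-sas7bdat.git
cd spark-sas7bdat
git fetch origin pull/59/head:spark3
git checkout spark3
sbt assembly   # e.g. target/scala-2.12/spark-sas7bdat-assembly-*.jar
```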
@srowen Could you please build the JAR and share it? That would be great.
@pkolli-caredx try this assembly JAR: https://drive.google.com/file/d/1xj3XYjdzmNTtqnra96pPIdgYcdXVpyBd/view?usp=sharing |
@saurfang got this working with a proper test matrix in Travis, just need you to:
much appreciate it as always @thesuperzapper |
This PR implements support for Spark 3.0 (Scala 2.12).
To get this merged we will need to: