Upgrade Spark to version 1.0.0 and bump sbt to 0.13.5 #30
Conversation
@tsindot This looks good to me, but I think we'll have to publish a new release after this. (There are jars available on Bintray, but we need to update the README.) @kelvinchu @dan-null can you guys have a look and merge?
Hi, I had compiled the job server for Apache Spark 1.0 on Scala 2.10. However, there is a critical issue: the RDDs created by Job-A get automatically cleaned up by Spark before you can use them in Job-B. See this thread for more details - http://apache-spark-user-list.1001560.n3.nabble.com/RDD-Cleanup-td9182.html
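For context on the cleanup issue: Spark 1.0 introduced a ContextCleaner that removes an RDD once its driver-side reference has been garbage collected, which can happen between jobs that share a SparkContext. A minimal sketch of one possible mitigation (not necessarily what the job server itself does; `RddRegistry` and its method names are hypothetical) is to hold a strong, driver-side reference to any RDD that later jobs need:

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.storage.StorageLevel

// Hypothetical driver-side registry: holding strong references keeps
// the ContextCleaner from reclaiming RDDs between jobs that share a
// SparkContext.
object RddRegistry {
  private val rdds = scala.collection.mutable.Map.empty[String, RDD[_]]

  // Persist the RDD and keep a named reference so Job-B can reuse it.
  def put[T](name: String, rdd: RDD[T]): RDD[T] = synchronized {
    rdd.persist(StorageLevel.MEMORY_ONLY)
    rdds(name) = rdd
    rdd
  }

  // Look up an RDD registered by an earlier job, if it exists.
  def get[T](name: String): Option[RDD[T]] = synchronized {
    rdds.get(name).map(_.asInstanceOf[RDD[T]])
  }
}
```

This sketch assumes all jobs run in the same driver JVM and share one SparkContext; it is not runnable outside a Spark environment.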
+1 it would be great to use the job server with Spark 1.0.0/1.0.1. @premdass what was the result of your testing with shared contexts?
@tsindot Do you want to move the PR to our new location?
Yes, will do. -Todd
Do you want this pushed up to 1.0.2 of Spark, or just the 1.0.0 release? I can do either; 1.0.2 seems appropriate, but I wanted to validate first before going there. -Todd
1.0.2 sounds like a good idea! |
Yes to 1.0.2!! Thanks. -Evan
Created PR on the new project, spark-jobserver/spark-jobserver#2. lmk if there are any issues or concerns. -Todd
Is it possible to get the spark-jobserver upgraded to support Spark version 1.0.0? I have made the basic changes and all 104 tests pass. lmk if there is anything else I can do to aid in getting this upgraded.
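For reference, the version bump named in the PR title amounts to changes along these lines (a sketch assuming a standard sbt layout; exact file names and the dependency list in this project may differ):

```scala
// project/build.properties (one line):
//   sbt.version=0.13.5

// In the sbt build definition (e.g. build.sbt or project/Build.scala),
// the Spark dependency is bumped; the thread settled on 1.0.2:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.2" % "provided"
```

The "provided" scope keeps Spark's jars out of the assembly, on the assumption that the job server runs against an existing Spark installation.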