problem with private[spark] functions? #7
Strange, this problem should be resolved.

Yours, Peng

On 12/23/2014 12:31 PM, Mathieu wrote:
Confirmed, this is a bug caused by bypassing some steps in SparkImport (as classloaders in executors are different from the one in the master).
Hi @mathieu1, thanks a lot for your prompt reply. This is getting more and more interesting. First, I would like to confirm that you are not running on scala_2.11? Apparently spark-repl sets up a ClassLoader server in that case to sync between driver & executor (scala_2.10.4 doesn't have this feature).

The problem only happens when a new class is defined in the interpreter and its instances are collected from an executor. It doesn't matter where the class is instantiated, and there is no problem printing it locally. Furthermore, it's always thrown by ExecutorURLClassLoader, which is only used in SparkSubmit. So maybe there is some secret hacking in spark-shell.sh which I didn't scrutinize. I'll get back to you once I advance; please keep me informed.

Also, have you tried it on NFLab's Zeppelin & IBM's Spark-kernel? Do they work?
Indeed, I compiled and ran everything on scala 2.10.4. Regarding your last question: I tried IBM's Spark-kernel and it worked :) Their dependency on an old version of ZeroMQ (2) made it nontrivial for me to set up, however. I haven't managed to use NFLab's Zeppelin at all so far.
Aha, looks like it's working on Zeppelin as well. I'll likely switch my backend to them in the future.
I still get this error in 1.3.0; looks like there is going to be some serious hacking to the class loader. I'll simply copy @benjaminlaird's test script here, in case it got deleted by the original author:

```scala
case class Circle(rad: Float)
```
Referenced in commits:
- …becomes private
- visualization upgraded to be compatible with dataframe api
- fix a bug: problem with private[spark] functions? #7, interpreter now use class server correctly
- display dsl are moved into a new package
- fix 2 bugs in display dataframe as table
OK, problem fixed. Turns out to be easier than I thought.
I'm running into a problem while executing the standard Spark/GraphX example in ISpark; see this notebook.
Using Spark 1.1.1 with "local[2]" master and IPython Notebook 2.3, I get the following error:
A similar error has already been brought up by @benjaminlaird also using ISpark, see his much simpler code.
I suppose this problem has to do with the `ExecutorURLClassLoader` class being `private[spark]` (see `ExecutorURLClassLoader.scala`).

Of course, all the code runs fine on the standard `spark-shell`. The same issue happens on the Spark backend for IScala from @hvanhovell.
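For context, `private[spark]` is Scala's package-qualified access modifier: the class is visible everywhere under the `spark` package tree but inaccessible outside it, which is why third-party REPL frontends cannot reach `ExecutorURLClassLoader` directly. A minimal, self-contained sketch of the mechanism (names are invented for illustration, not Spark's actual code):

```scala
// Hypothetical sketch of the private[pkg] qualifier, the same kind of
// modifier carried by ExecutorURLClassLoader.
package spark {

  // Visible to everything under package `spark`, hidden outside it.
  private[spark] class PackagePrivate {
    def describe: String = "accessible only within package spark"
  }

  // A public object in the same package may expose the functionality.
  object Accessor {
    def make(): String = (new PackagePrivate).describe
  }
}

object Demo {
  def main(args: Array[String]): Unit = {
    // Outside the package, `new spark.PackagePrivate` would not compile;
    // callers must go through the public accessor.
    println(spark.Accessor.make()) // prints "accessible only within package spark"
  }
}
```

External code therefore has to either live in a `spark`-prefixed package (a common workaround in third-party Spark frontends of that era) or go through whatever public surface Spark exposes.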