Scala ClassLoader breaks nio FileSystemProvider API #10247
However, Scala uses a system ClassLoader that doesn't search among JARs on the classpath, so it's impossible to use custom FileSystemProvider implementations from the classpath.
When the same example JAR is run with
I'm currently planning to use this workaround to call
This SO answer provides basically the same analysis and diagnosis.
Imported From: https://issues.scala-lang.org/browse/SI-10247?orig=1
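The diagnosis above can be seen directly by listing what the JDK's ServiceLoader-based lookup actually discovered (a minimal sketch; run it under plain `java` vs. the `scala` runner to compare):

```scala
import java.nio.file.spi.FileSystemProvider
import scala.jdk.CollectionConverters._

// List the schemes that installedProviders discovered. Under plain `java`
// this includes any provider JAR on the classpath; under the `scala` runner
// script, user JARs sit on a child class loader, so their providers are
// missing from this set.
val schemes = FileSystemProvider.installedProviders.asScala.map(_.getScheme).toSet
println(schemes) // always contains at least "file"
```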
Or, use the API where you can provide the class loader to find providers which are not "installed."
This example shows loading a test provider from a build dir.
Or specifying loader:
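For illustration, here is a sketch of the three-argument `FileSystems.newFileSystem(uri, env, loader)` overload, which searches the supplied ClassLoader for providers that are not "installed." The built-in "jar" provider used here happens to be installed anyway, but the same call finds providers that are visible only to the supplied loader (e.g. ones on scala's user classpath):

```scala
import java.net.URI
import java.nio.file.{FileSystems, Files}
import java.util.Collections

// Create an empty zip and open it as a file system, passing an explicit
// ClassLoader to the provider lookup.
val zipPath = Files.createTempFile("demo", ".zip")
Files.delete(zipPath) // the zip provider will create it, because of create=true
val uri    = URI.create("jar:" + zipPath.toUri)
val env    = Collections.singletonMap("create", "true")
val loader = Thread.currentThread.getContextClassLoader
val fs     = FileSystems.newFileSystem(uri, env, loader)
Files.write(fs.getPath("/hello.txt"), "hi".getBytes)
fs.close() // flushes the zip to zipPath
```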
Thanks for these workarounds, @som-snytt!
Unfortunately I'm not sure that either one works very well for me:
Can we also have some conversation about why Scala doesn't have this Just Work? I assume there is no Good Reason for it, but rather that it's an unfortunate side-effect of something well under the hood of what most Scala users know or care to know about… is there any hope of "fixing" this?
Not sure I follow; is using
I assume the latter, which seems like a non-starter to me, at least until we get a straight answer about why Scala is doing this and what fixing the problem at its source would entail.
Sorry, I'm missing what this is supposed to solve or why.
Per the above, I'd like to discuss why Scala is doing this and what it would take to fix it, though I appreciate the thorough exploration of the space of possible workarounds.
You can drop your
I just spent some time refreshing my understanding: basically, you're saying that invoking the runner script with
Just guessing, but that might be because the same script is used for both
Here's an issue: lampepfl/dotty#44
In 2.13, they want more flexible module handling, and also to use Java 9 modules, so now might be a good time to start or join a conversation on their discussion site, or mention it on gitter. In fact, I'll go mention it now.
To reiterate, this isn't an issue with
The confusing options are
You could also consider using
As an update, I've been using a wrapper around
That got me unblocked, but I don't think a world where [everyone who wants to use JSR203 libraries from Scala] has to [use my library or roll their own similar library] is desirable.
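The kind of wrapper described above could look roughly like this (a hedged sketch; `fileSystemFor` is my name, not the library's): try the installed providers first, then fall back to scanning the context class loader, which sees the JARs that scala's runner puts on the user classpath.

```scala
import java.net.URI
import java.nio.file.{FileSystem, FileSystemNotFoundException, FileSystems, ProviderNotFoundException}
import java.util.Collections

// Resolve a FileSystem for a URI, falling back to the context class loader
// when the scheme's provider is not among the installed providers.
def fileSystemFor(uri: URI): FileSystem =
  try FileSystems.getFileSystem(uri)
  catch {
    case _: FileSystemNotFoundException | _: ProviderNotFoundException =>
      FileSystems.newFileSystem(
        uri,
        Collections.emptyMap[String, AnyRef](),
        Thread.currentThread.getContextClassLoader)
  }
```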
I'm also a little confused that this hasn't come up more widely / there aren't others mentioning that they've run into this; I thought folks I work with in the ADAM universe (cf. linked issue above) would have, but perhaps they and everyone else primarily use the analogous HDFS FileSystem APIs (that JSR203 was meant to mimic/replace, IIUC)?
Anyway, I'll defer to y'all about what level of fixing, further documenting recommended workarounds, #wontfix'ing, etc. is the right outcome here, thanks.
I think this is more of a Spark problem (and/or bad design in the JDK):
I suppose it is because people don't use the
```
java [-Xms, -Xmx, ...] -cp [full classpath, including scala & your fs] YourMainClass
```
And that will make everything work just fine. I don't want the
(As for the dev experience, in sbt you need to enable forking when running/testing, and then everything works the same. It is annoying that it doesn't Just Work™ in the REPL though.)
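For reference, the forking mentioned above is a one-line sbt setting (shown here for a `build.sbt`; scope it to `Test` or `run` as needed):

```scala
// build.sbt: fork a fresh JVM for run/test, so the user classpath becomes
// the real java classpath and installed-provider lookup sees your
// FileSystemProvider JAR.
fork := true
```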
I have no idea how Spark does all this, or if they allow users to easily inject stuff onto the classpath of Spark itself, but that's what you would need.
I do use REPL scripting on my server, so it should just work. Right now it looks pretty broken. With the need for Java 9 support, it's a good time to revisit what it thinks it's doing.
Here's the REPL class loader, which is more precise than
It wasn't intended to put
I did a quick munge of the script that just puts the Scala user
I don't know that there is any benefit in the current set-up, where a special class loader takes over. The runner code can still use a
I haven't looked at whether sbt supports forking when running the console. Hopefully any wrinkle could be ironed out. Similarly, folks did work for Spark to support adding jars to the compiler class path, so this is a natural use case. Maybe some future REPL will support that in a natural way.