
SI-6502 Reenables loading jars into the running REPL (regression in 2.10) #4051

Merged
merged 8 commits into scala:2.11.x from heathermiller:repl-cp-fix2 on Nov 18, 2014

8 participants
@heathermiller
Member

heathermiller commented Oct 14, 2014

Fixes SI-6502, reenables loading jars into the running REPL (regression in 2.10). This PR allows adding a jar to the compile and runtime classpaths without resetting the REPL state (crucial for Spark SPARK-3257).

This follows the lead taken by @som-snytt in PR #3986, which differentiates two jar-loading behaviors (muddled by cp):

  • adding jars and replaying REPL expressions (using replay)
  • adding jars without resetting the REPL (deprecated cp, introduced require)

This PR implements require (left unimplemented in #3986).

This PR is a simplification of a similar approach taken by @gkossakowski in #3884. In this attempt, we first check that a jar contains only new classes/traits/objects before adding it; otherwise we emit an error. This differs from the old invalidation approach, which also tracked deleted classpath entries.
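The "only new entries" check described above can be sketched roughly as follows. This is an illustration, not the PR's actual code: onlyNewClasses and its signature are made up for the example.

```scala
// Hypothetical sketch: a jar is accepted only if none of its class names
// collide with what is already on the classpath; otherwise we report an error.
def onlyNewClasses(jarClassNames: Set[String], existingClassNames: Set[String]): Either[String, Set[String]] = {
  val clashes = jarClassNames intersect existingClassNames
  if (clashes.isEmpty) Right(jarClassNames)
  else Left(s"jar redefines existing classes: ${clashes.mkString(", ")}")
}

// A jar with fresh classes is accepted; a clash yields an error instead.
println(onlyNewClasses(Set("com.example.New"), Set("scala.Predef")))
println(onlyNewClasses(Set("scala.Predef"), Set("scala.Predef")))
```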

@heathermiller
Member

heathermiller commented Oct 14, 2014

Review by anyone interested. Especially those I pummeled with questions: @dragos, @phaller, @gkossakowski

@scala-jenkins scala-jenkins added this to the 2.11.5 milestone Oct 14, 2014

@heathermiller heathermiller changed the title from Fixes SI-6502, reenables loading jars into the running REPL (regression in 2.10) to SI-6502 Reenables loading jars into the running REPL (regression in 2.10) Oct 14, 2014

@retronym
Member

retronym commented Oct 14, 2014

It is probably best to submit a squashed PR and point reviewers to the fine grained commits on a branch. That means less work for our build 🐱 :)

@heathermiller
Member

heathermiller commented Oct 14, 2014

commits squashed :)

@retronym
Member

retronym commented Oct 14, 2014

I'd like to see this built on top of our existing abstractions for JAR file handling (AbstractFile).

scala> def flatten(f: AbstractFile): Iterator[AbstractFile] = if (f.isClassContainer) f.iterator.flatMap(flatten) else Iterator(f)
flatten: (f: scala.reflect.io.AbstractFile)Iterator[scala.reflect.io.AbstractFile]

scala> val jar = AbstractFile.getDirectory(new java.io.File("/code/scala/build/pack/lib/scala-library.jar"))
jar: scala.reflect.io.AbstractFile = /code/scala/build/pack/lib/scala-library.jar

scala> flatten(jar).map(_.path).take(15).mkString("\n")
res7: String =
rootdoc.txt
META-INF/MANIFEST.MF
library.properties
scala/deprecatedName.class
scala/Function1$mcVI$sp$class.class
scala/Function0$mcJ$sp$class.class
scala/Predef$StringAdd$.class
scala/Function2$mcDID$sp$class.class
scala/Function2$mcJJD$sp.class
scala/Product2$mcJD$sp$class.class
scala/Product2$mcDD$sp$class.class
scala/Function2$mcFJI$sp$class.class
scala/PartialFunction$OrElse.class
scala/Product19$class.class
scala/Enumeration$Value.class

As an added bonus, you can then probably support directory based classpath entries without writing additional code:

scala> val dir = AbstractFile.getDirectory(new java.io.File("/code/scala/build/quick/classes/library"))
dir: scala.reflect.io.AbstractFile = PlainFile(PlainFile(), PlainFile(), PlainFile(PlainFile(PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile()), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(PlainFile(), PlainFile()), PlainFile(), PlainFile()), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), PlainFile(), Plain...


scala> flatten(dir).map(_.path).take(15).mkString("\n")
res6: String =
/code/scala/build/quick/classes/library/library.properties
/code/scala/build/quick/classes/library/rootdoc.txt
/code/scala/build/quick/classes/library/scala/annotation/Annotation.class
/code/scala/build/quick/classes/library/scala/annotation/bridge.class
/code/scala/build/quick/classes/library/scala/annotation/ClassfileAnnotation.class
/code/scala/build/quick/classes/library/scala/annotation/compileTimeOnly.class
/code/scala/build/quick/classes/library/scala/annotation/elidable$.class
/code/scala/build/quick/classes/library/scala/annotation/elidable.class
/code/scala/build/quick/classes/library/scala/annotation/implicitNotFound.class
/code/scala/build/quick/classes/library/scala/annotation/meta/beanGetter.class
/code/scala/build/quick/classes/library/scala/annotation/meta...
/** Is the given package class a system package class that cannot be invalidated?
 */
private def isSystemPackageClass(pkg: Symbol) =

@retronym

retronym Oct 15, 2014

Member
private def isSystemPackageClass(pkg: Symbol) = pkg == RootClass || (pkg.hasTransOwner(ScalaPackageClass) && !pkg.hasTransOwner(ScalaToolsPackageClass))

Noting that:

scala> ScalaPackageClass.hasTransOwner(ScalaPackageClass)
res4: Boolean = true


@heathermiller

heathermiller Oct 15, 2014

Member

Done, thanks!

if (elems.size == 1) elems.head
else new MergedClassPath(elems, classPath.context)
val oldEntries = mkClassPath(subst.keys)
val newEntries = mkClassPath(subst.values)


@retronym

retronym Oct 15, 2014

Member

Is it a problem that oldEntries and newEntries might be in a non-deterministic order (given that subst is not a linked map)?


@heathermiller

heathermiller Oct 16, 2014

Member

I’m not sure, actually. To be safe, I just changed subst to be a TreeMap instead and defined an implicit ordering. So, this should force things to be in a deterministic order now.
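The switch described here boils down to replacing a hash-based map with a sorted one. Entry and the field names below are illustrative stand-ins, not the PR's actual classpath types:

```scala
import scala.collection.immutable.TreeMap

// Illustrative stand-in for a classpath entry; the real code keys the
// substitution map on the compiler's classpath types.
case class Entry(path: String)

// The explicit ordering is what makes TreeMap iteration deterministic;
// a plain (hash-based) Map makes no guarantee about key/value order.
implicit val entryOrdering: Ordering[Entry] = Ordering.by(_.path)

val subst = TreeMap(
  Entry("b.jar") -> Entry("b-new.jar"),
  Entry("a.jar") -> Entry("a-new.jar")
)

// Keys now come out in a stable, sorted order regardless of insertion order.
println(subst.keys.map(_.path).mkString(", "))  // a.jar, b.jar
```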

@heathermiller
Member

heathermiller commented Oct 15, 2014

RE: @retronym:

I'd like to see this built on top of our existing abstractions for JAR file handling (AbstractFile).

Good idea. Done

Though it’ll still take a bit more work to additionally deal with directories. This switch isn’t enough – I’m still having a ton of trouble reading in classes from dirs (for some reason, actual source is expected).

@heathermiller
Member

heathermiller commented Oct 16, 2014

All feedback addressed, I believe.

@retronym
Member

retronym commented Oct 17, 2014

How has this been tested? Are we able to add automated tests?

@heathermiller
Member

heathermiller commented Oct 17, 2014

Manually, where I take a bunch of sometimes overlapping jars and load them in different ways and try to use them. If you can suggest a way for me to automatically test arbitrary jar loading (I add random jars to the test suite...? Not sure that makes sense) I'd be happy to add a test case.

@retronym
Member

retronym commented Oct 17, 2014

Sometimes manual testing is okay, but you should include a transcript of the test process in the pull request.

We do have a mechanism for using JARs in tests:

% find test  | grep desired
test/benchmarks/lib/jsr166_and_extra.jar.desired.sha1
test/files/codelib/code.jar.desired.sha1
test/files/lib/annotations.jar.desired.sha1
test/files/lib/enums.jar.desired.sha1
test/files/lib/genericNest.jar.desired.sha1
test/files/lib/jsoup-1.3.1.jar.desired.sha1
test/files/lib/macro210.jar.desired.sha1
test/files/lib/methvsfield.jar.desired.sha1
test/files/lib/nest.jar.desired.sha1
test/files/speclib/instrumented.jar.desired.sha1

One needs authorization to upload the JARs into the repository (in GitHub we just have the signature, and the Ant build downloads them). I haven't personally done this before and don't have the credentials.

I know that @xeno-by has done this, though. For example, in b10f45a, he depends on a JAR in a new test case.

An alternative, which may be better in this case, is to use a run test case to programmatically generate the JAR files. Take a look at test/files/run/t6440b.scala for an example of compiling a few batches of code. You could use s"-d ${testOutput.toFile.getPath}/jar1.jar" as the options to a global instance to emit some code to a JAR.

Once the JARs were prepared, you could programmatically drive the interpreter.

I know this represents another half a day of hacking about, but this feature seems subtle enough to warrant that.
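For the jar-preparation half, a rough sketch of what such a run test could do. writeJar is a made-up helper; a real test would instead emit the entries by compiling snippets with the -d <path>/jar1.jar option mentioned above:

```scala
import java.io.FileOutputStream
import java.nio.file.Files
import java.util.jar.{JarEntry, JarOutputStream}

// Hypothetical helper: writes the given (entry name -> bytes) pairs into a jar.
def writeJar(jar: java.io.File, entries: Map[String, Array[Byte]]): Unit = {
  val out = new JarOutputStream(new FileOutputStream(jar))
  try entries.foreach { case (name, bytes) =>
    out.putNextEntry(new JarEntry(name))
    out.write(bytes)
    out.closeEntry()
  } finally out.close()
}

val jar = Files.createTempFile("repl-require-test", ".jar").toFile
jar.deleteOnExit()
// Dummy payload; a real test would put compiled .class files here.
writeJar(jar, Map("p/C.class" -> "placeholder".getBytes("UTF-8")))
```

With the jars in place, the test could then feed :require lines to a programmatically driven interpreter.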

@xeno-by
Member

xeno-by commented Oct 17, 2014

Can tell you guys how to do this stuff. We could do this via email.

@mpociecha
Member

mpociecha commented Oct 17, 2014

@heathermiller Do you mean certain concrete jars? Or is it only the overlapping that's important? Theoretically, you could generate some jars (see https://github.com/mpociecha/scala/blob/flat-classpath/test/files/run/various-flat-classpath-types.scala).

@heathermiller
Member

heathermiller commented Oct 24, 2014

Hey guys, thanks for all the pointers. I think all comments are now addressed (clean ups as suggested by @retronym applied, and 4 tests that programmatically generate jars are now included)

@retronym
Member

retronym commented Oct 24, 2014

LGTM!

@heathermiller
Member

heathermiller commented Oct 24, 2014

woot! :) thanks for the time you spent on this, Jason!

@retronym
Member

retronym commented Nov 5, 2014

That sounds okay to me. But perhaps we could review and discuss that as a follow-up pull request?

@heathermiller
Member

heathermiller commented Nov 6, 2014

I can break it out if you'd like – though it complicates this companion/concurrent PR to Spark that depends on this PR being merged :/

@retronym
Member

retronym commented Nov 6, 2014

No problems, feel free to add it as an extra commit to this PR.

Looking more closely at mergeUrlsIntoClassPath, I think there are a few improvements we could make:

  • It only uses platform.classpath; how about making it a member on ClassPath so it's usable without a JavaPlatform?
  • The if/else/else to create the right sort of AbstractFile looks like it should be factored out into a new factory method on AbstractFile. AbstractFile.getURL is currently unused within the compiler codebase, and might be a good candidate to generalize/document to meet these needs.
SI-6502 Refactorings suggested by review
- Moves mergeUrlsIntoClassPath from Global into ClassPath
- Revises and documents AbstractFile.getURL
@heathermiller
Member

heathermiller commented Nov 10, 2014

All good ideas! I just addressed your feedback and added a new commit with the suggested refactorings. Let me know if anything looks amiss!

Thanks :)

@gkossakowski
Member

gkossakowski commented Nov 10, 2014

Hey guys! I'm a bit late to the party (my Mac was down for almost 2 weeks) but I'll try to give my feedback in the next 2-3 days.

@retronym
Member

retronym commented Nov 10, 2014

LGTM

A note for next time: It would be easier to review if that were two commits: one that refactored the method in-situ, and a second that moved it. That way we can see the diff more clearly.

@retronym
Member

retronym commented Nov 12, 2014

PLS REBUILD ALL

@scala-jenkins

scala-jenkins commented Nov 12, 2014

(kitty-note-to-self: ignore 62669006)
🐱 Roger! Rebuilding pr-scala for 8192571, a84abd0, f65c430, 04ee526, d045dde, be3eb58, 9e56c7a, 24a2ef9. 🚨

@heathermiller
Member

heathermiller commented Nov 12, 2014

True, good point. Duly noted :)

@heathermiller
Member

heathermiller commented Nov 12, 2014

Thanks again for the detailed review!

@heathermiller
Member

heathermiller commented Nov 17, 2014

Ping on this – just wondering when this will be merged? I ask because this PR is blocking Spark from moving to 2.11 (as things stood when I was in CA 1.5 weeks ago).

@heathermiller
Member

heathermiller commented Nov 17, 2014

The sooner we get a SNAPSHOT or something, the better – gives us more time to have the new REPL support in the Spark 1.2 branch and more time to have this tested before 1.2 is released.

@retronym
Member

retronym commented Nov 17, 2014

@gkossakowski Do you still plan to review this? If you don't have time in the next day or two, I propose we merge this as is.

@gkossakowski
Member

gkossakowski commented Nov 17, 2014

Unfortunately, I won't have time for a detailed review. The infra work and dealing with the backlog is consuming all of my time. I trust the review powers of @retronym, though. :)

There's only one thing that concerns me: we are adding a bunch of logic back to Global. I know the classpath invalidation logic originally lived in Global, but that was a mistake. I think this logic should live in a separate class with properly spelled-out dependencies. I'd propose moving this logic into a class that is referenced from Global through composition. If Spark expects certain methods to live directly in Global, let's just add deprecated forwarders. This would be part of the quest for reducing Global's surface area.

The refactoring work can be done in a separate PR (there's no need to block this one any further!) but I'd like to see it happen. @heathermiller, WDYT?

@gkossakowski
Member

gkossakowski commented Nov 17, 2014

See also #4060 that is directly related to this work. It would benefit from clearly delineating the boundary between classpath invalidation and the rest of the compiler.

@heathermiller
Member

heathermiller commented Nov 18, 2014

Maybe later, sure, but this PR has been blocking me for over a month now. Would be nice to merge it...

retronym added a commit that referenced this pull request Nov 18, 2014

Merge pull request #4051 from heathermiller/repl-cp-fix2
SI-6502 Reenables loading jars into the running REPL (regression in 2.10)

@retronym retronym merged commit b2ba80a into scala:2.11.x Nov 18, 2014

1 check passed: default pr-scala (took 91 min)
@heathermiller
Member

heathermiller commented Nov 18, 2014

Thanks very much!! :)

lrytz added a commit to lrytz/scala that referenced this pull request Mar 22, 2016

Support :require when using the flat classpath representation.
:require was re-incarnated in scala#4051,
it seems to be used by the spark repl. This commit makes it work when
using the flat classpath representation.


@scabug scabug referenced this pull request Apr 7, 2017

Closed

:cp does not work #6502
