
Zinc fails to delete stale .class files resulting in broken incremental compilation #1234

Closed
jackkoenig opened this issue Aug 10, 2023 · 7 comments · Fixed by #1293

@jackkoenig

steps

I have written up a repository with instructions to reproduce: https://github.com/jackkoenig/zinc-bug-example

They're written for Linux x86_64 or macOS x86_64. The only platform-specific aspect is that you need two versions of the JVM, but it's easy enough to tweak the instructions to your liking as long as you properly set JAVA_HOME and PATH to use the differing versions as described in the instructions.

problem

TL;DR: Zinc is not deleting .class files from a previous compilation. I think the .class file deletion logic used when you have an empty Analysis is to blame.

See Summary (https://github.com/jackkoenig/zinc-bug-example#summary) and Commentary (https://github.com/jackkoenig/zinc-bug-example#commentary) in my linked repository.

expectation

Zinc should properly delete stale .class files so that incremental compilation will succeed.

notes

@eed3si9n
Member

@jackkoenig Thanks for the detailed bug report!

@eed3si9n
Member

> This prevents the stale example/foo/B.class from being deleted, thus it is included on the classpath for the incremental compilation, which fails.

This looks like the most instructive observation about how this bug happens.
If this is mostly the corner case of total invalidation caused by changing the JVM version etc., probably the easier thing to aim for would be to wipe out the output directory (and/or the output JAR file). Overall, the management of moving *.class files during incremental compilation is handled by https://github.com/sbt/zinc/blob/develop/internal/zinc-core/src/main/scala/sbt/internal/inc/ClassFileManager.scala, so if we can implement the wipe-out-if-Analysis-is-empty logic there, that would be ideal.

@jackkoenig
Author

> If this is mostly the corner case of total invalidation caused by changing the JVM version etc., probably the easier thing to aim for would be to wipe out the output directory (and/or the output JAR file)?

Makes sense to me, I will give that a go in the next few days!

@Friendseeker
Member

Friendseeker commented Nov 30, 2023

I am wondering whether the following change would fix the issue. Building on jackkoenig's observation, the root cause is that IncrementalCompilerImpl.prevAnalysis drops the previous Analysis and returns Analysis.empty.

To fix it, we could delete all products from the previous analysis before dropping it, as follows:

```diff
     previousSetup match {
       // The dummy output needs to be changed to .jar for this to work again.
       // case _ if compileToJarSwitchedOn(mixedCompiler.config)             => Analysis.empty
       case Some(prev) if equiv.equiv(prev, currentSetup)                   => previousAnalysis
       case Some(prev) if !equivPairs.equiv(prev.extra, currentSetup.extra) =>
+        val classFileManager =
+          ClassFileManager.getClassFileManager(incOptions, output, outputJarContent)
+        val products = previousAnalysis.asInstanceOf[Analysis].relations.allProducts
+        classFileManager.delete(products.map(converter.toVirtualFile).toArray)
         Analysis.empty
```

I would like to request assistance validating the above fix. Specifically, I want to run https://github.com/jackkoenig/zinc-bug-example with a custom zinc build.

@eed3si9n Would you mind advising me on how this can be done?

@jackkoenig
Author

@Friendseeker the unit test I wrote in #1235 does a pretty good job of testing the issue, although I guess it would still be good to check my repo as an integration test.

@Friendseeker
Member

Friendseeker commented Nov 30, 2023

> @Friendseeker the unit test I wrote in #1235 does a pretty good job of testing the issue, although I guess it would still be good to check my repo as an integration test.

Thanks for the info! I guess in the worst case, we can... merge first and run the integration test later, once a new sbt version is released.

@Friendseeker
Member

Friendseeker commented Nov 30, 2023

Just as documentation for future peers: I have finally figured out how to use a custom zinc build!

It is a hack, but it works for manual integration testing purposes.

When sbt is invoked, it looks up dependency jars in

~/.sbt/boot/scala-[scala-version]/org.scala-sbt/sbt/[sbt-version]

which contains various zinc jars.

Therefore, in the zinc repo, run publishLocal, and then replace the zinc jars in ~/.sbt/boot/scala-[scala-version]/org.scala-sbt/sbt/[sbt-version] with the locally published ones.
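
The steps above, sketched as shell commands. The bracketed versions are placeholders from the path above, and the ~/.ivy2/local layout is an assumption about where publishLocal writes its artifacts; adjust both to your machine.

```shell
# in the zinc checkout: build and publish the patched zinc jars locally
sbt publishLocal

# then overwrite sbt's bundled zinc jars with the locally published ones
# (fill in the bracketed versions; the ivy path is illustrative)
BOOT_DIR=~/.sbt/boot/scala-[scala-version]/org.scala-sbt/sbt/[sbt-version]
cp ~/.ivy2/local/org.scala-sbt/zinc*/[zinc-version]/jars/*.jar "$BOOT_DIR"/
```

Keep a backup of the original jars so you can restore the stock sbt afterwards.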
