Out Of Memory with dagger 2.25.2 #1645

Closed
lucas34 opened this issue Oct 24, 2019 · 16 comments
lucas34 commented Oct 24, 2019

Hi, I recently tried to update to Dagger 2.25.2 from 2.24, and I'm hitting an OOM during kapt.
I tried changing the configuration to allow far more memory (a memory-settings sketch follows the stack trace below), and I noticed that annotation processing is around 2 to 3x slower compared to 2.24.
The project is pretty huge, almost fully in Kotlin, and has hundreds of modules.

EDIT:
I'm having the issue with Gradle plugin 3.5.1, but it does not happen with 3.6.0-beta1.

e: java.lang.IllegalStateException: failed to analyze: java.lang.OutOfMemoryError: GC overhead limit exceeded
	at org.jetbrains.kotlin.analyzer.AnalysisResult.throwIfError(AnalysisResult.kt:56)
	at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler.compileModules$cli(KotlinToJVMBytecodeCompiler.kt:182)
	at org.jetbrains.kotlin.cli.jvm.K2JVMCompiler.doExecute(K2JVMCompiler.kt:164)
	at org.jetbrains.kotlin.cli.jvm.K2JVMCompiler.doExecute(K2JVMCompiler.kt:54)
	at org.jetbrains.kotlin.cli.common.CLICompiler.execImpl(CLICompiler.kt:84)
	at org.jetbrains.kotlin.cli.common.CLICompiler.execImpl(CLICompiler.kt:42)
	at org.jetbrains.kotlin.cli.common.CLITool.exec(CLITool.kt:104)
	at org.jetbrains.kotlin.daemon.CompileServiceImpl.compile(CompileServiceImpl.kt:1558)
	at sun.reflect.GeneratedMethodAccessor109.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:357)
	at sun.rmi.transport.Transport$1.run(Transport.java:200)
	at sun.rmi.transport.Transport$1.run(Transport.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at sun.rmi.transport.Transport.serviceCall(Transport.java:196)
	at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:573)
	at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:834)
	at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:688)
	at java.security.AccessController.doPrivileged(Native Method)
	at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:687)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
	at com.sun.tools.javac.code.Scope.makeEntry(Scope.java:231)
	at com.sun.tools.javac.code.Scope.enter(Scope.java:220)
	at com.sun.tools.javac.code.Scope.enter(Scope.java:202)
	at com.sun.tools.javac.code.Scope.enter(Scope.java:198)
	at com.sun.tools.javac.comp.MemberEnter.visitVarDef(MemberEnter.java:683)
	at com.sun.tools.javac.tree.JCTree$JCVariableDecl.accept(JCTree.java:852)
	at com.sun.tools.javac.comp.MemberEnter.memberEnter(MemberEnter.java:437)
	at com.sun.tools.javac.comp.MemberEnter.memberEnter(MemberEnter.java:449)
	at com.sun.tools.javac.comp.MemberEnter.finishClass(MemberEnter.java:459)
	at com.sun.tools.javac.comp.MemberEnter.finish(MemberEnter.java:1404)
	at com.sun.tools.javac.comp.MemberEnter.complete(MemberEnter.java:1199)
	at com.sun.tools.javac.code.Symbol.complete(Symbol.java:574)
	at com.sun.tools.javac.code.Symbol$ClassSymbol.complete(Symbol.java:1037)
	at com.sun.tools.javac.comp.Enter.complete(Enter.java:493)
	at com.sun.tools.javac.comp.Enter.main(Enter.java:471)
	at com.sun.tools.javac.main.JavaCompiler.enterTrees(JavaCompiler.java:982)
	at com.sun.tools.javac.main.JavaCompiler.enterTreesIfNeeded(JavaCompiler.java:965)
	at com.sun.tools.javac.processing.JavacProcessingEnvironment.doProcessing(JavacProcessingEnvironment.java:1242)
	at com.sun.tools.javac.main.JavaCompiler.processAnnotations(JavaCompiler.java:1170)
	at com.sun.tools.javac.main.JavaCompiler.processAnnotations(JavaCompiler.java:1068)
	at org.jetbrains.kotlin.kapt3.base.AnnotationProcessingKt.doAnnotationProcessing(annotationProcessing.kt:79)
	at org.jetbrains.kotlin.kapt3.base.AnnotationProcessingKt.doAnnotationProcessing$default(annotationProcessing.kt:35)
	at org.jetbrains.kotlin.kapt3.AbstractKapt3Extension.runAnnotationProcessing(Kapt3Extension.kt:230)
	at org.jetbrains.kotlin.kapt3.AbstractKapt3Extension.analysisCompleted(Kapt3Extension.kt:188)
	at org.jetbrains.kotlin.kapt3.ClasspathBasedKapt3Extension.analysisCompleted(Kapt3Extension.kt:99)
	at org.jetbrains.kotlin.cli.jvm.compiler.TopDownAnalyzerFacadeForJVM$analyzeFilesWithJavaIntegration$2.invoke(TopDownAnalyzerFacadeForJVM.kt:96)
	at org.jetbrains.kotlin.cli.jvm.compiler.TopDownAnalyzerFacadeForJVM.analyzeFilesWithJavaIntegration(TopDownAnalyzerFacadeForJVM.kt:106)
	at org.jetbrains.kotlin.cli.jvm.compiler.TopDownAnalyzerFacadeForJVM.analyzeFilesWithJavaIntegration$default(TopDownAnalyzerFacadeForJVM.kt:81)
	at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler$analyze$1.invoke(KotlinToJVMBytecodeCompiler.kt:555)
	at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler$analyze$1.invoke(KotlinToJVMBytecodeCompiler.kt:82)
	at org.jetbrains.kotlin.cli.common.messages.AnalyzerWithCompilerReport.analyzeAndReport(AnalyzerWithCompilerReport.kt:107)
	at org.jetbrains.kotlin.cli.jvm.compiler.KotlinToJVMBytecodeCompiler.analyze(KotlinToJVMBytecodeCompiler.kt:546)
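
For reference, a minimal sketch of the kind of memory bump described above, assuming a standard Gradle + kapt setup. The property names are the usual Gradle and Kotlin Gradle plugin settings; the heap values are placeholders, not taken from this project:

# gradle.properties
# Heap for the Gradle daemon.
org.gradle.jvmargs=-Xmx4g -XX:MaxMetaspaceSize=1g
# Heap for the Kotlin compile daemon, which hosts kapt's embedded javac
# (the process that threw the "GC overhead limit exceeded" error above).
kotlin.daemon.jvm.options=-Xmx4g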
bcorso commented Oct 24, 2019

Hmm, my initial guess is that this is due to reading the KotlinClassMetadata for each module (required for the new Kotlin-specific features). I'll take a look at this and see if anything obvious shows up in profiling.
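
For context, a rough sketch of what reading Kotlin class metadata inside an annotation processor looks like. This is illustrative only, not Dagger's implementation, and it assumes the kotlinx-metadata-jvm API of that era:

import javax.lang.model.element.TypeElement
import kotlinx.metadata.jvm.KotlinClassHeader
import kotlinx.metadata.jvm.KotlinClassMetadata

// Parse the @kotlin.Metadata annotation that the Kotlin compiler writes on
// every class. Doing this for large numbers of classes across hundreds of
// modules is the kind of work suspected of costing time and memory here.
fun readKotlinMetadata(element: TypeElement): KotlinClassMetadata? {
  val metadata = element.getAnnotation(Metadata::class.java) ?: return null
  val header = KotlinClassHeader(
      kind = metadata.kind,
      metadataVersion = metadata.metadataVersion,
      bytecodeVersion = metadata.bytecodeVersion,
      data1 = metadata.data1,
      data2 = metadata.data2,
      extraString = metadata.extraString,
      packageName = metadata.packageName,
      extraInt = metadata.extraInt)
  return KotlinClassMetadata.read(header)
}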

bcorso self-assigned this Oct 24, 2019
lucas34 commented Oct 24, 2019

I managed to compile by using Gradle plugin 3.6.0-beta1. I will continue testing with this one.

@Chang-Eric (Member)

When switching to 3.6.0-beta1, does that fix just the OOM, or does it also restore the build speed?

lucas34 commented Oct 25, 2019

With 3.6.0-beta1, the build speed is back to how it was with the old Dagger version.

@chrisharris77

I still get OutOfMemoryErrors with Dagger 2.25.2 and Gradle plugin 3.6.0-beta02.

gildor commented Nov 4, 2019

I think 3.6.0 helps (sometimes) only because a few 3.5 regressions that caused higher memory usage were fixed in 3.6. Most probably Dagger 2.25.2 itself uses more memory; it is just less visible on 3.6, which uses less memory overall, but the problem still exists (as @chrisharris77 mentioned).

@Chang-Eric (Member)

Thanks, we're looking into this. It is likely that the extra Kotlin metadata processing we need to do for the object class and qualifier on fields support is taking the extra memory.
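
As a hypothetical illustration (not code from this project) of the two features that require that metadata: an object module whose @Provides methods need no @JvmStatic, and a qualifier placed directly on an injected property without a @field: use-site target:

import dagger.Module
import dagger.Provides
import javax.inject.Inject
import javax.inject.Named

@Module
object ConfigModule {
  // @Provides method in an object module: one of the new Kotlin features.
  @Provides
  @Named("baseUrl")
  fun provideBaseUrl(): String = "https://example.com"
}

class HomeScreen {
  // Qualifier applied directly to an injected property: the other new feature.
  @Inject
  @Named("baseUrl")
  lateinit var baseUrl: String
}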

gavra0 commented Dec 11, 2019

R.java files (generated in 3.5) may cause very high memory usage. In 3.6 we generate an R.jar instead, which should save quite a bit of memory in the javac compiler (which kapt uses).

@Ethan1983

Subscribing; I just started using 2.25.2.

lucas34 commented Dec 17, 2019

Still getting out-of-memory errors after updating to 2.25.3.

doniwinata0309 commented Dec 19, 2019

I got 3 GB leaked from kapt (the Kotlin compiler daemon). From the heap-dump tree view, it seems to come from Dagger codegen's RequestKinds.
[Screenshots: heap-dump tree views, 2019-12-19]
Some people have also reported a similar issue:
https://youtrack.jetbrains.com/issue/KT-32962

bcorso commented Dec 23, 2019

Thanks for the heap dump. Looks like this is from a static cache in RequestKinds causing a memory leak, which is probably made worse due to the persistent GradleDaemon. We should have a fix shortly, and we'll push out a new minor release.
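
As an illustration of the failure mode (a sketch, not Dagger's actual code): a static cache in an annotation processor keyed on compiler model objects outlives a single compilation when the host process is a long-lived daemon, so its entries keep whole javac symbol tables reachable between builds:

import javax.lang.model.type.TypeMirror

class RequestKindResolver {
  fun kindFor(type: TypeMirror): String =
      // The companion-object map is effectively static: under a persistent
      // daemon it is never cleared, so it grows build after build, and each
      // retained TypeMirror pins the compiler structures behind it.
      cache.getOrPut(type) { computeKind(type) }

  private fun computeKind(type: TypeMirror): String =
      type.toString()  // placeholder for the real request-kind classification

  companion object {
    private val cache = HashMap<TypeMirror, String>()
  }
}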

cpovirk pushed a commit that referenced this issue Dec 24, 2019
See #1645

The static RequestKinds cache causes a memory leak in Gradle due to the persistent GradleDaemon. This CL removes that cache entirely and instead fixes the root cause of the performance issues by short-circuiting InjectionSiteVisitor to abort early if the method is not injectable. This avoids a lot of unnecessary calls to RequestKinds.getRequestKind() for methods that are not injectable (e.g., aren't annotated with @Inject).

The aggregated pprof results from 10 runs before and after changes in this CL are below.

                                            Before (s)    After (s)     Diff
-----------------------------------------------------------------------------
ComponentProcessingStep.process               51.80         43.25      -16.5%
RequestKinds.getRequestKind                    8.48          0.33      -95.9%
-----------------------------------------------------------------------------

RELNOTES=Fix memory leak with RequestKinds cache.

-------------
Created by MOE: https://github.com/google/moe
MOE_MIGRATED_REVID=286934462
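
A sketch of the short-circuit described in that commit message (illustrative, not the actual CL): check for @Inject before doing any request-kind work, so non-injectable members cost almost nothing:

import javax.inject.Inject
import javax.lang.model.element.ExecutableElement

fun visitMethod(method: ExecutableElement) {
  // Abort early: only @Inject members are injection sites.
  if (method.getAnnotation(Inject::class.java) == null) return
  // ... only now compute request kinds for the method's parameters ...
}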
cpovirk pushed a commit that referenced this issue Dec 26, 2019
See #1645

bcorso commented Dec 27, 2019

This should hopefully be fixed in 2.25.4.

@chao2zhang

2.25.4 resolves the out of memory issue in our case. We have 70k+ lines in the generated dagger component.

lucas34 commented Dec 27, 2019

Thanks, everyone. I will bump the version and close this issue once we can confirm the problem has gone away.

@doniwinata0309

I checked my heap dump from 2.25.4 and the leak is gone. Thanks for the fix!
