
Leak on parceler compiler (org.parceler.ParcelAnnotationProcessor$$Bootstrap) #378

Closed
doniwinata0309 opened this issue Jan 6, 2020 · 31 comments

@doniwinata0309 (Author) commented Jan 6, 2020

Hi,
Our build has a memory leak on org.parceler.ParcelAnnotationProcessor$$Bootstrap. The build gets stuck once the JVM (daemon) memory fills up, and no GC can reclaim it. The heap dump shows the leak, with org.parceler.ParcelAnnotationProcessor$$Bootstrap as the dominator object.
It takes around 50% of our heap size (around 2 GB).

The images below show the dominator and leaked objects when compiling our project:

[Screenshot: dominator object]

[Screenshot: instance detail]

[Screenshot: leak]

This screenshot shows our main daemon after it filled up because of the Parceler leak. At this point the build hangs and never completes (without even causing an OOM).

[Screenshot: Screen Shot 2020-01-02 at 15 15 25]

It seems to come from this file:
https://github.com/johncarl81/parceler/blob/master/parceler/src/main/java/org/parceler/ParcelAnnotationProcessor.java#L4

Do you have a clue what might be wrong? We suspect that Transfuse, as the DI manager, creates a static variable that holds a reference to that class, but I am not really sure which part causes this issue.

We have 50+ modules that apply parceler_api (the annotation processor) and use a lot of @Parcel classes.
Let me know if you need additional information. Thanks.

@johncarl81 (Owner)

Hmm, this is certainly concerning. Could you help me reproduce it, @doniwinata0309?

@doniwinata0309 (Author)

This is a sample to reproduce the leak:
https://github.com/doniwinata0309/build-perf-test/tree/test/parceler

  1. Check out the test/parceler branch.
  2. Run "./gradlew androidAppModule0:aFD --rerun-tasks" twice.
  3. Use VisualVM to generate a heap dump of the Gradle daemon.
  4. Open it with MAT; there should be a leak on Parceler.

I also attached the leak results on the branch https://github.com/doniwinata0309/build-perf-test/tree/test/parceler:
one with kapt incremental enabled, one with kapt incremental disabled.

@doniwinata0309 (Author) commented Jan 8, 2020

The REPOSITORY object is referenced in the tree view:
https://github.com/johncarl81/transfuse/blob/master/transfuse-bootstrap/src/main/java/org/androidtransfuse/bootstrap/Bootstraps.java
It seems it always adds a new GeneratedCodeRepository object, causing the leak there. The object also grows as we add more @Parcel classes, I guess, and multiple instances appear in the leak.
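
The shape described here, a static registry that gains entries under fresh keys on every build, is a classic annotation-processor leak. A minimal, hypothetical sketch of that pattern (the names `LeakyRepository` and `register` are illustrative, not Transfuse's actual API):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical illustration of the suspected leak shape: a static map
// lives as long as its class loader (here, the Gradle daemon's JVM),
// and because each build registers entries under fresh keys, the map
// only ever grows and pins every value it holds.
public class LeakyRepository {
    private static final Map<String, Object> REPOSITORY = new ConcurrentHashMap<>();

    public static void register(String key, Object generatedCodeRepository) {
        // Entries are never evicted: values from every previous build
        // stay strongly reachable for the daemon's entire lifetime.
        REPOSITORY.put(key, generatedCodeRepository);
    }

    public static int size() {
        return REPOSITORY.size();
    }
}
```

With keys unique per run (e.g. prefixed by a build id), every `--rerun-tasks` build would add another copy instead of reusing the cached one.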

@johncarl81 (Owner)

Hmm, maybe I'm out of touch with Gradle these days, but I can't get ./gradlew androidAppModule0:aFD --rerun-tasks to run. I get the following:

./gradlew androidAppModule0:aFD --rerun-tasks

FAILURE: Build failed with an exception.

* What went wrong:
Task 'aFD' not found in project ':androidAppModule0'.

* Try:
Run gradlew tasks to get a list of available tasks. Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 546ms

Is aFD an acronym for some build task?

@doniwinata0309 (Author)

Yes, it is an abbreviation of ./gradlew androidAppModule0:assembleFlavDebug --rerun-tasks.
Let me know whether this command works.

@johncarl81 (Owner)

Ok, I'm up and running. I haven't been able to pinpoint the URLClassLoader memory leak you're referring to.

If you perform a GC after the build, is memory cleared back up to normal?

The point of the Bootstraps class is to look up, by reflection, a generated code class. Any idea what's holding on to the reference to keep it from being GC'd (if that's truly the case)?

@doniwinata0309 (Author)

Did you use the MAT analyzer and dump the Gradle daemon? Mine produces about a 1.2 GB hprof file after two builds with --rerun-tasks.

"If you perform a GC after the build, is memory cleared back up to normal?"

No, it still leaks after I pressed GC. This is the detailed leak I just captured:

[Screenshot: Screen Shot 2020-01-10 at 00 43 10]

I think
scoped org.parceler.ParcelAnnotationProcessor$$Bootstrap @ 0x76521f2d0
and REPOSITORY class org.parceler.transfuse.bootstrap.Bootstraps @ 0x76521e9f8
cannot be GC'd, because it seems they are static objects.

@doniwinata0309 (Author)

Is it because the @Bootstrap scope prevents ParcelAnnotationProcessor from being GC'd? ParcelAnnotationProcessor itself becomes really huge because it refers to Bootstraps, which also holds the REPOSITORY object. Will Bootstraps keep growing as more classes use @Parcel? I wonder if we can somehow destroy the Bootstraps/REPOSITORY object, so that even if ParcelAnnotationProcessor cannot be GC'd, at least it leaks far less memory.
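
The intuition that static objects "cannot be GC'd" can be made concrete: a static field is a GC root, so its referent survives every collection until the field itself is cleared. A small demonstration, unrelated to Parceler's actual code:

```java
import java.lang.ref.WeakReference;

// Illustration (not Parceler code) of why a static field blocks GC:
// the static field keeps its referent strongly reachable, and only
// clearing the field makes the object collectible again.
public class StaticRootDemo {
    static Object repository = new byte[1024 * 1024]; // stand-in for REPOSITORY

    public static boolean becomesCollectibleAfterClear() {
        WeakReference<Object> ref = new WeakReference<>(repository);
        repository = null; // drop the only strong reference
        // System.gc() is only a hint, so retry a few times until collected.
        for (int i = 0; i < 10 && ref.get() != null; i++) {
            System.gc();
        }
        return ref.get() == null;
    }
}
```

While `repository` is still set, the same `WeakReference` would never be cleared, which is exactly the situation in the heap dump above.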

@johncarl81 (Owner)

This should be easy to fix then... we just need to remove the cache located here.
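
A hedged sketch of the fix direction, not the actual Transfuse change: hold the repository in an instance field instead of a static cache, so it becomes unreachable, and therefore collectible, when the processor instance is discarded at the end of the build.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: the cache is scoped to this object's lifetime
// rather than to the class loader's, so it dies with the processor.
public class Bootstrapper {
    // Instance-scoped cache instead of a static one.
    private final Map<String, Object> repository = new HashMap<>();

    public Object lookup(String key) {
        // Caches within one processor run only; nothing outlives the instance.
        return repository.computeIfAbsent(key, Bootstrapper::loadGenerated);
    }

    // Stand-in for the reflective lookup of a generated class.
    private static Object loadGenerated(String key) {
        return "generated:" + key;
    }
}
```

Each build would pay the lookup cost again, but that is the trade-off being discussed below: slightly more work per run in exchange for no permanent retention.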

@doniwinata0309 (Author)

It should be worthwhile if it is safe to remove that field. I would love to try that; can you give me advice on how to try it from some branch or patch?

google/dagger@2902519
Dagger also had a similar issue with a cache holding huge objects in memory, and they removed the cache entirely.

@johncarl81 (Owner) commented Jan 11, 2020

This should do the trick: https://github.com/johncarl81/transfuse/tree/remove_bootstraps_repo (johncarl81/transfuse@a5ec966). To test it:

  1. Build Transfuse and install it into your local Maven repository.
  2. Change this in Parceler to 0.3.0-SNAPSHOT, build it, and install it into your local Maven repository.
  3. In your project, reference your local Maven repository in your Gradle build (mavenLocal()).

@doniwinata0309 (Author)

Thanks @johncarl81, I built the Maven artifacts and tried them in our project. The leak is gone and the daemon no longer gets stuck; it certainly has a huge impact on our full build.
Would love to use a released version of this fix very soon 👍

@johncarl81 (Owner) commented Jan 12, 2020

Awesome! Thanks for testing it out.

We use a similar technique in the Parcels utility here. I imagine the Bootstraps issue is more prevalent because new keys are used for each run, vs. the same class keys used in the Parcels class. But Parcels is used at runtime, where a leak would be a much larger issue for users. Do you see any memory issues around the Parcels utility class in your project?

@doniwinata0309 (Author) commented Jan 13, 2020

So will the generatedMap hold everything generated all at once, or load entries one by one depending on which page we open (if a page contains more @Parcel classes, will it generate a new static object)? So far I don't see any issue with this. I did a quick check on our app and that object seems to take only around 1.4 KB of runtime memory. This is the generatedMap object you are referring to at runtime:

[Screenshot: Screen Shot 2020-01-13 at 11 38 31]

@johncarl81 (Owner)

Parcels loads the generatedMap lazily and holds it for the lifetime of the application. It really shouldn't load much, and it mitigates some of the runtime overhead of loading class definitions via reflection.
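
That lazy, load-once caching can be sketched as a `computeIfAbsent` over a static map. The class and method names below are illustrative stand-ins, not the real Parcels API:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of runtime lazy caching: an entry is loaded on
// first use and then held for the lifetime of the application. Because
// the keys are fixed class names (unlike build-time keys that differ on
// every run), the map stays small and bounded.
public class ParcelsSketch {
    private static final Map<String, Object> GENERATED_MAP = new ConcurrentHashMap<>();

    public static Object wrap(String className) {
        // First call pays the lookup cost; subsequent calls hit the cache.
        return GENERATED_MAP.computeIfAbsent(className, ParcelsSketch::loadFactory);
    }

    // Stand-in for reflectively loading the generated Parcelable factory.
    private static Object loadFactory(String className) {
        return "factory:" + className;
    }
}
```

This is why the same static-map shape is acceptable at runtime: the key set is fixed by the app's @Parcel classes, so the cache converges instead of growing without bound.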

@doniwinata0309 (Author)

I see. I think the current behaviour of the Parcels class is fine for our apps, as it does not use much memory and mitigates runtime overhead, as you said.

@johncarl81 (Owner)

Ok, I'll look at making a release with this update shortly.

@doniwinata0309 (Author)

Hi @johncarl81, any ETA for the new release with this fix? Thanks.

@doniwinata0309 (Author)

Hi @johncarl81, I hope you are doing well. Any chance this bugfix will be deployed soon? Thanks.

@johncarl81 (Owner)

Shoot, I've been meaning to do a release, just been backed up with other priorities. Keep bugging me if I don't get to it.

@doniwinata0309 (Author)

Understood, thank you for letting me know. I will remind you again in a few weeks :)

@johncarl81 (Owner)

Released to Maven Central under 1.1.13.

@doniwinata0309 (Author)

Thank you for the release and for fixing this issue; I greatly appreciate your time.

@johncarl81 (Owner)

@doniwinata0309, just to be doubly sure, is this release working 100% for you now?

@doniwinata0309 (Author)

I ran a benchmark today on my CI; so far I have not seen the significant improvement in build speed I expected. But VisualVM shows a much smaller memory footprint compared to the previous version. Could the change in johncarl81/transfuse@a5ec966 possibly cause more CPU usage, because it removes the static classes that hold generated code? I will give more info after comparing CPU usage on this release against the previous one.

@johncarl81 (Owner)

Perhaps. It will have to stand up the environment fresh for each run of the processor, but that's what we want here. Curious to see your results.

@doniwinata0309 (Author)

Hi @johncarl81, I analysed Parceler 1.1.12 and 1.1.13 against our build in this doc:
Parceler 1.1.13-2.pdf
I think the profiler shows no big differences in CPU activity or number of threads. The memory footprint on 1.1.13 is much better and causes fewer major GCs (which is very good), so we have decided to update to it later. I am not sure yet what causes the memory spike when processing the application module; it is probably not related to Parceler.
Let me know what you think about the doc.

@johncarl81 (Owner)

Good stuff @doniwinata0309. Is this analyzed against https://github.com/doniwinata0309/build-perf-test/tree/test/parceler ?

@doniwinata0309 (Author)

No, it is the test result from my real project actually 😄

@johncarl81 (Owner)

I think we solved the immediate issue, but I am still curious about other touchups we could make, including around the memory leak you're referring to. If this issue is solved, open another issue if anything else comes up. Sound good? And thanks again for bringing this to our attention!

@doniwinata0309 (Author)

Sure, I will let you know if I find a similar problem again; so far it looks good on the newest Android Gradle Plugin. I really appreciate your time fixing this issue. Thanks for your help 🙏 🙏
