v8 heap usage is larger than I'd expect with LL enabled. #22119

Open
ara4n opened this issue May 9, 2022 · 5 comments
Labels
A-Lazy-Loading A-Memory Memory leaks, leak hunting tools A-Performance O-Occasional Affects or can be seen by some users regularly or most users rarely S-Minor Impairs non-critical functionality or suitable workarounds exist T-Defect

Comments

@ara4n
Member

ara4n commented May 9, 2022

Steps to reproduce

I just did a heap snapshot about 20 minutes after launching Element Nightly and was surprised to see it contained 192,329 MatrixEvent objects and 157,401 RoomMember objects:

[screenshot: heap snapshot, 2022-05-09 11:46:20]

(heap available if needed).

This feels crazily high, given the whole point of membership Lazy Loading is to only load members for the rooms your client is interacting with, and I had only viewed 3 rooms, each with <50 users present, at that point in time.
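(For reference, a minimal sketch of how membership lazy loading is switched on when the js-sdk client starts - the createClient() values here are placeholders, not the real Element config:)

```js
// Sketch only: opting a matrix-js-sdk client into membership lazy loading.
// The createClient() values below are placeholders.
const sdk = require("matrix-js-sdk");

const client = sdk.createClient({
    baseUrl: "https://matrix.example.com", // placeholder homeserver
    accessToken: "<access token>",         // placeholder credentials
    userId: "@me:example.com",
});

// With lazyLoadMembers enabled, /sync should only include the membership
// events needed to render the returned timelines, not full member lists.
client.startClient({ lazyLoadMembers: true });
```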

It's possible that, given this is after ~12h of incremental sync, processing E2EE traffic in all my other rooms had caused their memberships to be loaded (in order to speak E2EE) - but I'm still surprised this resulted in as many as 157K room members. The 35K non-member events also feel way too high.

Is it possible that threads are leaking events, or triggering membership loads which then undermine the memory improvements of lazy loading?

This might also be contributing to the OOM in element-hq/element-desktop#680

Outcome

What did you expect?

Roughly 300MB of heap to be in use on a fresh launch

What happened instead?

Roughly 600MB of heap to be in use, which may leak further, and might also be causing OOMs overnight. I will update this bug with further heap dumps, which will hopefully differentiate between LL membership warming up and actual leaks.
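(In between proper heap snapshots, a crude way to watch whether the heap keeps growing is to log Chromium's non-standard performance.memory counters from the devtools console - a sketch, assuming that API is exposed in the Electron build:)

```js
// Sketch: log approximate v8 heap usage once a minute from the devtools console.
// performance.memory is non-standard (Chromium-only) and may not be available.
setInterval(() => {
    const mem = window.performance && window.performance.memory;
    if (!mem) return;
    console.log(
        new Date().toISOString(),
        `used ${(mem.usedJSHeapSize / 1024 / 1024).toFixed(1)} MiB /`,
        `total ${(mem.totalJSHeapSize / 1024 / 1024).toFixed(1)} MiB`,
    );
}, 60000);
```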

Operating system

macOS 12.3 on M1

Application version

Nightly

How did you install the app?

No response

Homeserver

No response

Will you send logs?

No

@ara4n ara4n added the T-Defect label May 9, 2022
@ara4n
Member Author

ara4n commented May 9, 2022

(This is with threads turned on)

@ara4n
Member Author

ara4n commented May 9, 2022

This is with 'only' 95 Thread objects running around, btw - although 95 is still high, given I hadn't consciously interacted with any threads, or viewed many rooms:

[screenshot: heap snapshot, 2022-05-09 11:54:06]
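(For anyone reproducing this, a quick console sketch to see which rooms those Thread objects belong to - assuming Room.getThreads() exists in this js-sdk build, since the threads API was still in flux at the time:)

```js
// Sketch: count Thread objects per room, assuming Room.getThreads() is available.
let total = 0;
for (const room of mxMatrixClientPeg.get().getRooms()) {
    const threads = (room.getThreads && room.getThreads()) || [];
    if (threads.length > 0) {
        total += threads.length;
        console.log(room.roomId, threads.length);
    }
}
console.log("total Thread objects:", total);
```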

@novocaine novocaine changed the title We are no longer lazy-loading room members as much as we should/could Excess memory usage due to no longer lazy-loading room members as much as we should/could May 9, 2022
@novocaine novocaine added S-Minor Impairs non-critical functionality or suitable workarounds exist O-Occasional Affects or can be seen by some users regularly or most users rarely A-Performance A-Memory Memory leaks, leak hunting tools labels May 9, 2022
@ara4n
Member Author

ara4n commented May 9, 2022

It's not really related to processing a big incremental sync, as the profile is roughly the same immediately after a clean restart of the app:

[screenshot: heap snapshot, 2022-05-09 13:28:02]

@ara4n
Member Author

ara4n commented May 9, 2022

Counting my number of encrypted rooms, and the total number of member events across them (so a user present in multiple rooms is counted once per room):

let e = 0, me = 0;
for (const r of mxMatrixClientPeg.get().getRooms()) {
    if (r.currentState.events.get('m.room.encryption')) {
        e++;
        me += r.currentState.events.get('m.room.member').size;
    }
}
console.log(e, me);

1311 6419

So, I still have ~100K room members unaccounted for.

@ara4n
Member Author

ara4n commented May 9, 2022

...and looking at all rooms, I only have 34K different member events:

let e = 0, me = 0;
for (const r of mxMatrixClientPeg.get().getRooms()) {
    e++;
    me += r.currentState.events.get('m.room.member').size;
}
console.log(e, me);

3907 34445

I guess it's possible that these are being stored 4 times (two sentinels per room, across oldState and currentState or something?), which could bring us up to the ~120K mark.

This only averages 34445/3907 ≈ 9 members per room, which isn't implausible, given we load the 20 most recent messages in each room, or whatever.

So, I don't think LL is being undermined (from memory there are around 400K different room members visible from my account when LL is disabled) - the surprising number of members is more plausibly explained by us cloning them 4x.
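(One way to sanity-check the cloning theory from the console is to compare distinct RoomMember instances against the total held across oldState and currentState - a sketch only; it won't see sentinels hanging off individual events:)

```js
// Sketch: distinct RoomMember instances vs. total references reachable
// from oldState/currentState, to test the "stored 4x" theory.
const distinct = new Set();
let refs = 0;
for (const room of mxMatrixClientPeg.get().getRooms()) {
    for (const state of [room.oldState, room.currentState]) {
        for (const member of state.getMembers()) {
            distinct.add(member);
            refs++;
        }
    }
}
console.log("distinct RoomMember instances:", distinct.size, "references:", refs);
```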

Separately, we may still have a leak somewhere accounting for the additional ~40K of events... but across 4000 rooms this is only an additional 10 events per room, which sounds plausible (a few messages of timeline + create + name + topic + PLs + any other random state). We might want to double-check that we're not preloading too much timeline, as in the sketch below.
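(Something like this from the console shows how many events each live timeline is actually holding - a rough sketch, not a full picture of what's retained:)

```js
// Sketch: events currently held in each room's live timeline.
let total = 0;
const perRoom = [];
for (const room of mxMatrixClientPeg.get().getRooms()) {
    const n = room.getLiveTimeline().getEvents().length;
    total += n;
    perRoom.push([room.roomId, n]);
}
perRoom.sort((a, b) => b[1] - a[1]);
console.log("total timeline events:", total, "top 10:", perRoom.slice(0, 10));
```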

Finally, we may have a leak causing element-hq/element-desktop#680 - I'll take another heap dump later in the day to see what the difference is.

@ara4n ara4n changed the title Excess memory usage due to no longer lazy-loading room members as much as we should/could v8 heap usage is larger than I'd expect with LL enabled. May 9, 2022