(bug) memory leak #551

Closed
lonadels opened this issue Mar 31, 2024 · 6 comments · Fixed by #553

Comments


lonadels commented Mar 31, 2024

bot.on(":someEvent", ctx => {});

After this, a memory leak occurs while the context type is being computed: RAM usage climbs to about 2 GB, tsc may hang until the computation finishes, and if there is not enough memory, the process crashes with an out-of-memory error. As long as the ctx => ... handler is not added, everything is fine.

This only happens with filter queries that start with a colon, and with shortcuts.

The bug occurs when the "Combine With OR" mechanism is used. That is why :text is affected: :text is an alias for msg:text, msg is shorthand for ["message", "channel_post"], and that is exactly such an OR combination.

bot.on("msg"); // still fine
bot.on("msg", ctx => {}); // memory leak
bot.on("message", ctx => {}); // still fine
bot.on(["message", "channel_post"], ctx => {}); // memory leak

lonadels commented Mar 31, 2024

Discussion: 1, 2, 3.

@lonadels
Author

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
----- Native stack trace -----

 1: 0xb84bd6 node::OOMErrorHandler(char const*, v8::OOMDetails const&) [node]
 2: 0xefead0 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, v8::OOMDetails const&) [node]
 3: 0xefedb7 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, v8::OOMDetails const&) [node]
 4: 0x11107c5  [node]
 5: 0x1128648 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [node]
 6: 0x10fe761 v8::internal::HeapAllocator::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [node]
 7: 0x10ff8f5 v8::internal::HeapAllocator::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [node]
 8: 0x10dcf46 v8::internal::Factory::NewFillerObject(int, v8::internal::AllocationAlignment, v8::internal::AllocationType, v8::internal::AllocationOrigin) [node]
 9: 0x1537cd6 v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [node]
10: 0x1971ef6  [node]
Aborted (core dumped)

@KnorpelSenf
Member

This is known to be caused by a performance regression in the TypeScript compiler. The compiler needs to be updated in order to fix this. There's nothing grammY can do.

Fortunately, this was reported before in #543 so we were able to report this back to Microsoft along with a minimal reproduction example that I had created months in advance (I expected something like this to happen for a different reason). Microsoft has acknowledged and fixed the bug already. However, the fix isn't released yet.

For now, you can fix this using:

npm install typescript@next

Please let me know if that helps.
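
To verify which compiler version the project actually resolves after the install (assuming npm is used), a quick check is:

npx tsc --version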

@KnorpelSenf
Member

It seems like the underlying perf improvement does not suffice to fix this, so you probably need to go with typescript@5.3 for now. I am investigating this, but it's a pretty hard problem, so I have no idea how long it will take or what the outcome will be.
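
For example, pinning the compiler to that release with npm (the --save-exact flag just keeps the version from floating forward) could look like:

npm install --save-dev --save-exact typescript@5.3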

@KnorpelSenf
Member

You may want to help out by looking at https://github.com/grammyjs/grammY/tree/fix-fq?tab=readme-ov-file#how-to-repro

@KnorpelSenf
Member

@lonadels please review #553
