[BUG] Memory leak #97

Open
MasoudKh87-git opened this issue Apr 19, 2021 · 4 comments
Labels: bug (Something isn't working) · help wanted (Extra attention is needed)

Comments

@MasoudKh87-git

Hi Andrew,

When I use this component with large datasets, the program slows down due to memory leaks.

[Screenshot: memory-leak profiler snapshot]

MasoudKh87-git added the bug label on Apr 19, 2021
@avallete (Owner)

Hi,

This issue will be hard to investigate without more details/examples to reproduce.

If you can provide an MWE (minimal working example), I'll be able to investigate this issue further.

@MasoudKh87-git (Author)

I have 29,000 records.
The first time, it renders all 29,000 records and shows them perfectly. BUT when I use the arrow keys to move up/down in the list, it re-renders the whole list, which makes movement in the list considerably slow.
I think it is not destroying the listeners and nodes that were already created, which is why the memory leaks occur.

@avallete (Owner)

The items list elements are removed from the DOM on each call to update, and I don't see where references could be kept that would prevent garbage collection. I will investigate it further.

However, the slowdown might not even be due to leaks but to the way events and rendering itself are handled:

const fragment = renderItems<T>(items, getSelected(), getInputValue(), itemClickHandler, render, renderGroup)

which removes and re-renders all elements on every call to update(). This part of the code could surely use some optimization to play more nicely with huge amounts of data.
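
In rough pseudocode, the pattern boils down to something like this (a minimal sketch only; `container` and `buildNode` are hypothetical stand-ins for the component's internals, not the actual API):

```ts
// Minimal sketch of the clear-and-rebuild pattern described above.
// "container" and "buildNode" are placeholders, not the library's real API.
function update<T>(
  container: HTMLElement,
  items: T[],
  buildNode: (item: T) => HTMLElement,
): void {
  const fragment = document.createDocumentFragment();
  for (const item of items) {
    fragment.appendChild(buildNode(item)); // rebuild every row from scratch
  }
  container.innerHTML = '';        // drop all existing nodes
  container.appendChild(fragment); // re-insert the whole list: O(n) per update
}
```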

I never used it, nor designed it, to handle this amount of data locally. I usually leverage the fetch API and some request filtering (limit/pagination) to move the processing load from the client to the server.
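
For reference, a rough sketch of that approach; the endpoint and query parameters here are made up, so adapt them to your own backend:

```ts
interface Item {
  label: string;
  value: string;
}

// Hypothetical endpoint and query parameters -- adjust to your backend.
async function fetchItems(query: string, limit = 50): Promise<Item[]> {
  const url = `/api/items?search=${encodeURIComponent(query)}&limit=${limit}`;
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return response.json(); // the server only ever returns `limit` records
}
```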

What could be done:

  • Add some "big workload" tests with data-gen tools (e.g. faker) to time the behavior on 1k, 5k, 10k, 20k, and 30k+ entries (see the sketch after this list).
  • Search for memory leaks.
  • The rendering part probably also needs to be refactored into a less naive approach.
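
A rough sketch of what such a timing test could look like (this uses @faker-js/faker, and `update` is a placeholder for whatever triggers a full re-render; both are assumptions, not the project's actual test setup):

```ts
import { faker } from '@faker-js/faker';

interface Item {
  label: string;
  value: string;
}

// Generate n fake records to stress the component.
function makeItems(n: number): Item[] {
  return Array.from({ length: n }, () => ({
    label: faker.person.fullName(),
    value: faker.string.uuid(),
  }));
}

// Stand-in for the component's update(); replace with the real call.
function update(items: Item[]): void {
  /* render `items` here */
}

for (const n of [1_000, 5_000, 10_000, 20_000, 30_000]) {
  const items = makeItems(n);
  const start = performance.now();
  update(items);
  console.log(`${n} entries rendered in ${(performance.now() - start).toFixed(1)} ms`);
}
```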

avallete added the help wanted label on Apr 20, 2021
@MasoudKh87-git (Author)

Hi,
Thanks for your comment. That's right, and I handled this issue by limiting the list after fetching.
But as you can see in the memory-leak snapshot, the nodes (green line) and listeners (orange line) are not destroyed after disposing of the component!
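
For what it's worth, a minimal sketch of the kind of teardown that would release them; the class and names here are illustrative, not the component's actual code:

```ts
// Illustrative only: a dispose() that detaches what the constructor attached.
class Autocomplete {
  private container = document.createElement('div');
  private onInput = (): void => {
    /* filter the data and re-render this.container */
  };

  constructor(private input: HTMLInputElement) {
    this.input.addEventListener('input', this.onInput);
    document.body.appendChild(this.container);
  }

  dispose(): void {
    this.input.removeEventListener('input', this.onInput); // listener count drops
    this.container.remove(); // detached nodes become collectable
  }
}
```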
