
Issue with Large Logseq Graphs Exceeding 300MB Leading to Startup Failures #11236

Open
1 of 2 tasks
Lluvia95 opened this issue Apr 15, 2024 · 5 comments
Labels
:type/performance Performance related, speed or cpu usage.

Comments

@Lluvia95

Search first

  • I searched and no similar issues were found

What Happened?

Hello everyone,

I’m reaching out to the community for assistance with an issue I’ve encountered using Logseq with a significantly large graph. My current graph has grown to include over 30,000 files and continues to expand. However, I’ve observed a consistent issue where the application fails to launch whenever the graph’s size surpasses 300 MB.

Has anyone experienced similar issues with large graphs? Any insights, workarounds, or solutions would be greatly appreciated.

Reproduce the Bug

Create a graph with a lot of files and links until its size surpasses 300 MB.

Expected Behavior

No response

Screenshots

No response

Desktop or Mobile Platform Information

No response

Additional Context

No response

Are you willing to submit a PR? If you know how to fix the bug.

  • I'm willing to submit a PR (Thank you!)
@andelf
Collaborator

andelf commented Apr 15, 2024

Currently, Logseq loads all data into memory as a DataScript DB, which is the root cause of performance issues in large graphs.
May I know what's in your graph? How many pages and journals, and how many asset files? And what's the size of the largest file?
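To illustrate the point, here is a minimal conceptual sketch (written in TypeScript for readability; Logseq itself is ClojureScript on top of DataScript, and this is not its actual loading code) of why heap usage grows with the total size of the graph: every page is parsed into block entities that all stay resident in a single in-memory store.

```typescript
// Conceptual sketch only — illustrates the "everything in memory" pattern,
// not Logseq's real implementation.
import * as fs from "fs";
import * as path from "path";

type Block = { page: string; content: string };

function loadGraphIntoMemory(graphDir: string): Block[] {
  const db: Block[] = []; // stands in for the in-memory DataScript DB

  for (const file of fs.readdirSync(graphDir)) {
    if (!file.endsWith(".md")) continue;
    const text = fs.readFileSync(path.join(graphDir, file), "utf8");

    // Every line becomes a block entity; all of them remain resident in
    // memory for the lifetime of the app, alongside the indexes built on top.
    for (const line of text.split("\n")) {
      db.push({ page: file, content: line });
    }
  }
  return db;
}

// With ~70,000 pages, this array (and the real DB's indexes) must fit in the
// renderer process's heap at once, which is consistent with startup failures
// once a graph passes a few hundred MB.
```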

@andelf andelf added the :type/performance Performance related, speed or cpu usage. label Apr 15, 2024
@Lluvia95
Author

Thank you for explaining the performance issues with large graphs in Logseq due to how data is loaded into memory. Indeed, I do have a quite extensive graph with 69,581 pages, and tens of thousands of asset files. In terms of solutions, would upgrading my computer's RAM significantly mitigate these performance issues?

@Lluvia95
Author

[Screenshot: console output] Every time I open Logseq with a graph larger than 300 MB, after a while the console ends up looking like this.

@Lluvia95
Author

I've now backed up a graph that's 299 MB in size, and every time I restart Logseq I have to endure a lengthy wait before it opens.

@Lluvia95
Author

Lluvia95 commented May 7, 2024

Thank you for explaining the performance issues with large graphs in Logseq due to how data is loaded into memory. I have quite an extensive graph, consisting of 69,581 pages and tens of thousands of asset files, which likely contributes to the slowdowns I've been experiencing.

Interestingly, I noticed that when I open Logseq with a backup of my 299 MB graph, the application completes the reindexing and then exits immediately. After this process, the graph's size increases to 310 MB, leading me to suspect that the issue may not be solely related to my system's memory. It could be a software issue, possibly the program hitting a critical limit during initialization and causing an overflow or crash.

Would you have any recommendations on how to address this behavior or strategies to mitigate it?

Thank you for your guidance!
