Massive initial memory usage and leak #1537
Comments
Duplicate of #1534. The AUR package is not created by the FAF devs and is not supported.
The issue persists with the program downloaded directly from GitHub. Additionally, I'd like to add that the AUR package is well written and does a nice job of integrating the client with the system; it doesn't do anything wrong. I'd point the finger of blame at being forced onto a JVM version that has been unsupported for two years, but who knows really. Considering the RAM usage doesn't even show as attached to the process, and doesn't return when the client is closed, something very bad is happening to the system.
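The "doesn't show as attached to the process" observation can be checked directly: if both the JVM's own heap figure and the kernel's resident set stay low while system memory vanishes, the allocation is happening outside the process. A minimal sketch of such a comparison (my own illustration, not FAF client code; the `/proc` path is Linux-specific):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class MemReport {
    public static void main(String[] args) throws IOException {
        // What the JVM thinks it is using on its heap.
        Runtime rt = Runtime.getRuntime();
        long heapUsed = rt.totalMemory() - rt.freeMemory();
        System.out.println("JVM heap used: " + heapUsed / (1024 * 1024) + " MiB");

        // What the kernel attributes to this process (Linux only).
        Path status = Path.of("/proc/self/status");
        if (Files.exists(status)) {
            Files.lines(status)
                 .filter(line -> line.startsWith("VmRSS"))
                 .forEach(System.out::println);
        }
    }
}
```

Memory leaked by a graphics driver or kernel subsystem can fall outside both numbers, which would match the behavior described here.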
Reopening this instead of #1534.
I'm unable to run this without pointing it to OpenJDK 10. OpenJDK 11 will trigger
I glanced at the code and the build specifically targets 10 compatibility, not 11 or higher. (Maybe they use a weird off-by-one internal API versioning or something; I'm not really familiar with desktop JDKs.)
We can change this in the installer; it is mis-set.
Changed it in the installer; the next version will allow you to use 11, 12, and 13.
I tested it with Java 11 and the memory usage seemed better in the client, but it failed catastrophically when I launched the game (faster even than the others), which is weird behavior. JDK 13 still prompts the warning saying that my Java version should be between 10 and 13. This may have to do with Java 13 using an internal version code of "HIGHER" rather than 13; I guess they intend to stop people from targeting specific JDKs. Launching the client with the JDK 13 path in the appropriate environment variable forces it to launch; however, the client UI is completely broken on JDK 13 and massive (and beautiful) stack traces show up on the console.
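For what it's worth, since Java 10 the runtime exposes its feature release as a plain integer via `Runtime.version().feature()`, which returns 13 on JDK 13, so a range check built on that shouldn't trip over a "HIGHER" version string. A hedged sketch of what such a check could look like (my own illustration, not the client's actual check):

```java
public class JavaVersionCheck {
    public static void main(String[] args) {
        // feature() is the major release number: 10 on JDK 10, 13 on JDK 13, etc.
        int feature = Runtime.version().feature();
        if (feature < 10 || feature > 13) {
            System.err.println("Warning: Java version should be between 10 and 13, found " + feature);
        } else {
            System.out.println("Java feature version OK: " + feature);
        }
    }
}
```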
So, after testing on my laptop and finding the issue wasn't happening there, a few hours of chatting, and a long stack of happy coincidences, the issue has been narrowed down. There's a bug somewhere in the javafx/xorg/mesa/llvm/drm/kernel stack (a lot of the Spider-Man meme situation, with everyone pointing at each other) that causes memory not to be freed after being allocated. It's innocuous in desktop usage because the leak is really slow; it's likely that the hacked-together progress bars in JavaFX act as a torture test and trigger the bug in the extreme. Searching for "javafx mesa linux memory leak" turns up similar behavior reported as far back as 2016. There are two workarounds for the issue:
This, of course, nukes performance. But it's a 2d launcher, so... Added to the
This could be done automatically for Linux builds, but that would hurt users who don't have the presumably very specific software/hardware combo required to trigger the bug.
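For readers hitting the same bug: forcing JavaFX onto its software rendering pipeline (assuming that is the performance-nuking workaround referenced above) is done through the `prism.order` system property, which must be set before the JavaFX toolkit starts. A minimal sketch (my own illustration; the launch call is a hypothetical entry point, not the client's actual one):

```java
public class SoftwareRenderingLauncher {
    public static void main(String[] args) {
        // "sw" selects JavaFX's software pipeline instead of hardware acceleration.
        // Equivalent command-line form: java -Dprism.order=sw -jar faf-client.jar
        System.setProperty("prism.order", "sw");
        System.out.println("prism.order=" + System.getProperty("prism.order"));
        // Application.launch(FafClientApplication.class, args); // hypothetical entry point
    }
}
```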
Thanks for investigating. The progress bar has caused quite a bit of trouble so far (see #1371).
Well, I'll close this because apparently there is nothing we can do.
Describe the bug
Opening the client, and especially using network-dependent features such as downloading stuff from the Vault or simply hosting a co-op map, will consume gigabytes upon gigabytes of RAM (around 4-6 GB, then 8 when launching the client).
Additionally, the client's RAM usage will slowly creep up until the system runs out of RAM, regardless of what I do.
The console is surprisingly quiet about all of this. Some network errors pointing to unreachable URLs (which 404 if opened in a browser) appear from time to time.
To Reproduce
Install the client from the AUR, launch it, and monitor RAM with your preferred method.
Expected behavior
RAM usage is not world-ending.
OS
Arch Linux w/ mesa-git and Linux 5.5, but this has been happening for a long while.