Very slow performance and heavy stuttering on Linux #47
@gameblabla - Sorry to reply so late. I've recently made some slight changes to the interpolation code, so please try again and see if you get any performance increase. As Graf pointed out, E_Display was calling N_SetInterpolater() when it didn't really need to -- thus, I moved it into P_Tick(), and so far on my system it seems to result in fewer "wasted" frames. Since then, UsernameAK has made changes to the CMake system to support Linux. I also recommend you switch to the "arranged_source" branch, as that one has the most changes so far. It hasn't hit "main" yet because I'm waiting to see whether you Linux users can compile with the source files/directories rearranged slightly. So, please try that out first, then the changed code, and let me know if you notice any performance increase =)
@Corbachu I have the same problems :D
As noted in 3DGE-ISSUES.txt, -norenderbuffers is required on Linux with ATI. But I have Intel HD Graphics O_o
Intel HD Graphics... generally EDGE will run like crap on it. Any OpenGL-heavy engine isn't expected to perform well on Intel HD most of the time. I don't have one, so I can't say for sure. The desktop machine, though, is beefy enough... strange...
@Corbachu hmmm... Doom 3 gives 60 fps without glitches on my Intel HD at maximum graphics quality (I know a guy who has ATI and gets 15 fps, even though the specs are almost the same), but EDGE runs like crap on it.
i5-6400 and a GTX 680 here, using the proprietary NVIDIA drivers, of course. I also notice some stuttering despite it running constantly at 60 FPS.
I just walked through the source code with GDB and found a problem: https://github.com/3dfxdev/hyper3DGE/blob/3fdb5f080b7517ebd3473d2dad6cee94ed2ee989/src/r_shaderprogram.cc#L145 gives a GL_INVALID_VALUE error.
Is that not supported by Linux video drivers? I've heard Linux drivers are finicky, but if you have suggestions, I'm all ears. Maybe defaulting to -norenderbuffers on Linux would help?
From the OpenGL docs: I've tested that the value of GL_MAX_DRAW_BUFFERS is 8 on my system.
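For reference, glDrawBuffers raises GL_INVALID_VALUE when it is asked for more buffers than the driver's GL_MAX_DRAW_BUFFERS allows. A minimal sketch of a guard for that case (the helper name is illustrative, not engine code):

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical guard: glDrawBuffers(n, bufs) raises GL_INVALID_VALUE when n
// exceeds the driver-reported GL_MAX_DRAW_BUFFERS, so clamp the request
// before issuing the call.
static int ClampDrawBuffers(int requested, int driver_max)
{
    return std::min(std::max(requested, 0), driver_max);
}
```

The driver limit would be queried once at startup with glGetIntegerv(GL_MAX_DRAW_BUFFERS, ...) and passed in as driver_max.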
@usernameak well it's not that it's a shitty renderer, it just wasn't designed for interpolation. We really gotta take a look at the lerp code and fix it up. I think most of it is bottlenecking in E_Tick() in e_main.cc, but a lot of the code resides in n_network.cc.
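For anyone following along, the "lerp" in question is plain linear interpolation: the renderer blends each object's last two gametic positions by the fraction of a tic elapsed at render time. A minimal sketch, with names assumed rather than taken from the engine:

```cpp
#include <cassert>

// Linear interpolation between the previous and current gametic state.
// frac is the fraction of a tic elapsed since the last world update (0..1).
static float Lerp(float prev, float cur, float frac)
{
    return prev + (cur - prev) * frac;
}
```

Stutter shows up when frac is computed from an unstable clock, or when frames are rendered with a stale frac, which matches the "wasted frames" symptom described earlier in the thread.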
@Corbachu the reason for this bug is that the NVIDIA Linux driver has looser limits on
Okay, so what we need to do (if you can) is wrap an #ifdef LINUX around that and have it check that the extension limit is in a valid range. You can insert that via R_Main.cc, in RGL_CheckExtensions(), so if it fails that test we can disable the shader program much earlier in the initialisation. That's the best solution IMO. :-) After you handle that, we can begin fixing up the interpolation code. EDGE does this backwards (literally), while other DOOM ports (notably GZD) do forward prediction, so that's something we all need to discuss to prevent framerate issues.
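A sketch of the range test being proposed for RGL_CheckExtensions(); the function name and the 1..32 sanity bounds are assumptions for illustration, not values from the engine:

```cpp
#include <cassert>

// Illustrative sanity test: trust the driver-reported GL_MAX_DRAW_BUFFERS
// only if it lies in a plausible range. If this fails, the caller would
// disable the shader-program path early in initialisation, as suggested.
static bool DrawBufferLimitValid(int reported_max)
{
    return reported_max >= 1 && reported_max <= 32;
}
```

In the engine this check would sit behind the proposed #ifdef LINUX so the Windows path stays untouched.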
@Corbachu I remembered that the Linux Intel driver has a very bad shader preprocessor that almost doesn't work. So... we have to write our own! And yes, add that fucking
I thought gitinfo was added! I remember adding it before... lol. I'll do it when I get off work tonight :D
@Corbachu everything becomes smooth when I do
Yeah, I figured that was the cause. Rachael had me create a lerp2 branch so we can focus on rewriting all of the interpolation code :D
Please build and try this again -- I have made numerous changes to the swap interval code that should fix the stuttering and slowdowns xD Please test on Linux so we can compile and release a 2.1.0-Test3 binary for Linux folk =)
@gameblabla Can this be revisited by some Linux folk? I've made important changes that fixed the SDL refresh timing for the screen, so now there are hardly any stutters unless you are playing a giant map like Frozen Time. Please look into this again so I can close it. =)
@dsdman can you confirm or deny any of these slowdown issues for me? I'm still on the last steps of setting up my VM, but I would like real-OS performance tested. I really appreciate the help! Let me know if you can or can't. Thanks, friend.
@Corbachu I can't confirm nor deny any of these slowdowns, mainly because I still do not have a display in-game (no menus, no HUD, no graphics, no anything). I plan on opening a ticket later when I have time.
@dsdman restart the compile with debug and developer options enabled in the code. Does EDGE generate the debug text file? If so, please paste the entire contents or post it via Pastebin. Please also paste the additional text files (glsl.log and glext.log), as those will help me determine whether it could be a glEnumError or whatnot. I have a sneaking suspicion it has to do with the video code (I_video.cc) not setting the swap interval right, but at least with all three of those logs I can see what might be happening.
@dsdman also run the engine with -norenderbuffers, just in case!
@Corbachu Okay, here it is: Debug.txt: glext.log and edgegl.txt are both empty, but below is a list of all the ifdefs I could find using grep. Which one enables output to this file?
@dsdman There should only be "GLSL.log" and "glext.log". Edgegl.txt turned into a logfile in the newer commits. Try setting r_swapinterval to 0 in edge.cfg and restart. Setting it to 2 is only for Win32, so if it is already at 1, then set it to 0.
Also, run the engine with -oldglchecks, just in case.
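To make the r_swapinterval values above concrete, here is a sketch of the cvar-to-SDL mapping, assuming (not confirmed by the thread) that the Win32-only value 2 means adaptive vsync, which SDL2 expresses as -1; the helper name is illustrative:

```cpp
#include <cassert>

// Map the r_swapinterval cvar to an SDL_GL_SetSwapInterval() argument:
// 0 = immediate presentation, 1 = vsync every frame,
// 2 = assumed adaptive vsync ("late swap tearing"), which SDL encodes as -1.
// Anything else falls back to plain vsync.
static int SwapIntervalFromCvar(int cvar)
{
    switch (cvar)
    {
        case 0:  return 0;
        case 1:  return 1;
        case 2:  return -1;
        default: return 1;
    }
}
```

The returned value would then be handed to SDL_GL_SetSwapInterval(); if that call fails for -1 (adaptive vsync unsupported), retrying with 1 is the usual fallback.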
@Corbachu I ran it with r_swapinterval at both 0 and 1; still no graphics. However, setting it to 0 also seems to crash it (no sound/input with it at 0 in-game; it still has sound/input in the menu that you can't see. At 1 it has sound/input even in-game). Here are the updated logs (with it set to 0):
Hmmm. I'm not sure what the issue can be at this point. You tried -oldglchecks, right? Also try turning off r_bloom, r_fxaa, and r_lens.
@Corbachu Yes, both -oldglchecks and -norenderbuffers are on. I just finished trying permutations of these two options along with changing r_swapinterval, r_bloom, r_fxaa, and r_lens in the config file. The only difference between any of these options seems to be that r_swapinterval at 0 crashes the game.
Hmmm, okay, so r_swapinterval needs an "r" flag to prevent users from changing it (and breaking the game). I started looking into OpenGL debuggers last night but haven't settled on one. Since Win32's renderer works, we would need to settle on one for Linux only. I know something somewhere is calling a glEnumError; it isn't SDL, as the context appears to be created and dumped just fine. So I propose we start looking for a solution as soon as possible. What do you guys think?
@dsdman Do a pull. I have enforced much stricter checks for the RenderBuffers context. Now, if -norenderbuffers is specified, it will prevent RGL_InitRenderBuffers() from even creating the context. Avoid using -oldglchecks unless absolutely necessary. If this doesn't fix the Linux rendering context, I'm moving on to hard OpenGL debugging.
@Corbachu Just did a pull + recompile. There is a compile error (oddly enough, when linking the executable):
Still gives me a black screen (both with and without -norenderbuffers). I guess we'll have to debug it through OpenGL debugging tools, though I recall it not being broken several months ago, so it may be a regression from one of the past changes to the renderer. I'll do some testing to see if I can find the most recent commit that doesn't have this problem.
@dsdman keep me up to date; if/when you find one, let me know how to set it up, since I have Ubuntu in a VM now :-) Hope we can get this ironed out!
And r_fxaa_quality was introduced but I forgot to push it... I'll do that tonight :D
So after applying my pull request (#58) and disabling r_fxaa_quality, 3DGE works again on Linux. However, here's where things become odd. (I'm using the Nouveau driver, btw.) What is going on? lol. Seems like your code is still not quite right.
@gameblabla can you please test master to make sure the fixes I put in work? I rewrote the swapinterval code to be somewhat OS-agnostic and... well, personally, easier to understand. @Corbachu I don't know if I got the new r_swapinterval values or triggers right; to be quite honest, I don't even know if it should trigger based on r_vsync values or not. My "fix" is more about getting things working right now than getting things correct. You'll have to correct them if they're wrong. Sorry. >.<
@dsdman could you also please test and make sure @raa-eruanna's fixes have indeed restored the Linux renderer? Thanks :)
@Corbachu Still won't render on my end. Another interesting thing is that if I revert to commit 19a7657 and apply @gameblabla's (now closed) Linux fix pull request, it still won't render on my end. So maybe this is an issue with the proprietary Linux NVIDIA driver. I will see if I can use the Nouveau (open source) driver and try again with that.
@raa-eruanna Sadly, it does not fix it for me. I compiled the latest git and it still happens. It's hard to describe, but it is basically not in sync with the game world, like I can see previous frames. EDIT: I compared it with an older version of EDGE and it certainly does not feel as smooth as the older version.
Manipulating the swap interval values goes a bit beyond my OpenGL understanding then, unfortunately. If r_vsync is 0, it's supposed to turn it off completely, and there's a statement earlier in the same file that does exactly that - I haven't checked if it's ever even executed, though.
I just switched to the proprietary NVIDIA drivers and it works properly with those too. EDIT: So I tried it on my laptop, and while it does not exhibit the issues I had on Nouveau, it is pretty slow, with or without interpolation.
I just finished testing after resolving #57. It runs butter-smooth on my end, even with r_lerp set to 1.
Great! Performance issues are eliminated then. I'll go ahead and close it =)
I'm sorry to say this, but they're not. It's still very slow on Intel machines with their iGPUs.
Noticed this issue was reopened. Some of you are saying it's smooth, others are not. No more reopening this issue -- just for reference. For continuity and cleanliness' sake, can we please post an all-new issue? @dsdman sorry, my friend! The thread is just too 'noisy' for my ADHD to follow ;-)
Hello,
Since a year ago or so, the Linux version has not been running smoothly.
Even when the fps counter is at 59 FPS, it still stutters like mad.
3DGE was compiled both with aggressive optimisation flags ("-O3 -march=native -mtune=native") and with the defaults.
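For reproducibility, the optimised configuration I tested looked roughly like this (the build directory name and CMake variable usage are assumptions; adjust to the project's actual CMake setup):

```shell
# Optimised build (flags as tested; build dir name assumed)
mkdir -p build && cd build
cmake -DCMAKE_BUILD_TYPE=Release \
      -DCMAKE_CXX_FLAGS="-O3 -march=native -mtune=native" ..
make -j"$(nproc)"
```

The default build was the same invocation without the extra CMAKE_CXX_FLAGS; both exhibited the stutter.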
No matter what you're playing, the engine stutters a lot, even though the fps counter says the game runs fine.
This was on a Ryzen 5 1500X 3.5 Ghz (SMT disabled) with a Geforce 780.
On an older laptop with a Pentium Dual Core 2 Ghz and an Intel GM45 chipset, it runs even worse.
It used to run okay in 2015/early 2016 until interpolation was added.
Even with interpolation disabled, it still runs like crap.
At a high resolution (720p), it freezes after 2-3 seconds of playing the first map.
I had to lower the resolution to 640x480 to make it run without any weird freezes, and even then it was skipping a lot of frames.
I also tried disabling post-rendering effects in the code; it helped a bit but didn't remove the stutter.
Do you suffer from stuttering issues on Windows? It seems not.
Is this caused by poor frame-skipping code? It skipped frames on my older laptop, so there must be something like that.
I thought that maybe it was an issue with system libraries, but I recompiled an older version against the current system libraries and it ran fine.
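One way to see how the counter can read 59 FPS while the game still stutters: an FPS counter averages frame times, but stutter comes from their variance. A small illustration (pure arithmetic, not engine code; the budget value is an assumption):

```cpp
#include <algorithm>
#include <cassert>
#include <numeric>
#include <vector>

// Average FPS over a window of per-frame times (in milliseconds).
static double AverageFps(const std::vector<double>& frame_ms)
{
    double total = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0);
    return 1000.0 * frame_ms.size() / total;
}

// Smooth pacing means no single frame blows the per-frame budget,
// regardless of what the average says.
static bool PacedSmoothly(const std::vector<double>& frame_ms, double budget_ms)
{
    return *std::max_element(frame_ms.begin(), frame_ms.end()) <= budget_ms;
}
```

A window of {10 ms, 10 ms, 30 ms} averages to exactly 60 FPS, yet the 30 ms frame is a visible hitch - which is consistent with the "59 FPS but stutters like mad" symptom.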