
Very slow performance and heavy stuttering on Linux #47

Closed
gameblabla opened this issue Aug 21, 2017 · 45 comments

Comments

@gameblabla
Contributor

gameblabla commented Aug 21, 2017

Hello,
For about a year now, the Linux version has not been running smoothly.
Even when the FPS counter reads 59 FPS, it still stutters like mad.
I compiled 3DGE both with aggressive optimisation flags ("-O3 -march=native -mtune=native") and with the default ones.
No matter what you're playing, the engine stutters a lot, even though the FPS counter says the game runs fine.
This was on a Ryzen 5 1500X 3.5 GHz (SMT disabled) with a GeForce 780.

On an older laptop with a 2 GHz Pentium Dual-Core and an Intel GM45 chipset, it runs even worse.
It used to run okay in 2015/early 2016, until interpolation was added.
Even with interpolation disabled, it still runs like crap.
At a high resolution (720p), it freezes after 2-3 seconds of playing the first map.
I had to lower the resolution to 640x480 to make it run without any weird freezes, and even then
it was skipping a lot of frames.

I also tried disabling post-rendering effects in the code; it helped a bit but didn't remove the stutter.
Do you get stuttering issues on Windows? It seems not.
Is this caused by poor frame-skipping code? It was skipping frames on my older laptop, so there must be something like that going on.

I thought it might be an issue with system libraries, but I recompiled an older version against the current system libraries and it ran fine.

@Corbachu
Contributor

Corbachu commented Sep 27, 2017

@gameblabla - Sorry to reply so late.

I've recently made some slight changes to the interpolation code, so please try again and see if you get a performance increase. As Graf pointed out, E_Display was calling N_SetInterpolater() when it didn't really need to, so I moved it into P_Tick(); so far, on my system, it seems to result in fewer "wasted" frames.
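
For anyone following along, here is a rough sketch of the idea (P_Tick(), E_Display(), and N_SetInterpolater() are the names mentioned above; the function bodies and the other helper names are placeholders, not the actual 3DGE code):

```cpp
// Hedged sketch: the interpolation base is (re)set once per 35 Hz game tic in
// P_Tick(), instead of on every rendered frame in E_Display().
// N_GetInterpolaterFraction() and RGL_RenderScene() are placeholder names.
void P_Tick()
{
    N_SetInterpolater();     // establish the start/end times for this tic, once
    // ... run thinkers, movers, player input, etc.
}

void E_Display()
{
    // No N_SetInterpolater() call here any more -- just ask how far into the
    // current tic we are and render with that fraction.
    float frac = N_GetInterpolaterFraction();   // 0.0 .. 1.0 within the current tic
    RGL_RenderScene(frac);
}
```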

Since then, UsernameAK has made changes to the CMake system to support Linux. I also recommend you switch to the "arranged_source" branch, as that one has the most changes so far. It hasn't hit "main" yet because I'm waiting to see if you Linux users can compile with the source files/directories rearranged slightly.

So, please try that out first, then the changed code, and let me know if you notice any performance increases =)

@usernameak
Contributor

@Corbachu I have the same problems :D

@usernameak
Contributor

usernameak commented Sep 30, 2017

As noted in 3DGE-ISSUES.txt, -norenderbuffers is required on Linux with ATI. But I have Intel HD Graphics O_o

@Corbachu
Contributor

Corbachu commented Oct 3, 2017

Intel HD Graphics... generally EDGE will run like crap on it. Any OpenGL-heavy engine isn't expected to perform well on Intel HD most of the time. I don't have one, so I can't say for sure. The desktop machine, though, is beefy enough... strange...

@usernameak
Contributor

usernameak commented Oct 4, 2017

@Corbachu hmmm... Doom 3 gives 60 FPS without glitches on my Intel HD at maximum graphics quality (I know a guy who has ATI, and it gives him 15 FPS even though the specs are almost the same), but EDGE runs like crap on it.
So... I can say that the EDGE renderer is shit.
I have an i3-4150, HD Graphics 4400 and 6 GB of RAM.

@OrdinaryMagician

i5-6400 and a GTX 680 here, using the proprietary NVIDIA drivers, of course.

I also do notice some stuttering despite it running constantly at 60 FPS.

@usernameak
Contributor

usernameak commented Oct 4, 2017

I've just walked through the source code with GDB and found a problem: https://github.com/3dfxdev/hyper3DGE/blob/3fdb5f080b7517ebd3473d2dad6cee94ed2ee989/src/r_shaderprogram.cc#L145 raises a GL_INVALID_VALUE error.

@Corbachu
Contributor

Corbachu commented Oct 4, 2017

Is that not supported by the Linux video drivers? I've heard Linux drivers are finicky, but if you have suggestions, I'm all ears. Maybe defaulting to -norenderbuffers on Linux would help?

@OrdinaryMagician

From the OpenGL docs:
GL_INVALID_VALUE is generated if colorNumber is greater than or equal to GL_MAX_DRAW_BUFFERS.

I've checked, and the value of GL_MAX_DRAW_BUFFERS is 8 on my system.

@Corbachu
Contributor

Corbachu commented Oct 4, 2017

@usernameak well, it's not that it's a shitty renderer; it just wasn't designed for interpolation. We really have to take a look at the lerp code and fix it up. I think most of the bottleneck is in E_Tick() in e_main.cc, but a lot of the code resides in n_network.cc.

@usernameak
Contributor

@Corbachu the reason for the bug is that the NVIDIA Linux driver has looser limits around glBlitFramebuffer than the other Linux drivers.

@Corbachu
Contributor

Corbachu commented Oct 5, 2017

Okay, so what we need to do (if you can) is wrap an #ifdef LINUX around that and have it check that the extension limit is in a valid range. You can insert that in r_main.cc, in RGL_CheckExtensions(), so if it fails that test we can disable the shader program much earlier in initialisation. That's the best solution IMO. :-)
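
A rough sketch of what that early check could look like (the constant, the cvar name, and the warning call are assumptions on my part, not the actual contents of RGL_CheckExtensions()):

```cpp
// Inside RGL_CheckExtensions(): if the driver can't offer enough draw buffers
// for the post-processing pipeline, fall back before any shader program or
// FBO is ever created.
#ifdef LINUX
{
    GLint max_draw_buffers = 0;
    glGetIntegerv(GL_MAX_DRAW_BUFFERS, &max_draw_buffers);

    if (max_draw_buffers < REQUIRED_DRAW_BUFFERS)   // placeholder constant
    {
        r_renderbuffers = 0;   // hypothetical cvar: behave as if -norenderbuffers was given
        I_Warning("GL_MAX_DRAW_BUFFERS too low (%d); disabling render buffers\n",
                  max_draw_buffers);   // I_Warning() stands in for the engine's logging
    }
}
#endif
```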

After you handle that, we can begin fixing up the interpolation code. EDGE does this backwards (literally), while other Doom ports (notably GZDoom) do forward prediction, so that's something we all need to discuss to prevent framerate issues.
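
To make the backwards/forwards distinction concrete, a toy sketch (not the engine's actual code):

```cpp
// Back-interpolation (what EDGE does): render between the two most recent
// tics, so what you see always lags slightly behind the simulation.
float render_backward(float prev_tic_pos, float curr_tic_pos, float frac)   // frac in [0,1)
{
    return prev_tic_pos + (curr_tic_pos - prev_tic_pos) * frac;
}

// Forward prediction (what GZDoom reportedly does): extrapolate ahead of the
// newest tic using its velocity, and correct when the next real tic arrives.
float render_forward(float curr_tic_pos, float velocity_per_tic, float frac)
{
    return curr_tic_pos + velocity_per_tic * frac;
}
```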

@usernameak
Contributor

usernameak commented Oct 6, 2017

@Corbachu I remembered that the Linux Intel driver has a very bad shader preprocessor that almost doesn't work. So... we have to write our own! And yes, add that fucking gitinfo.h to .gitignore.

@Corbachu
Contributor

Corbachu commented Oct 6, 2017

I thought gitinfo was added! I remember adding it before... lol. I'll do it when I get off work tonight :D

@usernameak
Contributor

@Corbachu everything becomes smooth when I set r_lerp 0

@Corbachu
Contributor

yeah, I figured that was the cause. Rachael had me create a lerp2 branch so we can focus on rewriting all of the interpolation code :D

@Corbachu
Contributor

@usernameak
@gameblabla

Please build and try this again -- I have made numerous changes to the swap interval code that should fix the stuttering and slowdowns xD

Please test on Linux so we can compile and release a 2.1.0-Test3 binary for Linux folk =)

@Corbachu
Contributor

@gameblabla Can this be revisited by some Linux folk? I've made important changes that fix the SDL refresh timing for the screen, so now there are hardly any stutters unless you are playing a giant map like Frozen Time.

Please look into this again so I can close it. =)

@Corbachu
Contributor

@dsdman can you confirm or deny any of these slowdown issues for me? I'm still on the last steps of setting up my VM, but I'd like performance tested on a real OS. I really appreciate the help! Let me know whether you can or can't. Thanks, friend.

@dsdman
Contributor

dsdman commented Jan 22, 2018

@Corbachu I can neither confirm nor deny any of these slowdowns, mainly because I still do not have a display in-game (no menus, no HUD, no graphics, nothing). I plan on opening a ticket later when I have time.
That being said, I turned on debug_fps in the config file and the FPS does not show up in the log, so I'm assuming this option draws the FPS on screen. On a side note, I don't think there would be any slowdowns, as even Doom (2016) plays at 60 FPS (anywhere below 900p) on this machine.

@Corbachu
Contributor

@dsdman restart the compile with debug and developers enabled in the code. Does EDGE generate the debug text file? If so, please paste the entire contents or put it on Pastebin. Please also paste the additional text files (glsl.log and glext.log), as those will help me determine whether it could be a GL enum error or whatnot. I have a sneaking suspicion it has to do with the video code (I_video.cc) not setting the swap interval right, but at least with all three of those logs I can see what might be happening.

@Corbachu
Contributor

@dsdman also run the engine with -norenderbuffers just in case!

@dsdman
Contributor

dsdman commented Jan 22, 2018

@Corbachu Okay, here they are:

Debug.txt, glsl.log, and EDGE2.log (attached)

glext.log and edgegl.txt are both empty, but below is a list of all the ifdefs I could find using grep. Which one enables output to this file?
ifdefs.txt

@OrdinaryMagician

I also have the same black screen problem. Here are my logs (attached):

debug.txt
EDGE2.log
glsl.log

Nothing from glext.log, and I have no edgegl.txt file.

@Corbachu
Contributor

@dsdman There should only be "GLSL.log" and "glext.log". Edgegl.txt turned into a logfile in the newer commits.

Try setting r_swapinterval to 0 in edge.cfg and reboot. Setting it to "2" is only for Win32, so if it is already at 1, then set it to 0.

@Corbachu
Contributor

Also, run the engine with -oldglchecks, just in case.

@dsdman
Contributor

dsdman commented Jan 23, 2018

@Corbachu I ran it with r_swapinterval at both 0 and 1; still no graphics. However, setting it to 0 also seems to crash it (at 0 there's no sound/input in-game, though the menu you can't see still has sound/input; at 1 there's sound/input even in-game). Here are the updated logs (with it set to 0):
edge2.log
debug.txt
glsl.log

@Corbachu
Contributor

Hmmm. I'm not sure what the issue could be at this point. You tried -oldglchecks, right? Also try turning off r_bloom, r_fxaa, and r_lens.

@dsdman
Contributor

dsdman commented Jan 23, 2018

@Corbachu Yes, both -oldglchecks and -norenderbuffers are on. I just finished trying permutations of those two options along with changing r_swapinterval, r_bloom, r_fxaa, and r_lens in the config file. The only difference any of these options makes seems to be that r_swapinterval at 0 crashes the game.

@Corbachu
Contributor

Hmmm, okay, so r_swapbuffers needs an "r" flag to prevent users from changing it (and breaking the game).
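
A generic sketch of that idea (this is not EDGE's actual cvar API; the struct, flag letters, and function names here are purely illustrative):

```cpp
#include <cstring>

// Hypothetical cvar with a flag string; 'r' marks it read-only so console or
// config edits can't change it at runtime and break the renderer.
struct cvar_t
{
    const char *name;
    int         value;
    const char *flags;
};

static cvar_t r_swapinterval = { "r_swapinterval", 1, "r" };

static bool CVAR_SetValue(cvar_t &var, int new_value)
{
    if (std::strchr(var.flags, 'r'))
        return false;          // reject writes to read-only cvars
    var.value = new_value;
    return true;
}
```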

I started looking into OpenGL debuggers last night but haven't settled on one. Since Win32's renderer works, we would need to settle on one for Linux only.

I know something, somewhere, is triggering a GL enum error; it isn't SDL, as the context appears to be created and dumped just fine. So I propose we start looking for a solution as soon as possible. What do you guys think?

@Corbachu
Contributor

@dsdman Do a pull. I have enforced much stricter checks for the RenderBuffers context. Now, if -norenderbuffers is specified, it will prevent RGL_InitRenderBuffers() from even creating the context. Avoid using -oldglchecks unless absolutely necessary. If this doesn't fix the Linux rendering context, I'm moving on to hard OpenGL debugging.
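
For readers following at home, the gist of that guard might look something like this (M_CheckParm() and the flag variable are assumptions, not necessarily what the commit actually does):

```cpp
// Skip all FBO/renderbuffer creation when -norenderbuffers is on the command
// line, instead of creating the context first and disabling it afterwards.
void RGL_InitRenderBuffers()
{
    if (M_CheckParm("-norenderbuffers"))   // M_CheckParm() assumed, Doom-style argv lookup
    {
        renderbuffers_active = false;      // hypothetical flag checked by the renderer
        return;
    }

    // ... normal framebuffer / renderbuffer / shader program setup ...
}
```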

@dsdman
Contributor

dsdman commented Jan 24, 2018

@Corbachu Just did a pull + recompile. There is a compile error (oddly enough when linking the executable):

CMakeFiles/3DGE.dir/src/r_main.cc.o: In function `RGL_Init()': r_main.cc:(.text+0x1cc6): undefined reference to `r_fxaa_quality'
I "fixed" this by commenting out line 760 in r_main.cc: r_fxaa_quality.d = 0

Still gives me a black screen (both with and without -norenderbuffers). I guess we'll have to debug it with OpenGL debugging tools, though I recall it not being broken several months ago, so it may be a regression from one of the past changes to the renderer. I'll do some testing to see if I can find the most recent commit that doesn't have this problem.

@Corbachu
Contributor

@dsdman keep me up to date; if/when you find one, let me know how to set it up, since I have Ubuntu in a VM now :-)

Hope we can get this ironed out!

@Corbachu
Contributor

And r_fxaa_quality was introduced but I forgot to push it... I'll do that tonight :D

@gameblabla
Contributor Author

gameblabla commented Jan 25, 2018

So after applying my pull request (#58) and disabling r_fxaa_quality, 3DGE works again on Linux.
I first tried it with -norenderbuffers and r_lerp disabled, and the game would stutter.
I then tried it again without -norenderbuffers and with all the options enabled (including r_lerp at 1), and the game was pretty smooth!
There was still a very small amount of stutter.

However, here's where things get odd (I'm using the Nouveau driver, by the way).
I set my GPU to maximum performance, still without -norenderbuffers, and now the game would stutter like mad!
I then set my GPU to its lowest clock and the game ran pretty smoothly again.

What is going on? lol. It seems like your code is still not quite right.
3DGE used to work OK before interpolation was added to the codebase, and it has behaved like this ever since.

@madame-rachelle
Contributor

@gameblabla can you please test master to make sure that the fixes I put in work? I rewrote the swapinterval code to be somewhat OS-agnostic and... well, personally, easier to understand.

@Corbachu I don't know if I got the new r_swapinterval values or triggers right; to be quite honest, I don't even know whether it should trigger based on r_vsync values or not. My "fix" is more about getting things working right now than getting things correct. You'll have to correct them if they're wrong. Sorry. >.<
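
For what it's worth, a minimal sketch of the OS-agnostic approach through SDL (the cvar values and the fallback behaviour are my assumptions, not necessarily what the rewrite does):

```cpp
#include <SDL.h>

// Map a vsync cvar to SDL's swap interval once, instead of per-platform
// WGL/GLX calls: 0 = off, 1 = vsync, 2 = adaptive ("late swap tearing").
static void SetSwapInterval(int r_vsync)
{
    int interval = 0;
    if (r_vsync == 1)
        interval = 1;
    else if (r_vsync == 2)
        interval = -1;   // adaptive vsync, where the driver supports it

    if (SDL_GL_SetSwapInterval(interval) != 0)
        SDL_GL_SetSwapInterval(r_vsync ? 1 : 0);   // fall back if adaptive is unavailable
}
```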

@Corbachu
Contributor

@dsdman could you also please test and make sure @raa-eruanna's fixes have indeed restored the Linux renderer? Thanks :)

@dsdman
Contributor

dsdman commented Jan 25, 2018

@Corbachu Still won't render on my end. Another interesting thing is that if I revert to commit 19a7657 and apply @gameblabla's (now closed) Linux fix pull request, it still won't render on my end. So maybe this is an issue with the proprietary NVIDIA Linux driver. I'll see if I can use the Nouveau (open-source) driver and try again with that.

@gameblabla
Contributor Author

gameblabla commented Jan 26, 2018

@raa-eruanna Sadly, it does not fix it for me. I compiled the latest git and it still happens. It's hard to describe, but it is basically not in sync with the game world, as if I can see previous frames.
However, it's a little better now with r_lerp disabled. It still does not feel quite right, as if vsync is not turned on (even though vsync is set to 1).

EDIT: I compared it with an older version of EDGE and it certainly does not feel as smooth as the older version.

@madame-rachelle
Contributor

Manipulating the swap interval values goes a bit beyond my OpenGL understanding, then, unfortunately. If r_vsync is 0, it's supposed to turn it off completely, and there's a statement earlier in the same file that does exactly that; I haven't checked whether it's ever even executed, though.

@gameblabla
Contributor Author

gameblabla commented Jan 26, 2018

I just switched to the proprietary NVIDIA drivers and it works properly with them too.
In fact, it does not have the sync issues it has with Nouveau (when using interpolation mode).
I thought it was maybe due to the lack of automatic reclocking, but even when set to performance mode it works fine.
Kinda sad, because I didn't want to use the proprietary drivers. Now I need to try this on Intel chips.

EDIT: So I tried it on my laptop, and while it does not exhibit the issues I had on Nouveau, it is pretty slow, with or without interpolation.
I also tried 3DGE on Weston (a Wayland compositor) with Nouveau and it's actually smoother than on X11; it plays with little stutter.
EDIT2: I also just realized the stuttering issues also happen with the old version... Well, it looks like it's not going to be easy to fix :/

@dsdman
Contributor

dsdman commented Jan 26, 2018

I just finished testing after resolving #57. It runs butter-smooth on my end, even with r_lerp set to 1.

@Corbachu
Contributor

Great! The performance issues are eliminated, then. I'll go ahead and close this =)

@gameblabla
Contributor Author

I'm sorry to say this, but they're not. It's still very slow on Intel machines with their iGPUs.
It's definitely not smooth at all on them.

dsdman reopened this Feb 4, 2018
@Corbachu
Contributor

Corbachu commented Feb 4, 2018

Noticed this issue was reopened. Some of you are saying it's smooth, others are not. No more reopening this issue -- just for reference. For continuity and cleanliness' sake, can we please post an all-new issue? @dsdman sorry, my friend! The thread is just too 'noisy' for my ADHD to follow ;-)

Corbachu closed this as completed Feb 4, 2018