SDL2: adjust software-gl check to allow hardware accelerated Mesa #825
aaaaaaand I was not observant enough. I just realized that the SDL2 branch now uses the XDG-standard $HOME/.local/share/ location for its config files, so obviously setting force_software_renderer in $HOME/.chocolate-doom/ did nothing. I can now say that setting the option in the proper config file does make a difference: the game opens just as fast as it did with SDL1. Still not sure why it thinks I have software GL, though. I tried using the following Mesa options:
The check just looks for "Mesa" -- I've argued against it because Mesa is a perfectly valid driver in many hardware-accelerated configurations (namely, ones using only free software drivers).
I've changed the topic of this issue to suggest relaxing the assumption that Mesa means software GL. I've since seen some Linux folks notice this too, and as @chungy indicates, he's already mentioned that there are legitimate hardware-accelerated scenarios with Mesa GL. The issue goes away entirely if one sets force_software_renderer as mentioned, but I don't see any reason for it not to work in GL mode either... Any other consensus? EDIT: @chungy, I don't think I can remove the OpenBSD tag; can you strip it out? I don't think we're OpenBSD-specific on this one anymore.. :-)
What we need is a counter-proposal to the Mesa check. I don't think we would want to just remove it, since the message is useful to many people (and merely misleading to others), so I'd prefer it if we had a better check to suggest instead.
Why not look at whether direct rendering is enabled or not? If not, it's using Mesa software rendering. If yes, some hardware acceleration is available.
I think the issue is that we are relying on what SDL exposes. At the moment we use
We might be able to use SDL_GL_GetAttribute (SDL_GL_ACCELERATED_VISUAL,...
This returns 1 for me on OS X with direct rendering, but also in a Linux VM with no 3D acceleration enabled (or 2D video acceleration). glxinfo in that VM returns
Nowadays, VMs are able to do miraculous things, e.g. share the host's 3D acceleration with the guest. Are you sure that this isn't the case here?
I'm not certain it isn't, but I had explicitly not ticked the '3D acceleration' option in VirtualBox, and I'm running LXDE in the VM, not something like GNOME Shell with a compositor. So as best as I can tell, it shouldn't. Argh. I have a PowerPC here, I might see what that does. Edit: oh, I could try the VNC trick
...direct rendering: yes, even in a VNC session.
Can we take a step back? Does anyone know of a concrete, real situation where the SDL2 branch does not perform well, accelerated or not? Even in a VM, or via VNC, or on my Pi, I've not had any performance problems. So perhaps we don't even need the warning.
I noticed this also. Note that on modern X with Gallium, a software rasterizer will be identified as something such as

I believe "direct rendering" means that commands and such can be sent directly to the GPU and that memory mapping can be used, rather than sending everything over the X connection (which is a bit slow) via GLX. So you can have hardware acceleration even when there is no direct rendering; it just has more overhead because everything has to travel through a pipe. On my systems running Debian sid, you cannot even force indirect rendering anymore (via
I've pushed two test branches
I would appreciate it if anyone interested tried the second and reported what they get, along with what kind of performance they see from the branch. I'm particularly interested in any situation where choco performs poorly.
With the first branch the warning goes away. However, if I force software rendering by setting

My guess would be that SDL2 either checks a string against a non-matching value, or just does not care at all and returns true whenever OpenGL is being used (at least on Linux), even if it is a software rasterizer.
Seems to run decently here, no delay or anything for game start:
Thanks, that's very useful feedback. It seems this check is useless.
I think you are right. The documentation on these attributes is very sparse, but I think it's possible that SDL doesn't initialise it to anything useful in the read-only case; it's really designed for writing to, to signal to SDL that you want a particular behaviour.
Cheers!
This is what is printed when I run the first branch:
When run from the second branch, both DEBUG messages disappear. Both branches perform at exactly 577 fps with

The current master branch performs at about 470 fps. Should have tried with fullscreen, maybe...
Fullscreen mode: 405 fps with SDL2, 165 fps with master.
This check was designed to warn users that, if they did not have hardware acceleration, performance might be poor, and to suggest toggling force_software_renderer. However, the check is not reliable: it can't determine whether hardware acceleration is taking place on Linux, as Mesa front-ends both hardware and software implementations. We explored alternatives (checking SDL_GL_ACCELERATED_VISUAL), but these proved similarly unreliable. On Linux, GLX offers glXIsDirect, but this is of no use where GLX is not available, including (I think) the Linux framebuffer and Wayland. Rather than continue to mislead people, delete the test and the warning. Fixes chocolate-doom#825.
Here is the only other difference I have really noticed on the SDL2 branch vs the SDL1 branch. The problem isn't really a problem :-)
When launching the game, there is an extra 1-2 second delay from when the program output appears in a terminal window to when the window starts to draw/go fullscreen. After this, the game performs as normal with no noticeable difference whatsoever 👍 (aside from #824, of course!)
I do notice I get this message:
...which seems strange as I certainly don't have software GL:
Setting the recommended option seems to make no difference; gameplay runs well regardless of the setting.