Shader 'depthtile1' fails to compile. #919
Hmm, try running the game with -set r_glProfile compat and see if it works then?
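For reference, setting that cvar from the command line would look something like this (a sketch: the `daemon` binary name is an assumption; only the cvar comes from the suggestion above):

```sh
# Launch the engine with the compatibility GL profile forced.
# "daemon" is an assumed binary name, not confirmed by the thread.
./daemon -set r_glProfile compat
```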
Nope, …
Hmm, something is really weird: when you use the core profile, you use GLSL version 150, but when you use the compatibility profile, you use GLSL version 330...
The shaders you pasted lack some lines after the definitions of minDepth and avgDepth, but possibly that was caused by copy-and-pasting the shader into the issue tracker. If I add those missing lines from the Unvanquished source code, both the core and compat shaders successfully pass the Khronos GLSL validator.
Well, should I redownload the game?
Redownloaded, still the same.
I'm seeing the same thing using a GeForce GT 240 on F23 WS using the NVidia 340xx drivers: …
GeForce 210 1GB here, Pentium D 820, 3 GB of RAM. Maybe related to the Nvidia driver?
I'm going to assume that is the case, because of the error: …
This is perfectly fine. Before OpenGL 3.3, GLSL versions followed their own scheme independent of OpenGL's version number (despite being released alongside a given OpenGL version). There is a nice page on Wikipedia that documents this. OpenGL "core" and "compatibility" profiles were introduced with OpenGL 3.2 (GLSL version 150), and OpenGL 3.3 changes the GLSL shader version to "330" to match the GL version.
How do I turn on more debugging output from (what I presume to be) NVidia's shader compiler, so that we might be able to pinpoint the issue and possibly open a bug report with NVidia with a solid repro? Either way, it might be worth capturing this specific info here for future reference.
You are unlikely to get any more debugging output than what was displayed in the console here.
You could try running with …
Another user (LDAsh[|]-) on IRC reported the same issue with the attached log. |
I got the same issue, on Linux as well as on Windows, with or without the Nvidia drivers. Has it been fixed since?
Like... someone? I'm still facing that issue!
Bump, does someone have a workaround for this? It is very frustrating. 😞
gimhael overhauled a lot of the shaders. Mind taking a stab to see if the issue is fixed for you? |
@DolceTriade Unfortunately, I gave the Nvidia PC on which this bug appears to my brother. I'll ask him to check whether it works for him, but I doubt he'll actually try.
Maybe @Incognito4nonymous can test, if it really was the very same error.
I'll try as soon as I can. Sorry for the response time; I've been extremely busy lately.
Huhhh, mmkay, I can try this, but the download being pretty heavy for my bandwidth, I'd need a 75% guarantee that it works beforehand.
As of today (I just downloaded the 0.50.0 torrent) it isn't working; Nvidia's compiler throws the same error. This happened with Ubuntu 16.10, a 310M and Nvidia's 340 drivers.
Thanks for letting us know. We'll continue to try to get the latest version of Unv out with the fix. |
The issue seems to concern Optimus GPUs only; I have tried with my other computer and a brand new Nvidia GT 210, and it works fine.
One issue seems to be connected to this one: …
This happens while attempting to load the game with an Intel Core i5 on an Optimus computer.
That bug seems unrelated to this error. Furthermore, that bug occurs when you're using Mesa, not Nvidia: …
Yo yo yo! So, anything new around here, out of curiosity? Mi GPU es su GPU if it's needed for testing.
Thanks, we'll try to generate a small build to facilitate testing. Currently, we're trying to narrow down why this bug appears. From our side, everything seems correct, and the thing the error complains about doesn't even appear in our shaders, which makes it tough to figure out why it's failing.
Okay, I'll wait for the build with great anticipation for further testing then. 😉
@Incognito4nonymous I edited your comment to make it look better; you can use this syntax to prevent text from being interpreted (one three-backquote line before and one after your text):
which produces:
instead of:
Otherwise, since the …
Ok thanks, I'll try to remember that when I post logs.
Unfortunately, the PC the bug was occurring on died a few months ago, so there's no way I can check for it anymore. I'm leaving this open since many other people seem to have reported the same issue.
Okay, in my opinion the best lead would be to find where the regression is, since alpha 42 works on all computers.
Yeah, bisecting between Alpha 42 and Alpha 50 would help. That would require someone to be able to compile the game on the affected hardware, though...
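Such a bisection could be sketched with git, assuming the two releases are tagged in the repository (the tag names `alpha-42` and `alpha-50` below are hypothetical) and that the game can be built at each step:

```sh
# Hypothetical tag names; adjust to the repository's actual release tags.
git bisect start
git bisect bad alpha-50     # shader 'depthtile1' fails to compile here
git bisect good alpha-42    # known-good release
# For each commit git checks out: build, run, and report the result:
#   git bisect good         # shader compiled
#   git bisect bad          # shader failed
git bisect reset            # when done
```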
@Incognito4nonymous do you know with which version it stops working? Alpha 43?
It looks like …
@Incognito4nonymous can you try building the daemon engine's master branch and running it this way:
Just rebuilding the engine would be enough to test that. |
Roger that; I'll test whether alpha 48 works on my computer and will try to compile the daemon engine as soon as I can.
@illwieckz …
I managed to build a hardware setup that reproduces the bug. The bug is still present in the for-0.51 branch. The hardware is an nVidia GeForce 210 and the driver is nvidia-340 (earlier drivers do not support this hardware). Note that the hardware is advertised as OpenGL 3.2 on the box, but the driver claims OpenGL 3.3 support (same as …)
That's good news, although are you able to reproduce the DeformVertexes bug?
Your …
So it's not expected to be reproducible on NVidia. If you get that … Also, reading this:
It looks like your IGP driver is missing some OpenGL extension. Wikipedia notes that this IGP only supports up to OpenGL 2.1. @Incognito4nonymous can you open an issue for your specific …
@Incognito4nonymous: The DeformVertexes shader on git master does not use GL_EXT_texture_integer any more; are you using an older release of Unvanquished? @illwieckz: Could you test whether the shader compiles after disabling the GL_EXT_texture_gather extension (/r_ext_texture_gather 0)?
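As a sketch, one way to run that test from the command line (the `daemon` binary name is an assumption; `glxinfo` is a standard Mesa utility used here only to confirm what the driver advertises):

```sh
# Optionally check which texture_gather extensions the driver advertises.
glxinfo | grep -o 'GL_[A-Za-z]*_texture_gather'
# Launch with the engine's texture_gather usage disabled via the cvar
# named above; "daemon" is an assumed binary name.
./daemon -set r_ext_texture_gather 0
```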
@gimhael I'll do that (in a few days)!
@gimhael it's not … So, doing this:
makes the game start and run on this specific hardware. |
@gimhael I'm currently on 0.50.0; I downloaded it in September 2016. @illwieckz …
@Incognito4nonymous I'm not sure the trick works on 0.50.0. By the way, we have to release a new engine version, because too many bugs of 0.50.0 are now fixed.
So... what's your suggestion? Should I wait for the release or download the latest commit?
It's probably better for you to build the latest master commit; the probability of us not releasing something between Christmas and New Year's Eve is very high 😛
See DaemonEngine/Daemon#63 for a pull request that aims to fix the bug (it disables the related extension on this broken driver for this hardware).
This was fixed a long time ago in the repository; the fix is shipped with the latest release: https://www.unvanquished.net/?p=1278