Don't always enable GL_FRAMEBUFFER_SRGB #805
Comments
tomaka added the T-enhancement and T-wrong labels on May 5, 2015
My idea was to add a flag on programs that tells glium whether the program outputs sRGB or linear RGB. Glium would then enable or disable GL_FRAMEBUFFER_SRGB accordingly.

Actually, if I'm not mistaken, it would look exactly the same.
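The flag idea above can be sketched as a small decision function. Everything here is hypothetical (the names `TargetCaps`, `framebuffer_srgb_enabled`, and the field `srgb_capable` are not glium's actual API); it only illustrates when the automatic conversion would be turned on or off.

```rust
// Hypothetical sketch of the proposed per-program flag, not glium's real API:
// the program declares whether its fragment shader already outputs
// sRGB-encoded values, and the library decides whether GL_FRAMEBUFFER_SRGB
// should be enabled for a given draw call.

/// Whether the bound framebuffer can do hardware sRGB encoding.
struct TargetCaps {
    srgb_capable: bool,
}

/// Decide whether to enable GL_FRAMEBUFFER_SRGB before drawing.
/// If the program already outputs sRGB (e.g. a post-process pass like FXAA
/// whose output must not be re-encoded), the conversion stays off.
fn framebuffer_srgb_enabled(program_outputs_srgb: bool, target: &TargetCaps) -> bool {
    !program_outputs_srgb && target.srgb_capable
}

fn main() {
    let target = TargetCaps { srgb_capable: true };
    // An FXAA pass that outputs sRGB itself: leave the conversion off.
    assert!(!framebuffer_srgb_enabled(true, &target));
    // A program outputting linear RGB: let the hardware encode to sRGB.
    assert!(framebuffer_srgb_enabled(false, &target));
    println!("ok");
}
```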
Another option for you is to simply render the intermediate image into an …
I tried this, and unfortunately it doesn't seem to work: the output is exactly the same. I could be doing something wrong, though.

That sounds like it would be fine. Would this be an extra parameter to …?

Regarding sRGB not being supported everywhere: I was working on a game a while back that worked fine on my system, but on someone else's system the output was darker than it should have been. I think the problem was that their system didn't have sRGB-capable framebuffers, so GL_FRAMEBUFFER_SRGB didn't do anything. I ended up having to do the conversion myself in the shaders, with code like …
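The shader code referred to above is truncated in the thread, but the conversion it describes is the standard sRGB transfer function. A sketch of that math in Rust (in the actual game it would live in the fragment shader, applied per channel before writing the output):

```rust
// Standard sRGB transfer functions (IEC 61966-2-1), sketched in Rust.
// When GL_FRAMEBUFFER_SRGB is unavailable, this encoding has to be done
// manually in the fragment shader on each color channel.

/// Encode a linear-light channel value (0.0..=1.0) to sRGB.
fn linear_to_srgb(c: f32) -> f32 {
    if c <= 0.0031308 {
        12.92 * c
    } else {
        1.055 * c.powf(1.0 / 2.4) - 0.055
    }
}

/// Decode an sRGB-encoded channel value back to linear light.
fn srgb_to_linear(c: f32) -> f32 {
    if c <= 0.04045 {
        c / 12.92
    } else {
        ((c + 0.055) / 1.055).powf(2.4)
    }
}

fn main() {
    // Mid-gray in linear light encodes to roughly 0.735 in sRGB, which is
    // why skipping the conversion makes the output look too dark.
    let encoded = linear_to_srgb(0.5);
    assert!((encoded - 0.7354).abs() < 1e-3);
    // The two functions are inverses of each other.
    assert!((srgb_to_linear(encoded) - 0.5).abs() < 1e-6);
    println!("ok");
}
```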
Actually, the flag already exists; I just don't know where to put it in the program creation API.
nstoddard commented May 5, 2015
I'm working on an application that uses FXAA. The output didn't look quite right, and it turns out that's because glium automatically enables GL_FRAMEBUFFER_SRGB. There doesn't seem to be any way to disable this; I had to modify the glium source code to turn it off. There needs to be a way to disable it, or maybe it should be disabled by default with a draw parameter to enable it.

There are also systems that don't support GL_FRAMEBUFFER_SRGB, which causes output on those systems to look wrong. Is there a way for glium to detect when it's not supported and emulate it with a shader instead?
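Detecting support essentially means checking the GL version and extension list before relying on GL_FRAMEBUFFER_SRGB (it is core in OpenGL 3.0 and otherwise gated behind the ARB/EXT framebuffer_sRGB extensions). A hedged sketch of that check, written against a plain list of extension strings rather than a live GL context so it stands alone:

```rust
// Sketch of a support check for hardware sRGB framebuffer encoding.
// In a real application the version and extension list would come from
// the GL context (e.g. via glGetStringi(GL_EXTENSIONS, i)); they are
// passed in here so the logic is testable on its own.

/// Returns true if GL_FRAMEBUFFER_SRGB is usable: core since OpenGL 3.0,
/// otherwise provided by the ARB or EXT framebuffer_sRGB extension.
fn supports_framebuffer_srgb(gl_version: (u32, u32), extensions: &[&str]) -> bool {
    gl_version >= (3, 0)
        || extensions.contains(&"GL_ARB_framebuffer_sRGB")
        || extensions.contains(&"GL_EXT_framebuffer_sRGB")
}

fn main() {
    // Old context without the extension: fall back to doing the
    // conversion in the fragment shader.
    assert!(!supports_framebuffer_srgb((2, 1), &["GL_EXT_texture_sRGB"]));
    // Extension present: hardware encoding can be used.
    assert!(supports_framebuffer_srgb((2, 1), &["GL_EXT_framebuffer_sRGB"]));
    // Modern context: always available.
    assert!(supports_framebuffer_srgb((3, 3), &[]));
    println!("ok");
}
```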