Incorrect brightness when gl_FragColor is semi-transparent. #5810
I can add this change to this PR here really easily: #5805
Just a side note: if you use light values outside of [0,1], shouldn't you be using a floating point render target and a HDR/tone-mapping post-processing pass? Otherwise colors of (8,8,8) and (1,1,1) will be the same white, which is not "correct"?
@crobi You are correct that HDR would solve the issue as well, but an FP buffer and a separate pass is a lot more processing and memory (at least 4x more memory for the frame buffer), which is costly on mobile. Such a change would still be fully compatible with HDR -- just set an FP frame buffer as the target -- but it also produces better results for semi-transparent materials in LDR at no real extra cost.
I can see that your approach solves your problem. I just found it strange to use light intensities of over 1.0 in LDR rendering. Is that a common thing to do?
If three.js switches to using pre-multiplied alpha in general, code that uses custom shaders (partially constructed from the shader library) or custom blend modes might break.
Yeah, I can see this having side effects with other blending modes too?
@bhouston A valid premultiplied form must have the RGB components less than the alpha component, and […]
Source? I'm not a WebGL expert, but I haven't seen this rule anywhere. The only place this could have an influence is if you export or import the entire context (as per here) - in this case the RGBA color could be de-multiplied and you'd get RGB component values of over 1. I can see this being problematic, but I haven't seen an actual rule for this.
@crobi Hmmm... I can't seem to find an "official" source, either. However, I believe I remember seeing @greggman provide a demo several years ago which showed that you can get unexpected results when blending, if a premultiplied form is expected, and the RGB components are greater than the alpha component. In any event, this fiddle appears to show the same issue. If you change the clear color from […]
Is that really unexpected though? A premultiplied […] That being said, I still believe it is not a good idea to use HDR colors (with RGB components not limited to 1.0) with LDR buffers (where color components cannot exceed 1.0) - the presented approach just handles one special case when a color component is larger than 1.0 but smaller than 1/alpha.
Any values "out of range" are undefined. Different browsers may have different results. Premultiplied of (1,0,0,0.1) is an out-of-range value if the context was created with […]
I can see the problems, I just don't see this defined in any specification. So I'm wondering whether this is an error/omission in the WebGL spec, me being unable to find the part where it is defined, or just your conjecture? This is my interpretation: […]
Just to be clear: I didn't propose using premultiplied alpha frame buffers […]
@bhouston wrote:
That is not quite true. In the general case, if the drawing buffer has an alpha channel not equal to 1, your proposed blending formula is the "mathematically correct" formula only if the drawing buffer and the shader output are both premultiplied.
three.js supports a semi-transparent drawing buffer. Perhaps Clara.io is setting […]
@WestLangley wrote:
I'm not so sure about that.
Clara.io is doing neither. To understand why my proposal works, have a look at the standard blend function used:
This is what I am proposing to change it to:
Notice there is a single change in the blendSrc function. And instead of "SrcAlphaFactor" I am proposing doing that bit in the shader:
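The code snippets this comment refers to did not survive in this copy; based on the surrounding description (a single change in the source blend factor, with the alpha multiply moved into the shader), the change was presumably along these lines -- a sketch, not the exact original:

```javascript
// Standard NormalBlending (straight alpha), as set up by WebGLRenderer:
gl.blendEquation( gl.FUNC_ADD );
gl.blendFunc( gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA );

// Proposed change -- only the source factor differs:
gl.blendFunc( gl.ONE, gl.ONE_MINUS_SRC_ALPHA );

// ...with the SrcAlphaFactor multiply done in the fragment shader instead:
// gl_FragColor = vec4( gl_FragColor.rgb * gl_FragColor.a, gl_FragColor.a );
```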
Thus this does not change how the background is handled in any way as compared to what was before. All I am doing is changing where the SrcAlphaFactor multiply is done, to avoid an unnecessary clamp. That is it. Let me stress it is mathematically equivalent to what was being done previously, with the exception that gl_FragColor is no longer clamped to [0-1] prior to being multiplied by its alpha. Now, whether removing the artifacts the current approach creates is worth changing the blend mode used by the lit materials (MeshLambertMaterial, MeshPhongMaterial) is an open question. (BTW I do not propose changing the default blend mode on all materials, as it isn't necessary and it could cause problems with custom shaders.)
For clarity purposes, I am referencing the default blend mode specified in Material here: https://github.com/mrdoob/three.js/blob/master/src/materials/Material.js#L22 Which I understand isn't used directly because BlendMode is set to Normal. But those lines are equivalent to this line here in WebGLRenderer that sets the blending mode explicitly when it is Normal:
https://github.com/mrdoob/three.js/blob/master/src/renderers/WebGLRenderer.js#L5681
@bhouston I hope you will take the time to understand what I have written, because what I am telling you is true. The three.js […] This is why […] Your proposed blending formula is the correct formula only if the drawing buffer and the shader output are both premultiplied.
@WestLangley wrote:
Yes! I was trying to replicate NormalBlending mode while avoiding the clamp. So yes, it has the same limitations as NormalBlending mode. :) I was not making a claim that this change is correct when compared to all blending modes, rather just Normal. Of course, if you want a blending mode different from NormalBlending, you need to change things -- it is of course impossible to replicate multiple distinct blending modes with a single equation. :) So I think we are in agreement. If @greggman's concerns are true and the value (0.8, 0.8, 0.8, 0.1) in the back buffer is a real issue, then this isn't really possible. If it wasn't a concern, I would add a flag to materials called […]
@WestLangley would you support the addition to some materials, but probably not all, of a […] flag?
If you want to output from your shader so-called "valid" colors in premultiplied form, the way to do that is to create your custom shader material (perhaps by extending Phong), and set the material's blending mode to be the custom one you suggested. On a related note, I am curious as to how you would answer the following questions. I am not sure there is an answer. What color is represented by […]
@WestLangley asked:
They were never invalid. Just because a color, premultiplied or not premultiplied, is not within LDR [0-1] doesn't make it invalid. @bhouston asked:
@WestLangley replied:
So that is a no? :( I thought it would be a useful addition. Oh well. I can do it that way too. @WestLangley asked:
The first color (1, 1, 1, 0.1) is the premultiplied version of (10, 10, 10, 0.1) -- one just has to solve the equation (ra, ga, ba, a) for (r, g, b, a), which is straightforward. Solve ra = 1, where a = 0.1, for r; thus r = 1/0.1, r = 10. It is a color outside of the LDR range, but when one considers transparency it is fully within the LDR range of [0-1]. My method preserves all colors that end up within the LDR range when incorporating transparency, whereas without that change, transparent colors are unnecessarily clipped. The second color you show, (1, 1, 1, 0), is not a valid premultiplied color because there is no way to solve the equation (ra, ga, ba, a), because a is 0. Solving ra = 1, where a = 0, for r leads to r = 1/0, which is undefined. Does this make sense?
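The algebra above is easy to check numerically; a small sketch (the helper name is mine):

```javascript
// Recover a straight (non-premultiplied) color from a premultiplied one
// by solving (ra, ga, ba, a) for (r, g, b): divide each channel by alpha.
function unpremultiply(color) {
  const [r, g, b, a] = color;
  return [r / a, g / a, b / a, a];
}

console.log(unpremultiply([1, 1, 1, 0.1])); // [10, 10, 10, 0.1]
console.log(unpremultiply([1, 1, 1, 0]));   // alpha 0: channels become Infinity, undefined as a color
```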
By "valid" I meant that the RGB components in a premultiplied representation were less than the alpha component. Shaders that output "valid" representations are fine, as long as the blending function is appropriately set. I am still not sure about the consequences of blending premultiplied representations where the RGB components are greater than the alpha component. Consequently, I personally would not do it. I would like, at some point, to be able to respond more definitively to @crobi's comments.
@WestLangley wrote:
The values are just HDR premultiplied alpha, rather than LDR. Premultiplied HDR is used all over the place in the visual effects industry. It is a valid representation and is a straightforward extension of LDR premultiplied alpha.
Well, with a non-FP framebuffer, you will run into clipping issues with HDR premultiplied alpha, but that is about it. Other than that, there should be no side effects as compared to LDR premultiplied alpha (where RGBA are all within the range [0-1]). BTW I speak from experience: I wrote two HDR renderers that are widely used in the visual effects industry, and are of course used to create images that are incorporated into really complex compositing situations.
@bhouston Well, then maybe the reason I can't find a definitive problem with it is because there is no problem with it when used appropriately -- which is what you are saying... Although the fiddle I posted is still troubling me. |
@WestLangley Your example, as @crobi wrote, is behaving as expected. In a premultiplied alpha situation, the RGB of the foreground is added to the background. The white background (1, 1, 1) is multiplied by (1 - 0.1) to become (0.9, 0.9, 0.9) and then the foreground red is added to get (1.9, 0.9, 0.9), but because the screen is LDR, you get (1.0, 0.9, 0.9). The black text starts with (0, 0, 0), which is multiplied by (1 - 0.1) to become (0, 0, 0), and then the foreground red is added to get (1.0, 0, 0). This is the expected result.
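This arithmetic can be reproduced in a few lines; the `over` helper is mine, modeling ONE / ONE_MINUS_SRC_ALPHA blending into a clamped LDR buffer:

```javascript
// Premultiplied "over" composite into an LDR buffer:
// dst' = clamp( srcPremult + dst * (1 - srcAlpha) )
function over(srcPremult, srcAlpha, dst) {
  return dst.map((d, i) => Math.min(1, srcPremult[i] + d * (1 - srcAlpha)));
}

const red = [1, 0, 0]; // premultiplied (1, 0, 0, 0.1) from the fiddle
const a = 0.1;

console.log(over(red, a, [1, 1, 1])); // white background -> [1, 0.9, 0.9]
console.log(over(red, a, [0, 0, 0])); // black text       -> [1, 0, 0]
```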
@WestLangley If you come to the conclusion that HDR premultiplied alpha is not incorrect, it would still be very cool to have an option on shaders to output premultiplied alpha results. I am sure there are applications outside of just my issue with unnecessary transparency clamping artifacts in LDR.
I don't know why I'm responding and I probably don't know all the issues but ... It does seem like it would be nice if there was an option to have three.js render premultiplied alpha values, given that's what the browser wants (in general) and matches the browser's use of […] In other words, if I want to render something over the page, like this, doing the correct thing should be easier. That seems pretty easy: just like bhouston suggested, add a […] flag.
That seems a lot nicer than requiring users to write all custom shaders if they want to use premultiplied alpha.
@greggman If @bhouston wants to output premultiplied values from his custom shaders, there is no problem doing that either. All he has to do is set the material blending properties appropriately. In fact, we could add a new blending option: […] If, in addition, @bhouston wants to output premultiplied values from some or all of the pre-defined three.js shaders (e.g., […]). If we allow pre-defined shaders to output premultiplied values, it will impact post-processing logic, render-to-texture, and gamma correction, for example, but we will have to cross that bridge when we get to it.
Random remarks:
Are there really applications that use non-premultiplied drawing buffers? From what I've seen, we are talking about the following two options: […]
In both cases, a single full screen quad with saturated red color and 10% opacity on top of a black background would result in the drawing buffer containing (0.1,0,0) as RGB values. This is what you want the user to see - a very dark red screen. Of course, you would see a difference if you use alpha values of <1, but disable blending. This could break some applications.
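Assuming the two options are (1) a straight-alpha source blended with SRC_ALPHA / ONE_MINUS_SRC_ALPHA and (2) a premultiplied source blended with ONE / ONE_MINUS_SRC_ALPHA, the equivalence described above is easy to verify (helper names are mine):

```javascript
// Option 1: straight-alpha source, hardware applies the SRC_ALPHA factor.
function blendStraight(src, a, dst) {
  return dst.map((d, i) => src[i] * a + d * (1 - a));
}

// Option 2: shader premultiplies, hardware uses a source factor of ONE.
function blendPremult(src, a, dst) {
  const premult = src.map(c => c * a); // done in the fragment shader
  return dst.map((d, i) => premult[i] + d * (1 - a));
}

// Saturated red at 10% opacity over black lands at (0.1, 0, 0) either way:
console.log(blendStraight([1, 0, 0], 0.1, [0, 0, 0])); // [0.1, 0, 0]
console.log(blendPremult([1, 0, 0], 0.1, [0, 0, 0]));  // [0.1, 0, 0]
```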
Hm, how?
Shaders (and presumably GPUs in general) use IEEE 754 data types, so 1/0 is +infinity. I would assume that if you de-multiply (1,1,1,0), you get (+inf, +inf, +inf, 0), which gets clamped to (1,1,1,0) when written to a LDR buffer. I agree though that the de-multiplication should have been described in more detail in the spec.
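This behavior is easy to confirm in JavaScript, which also follows IEEE 754 arithmetic:

```javascript
// De-multiplying (1, 1, 1, 0) divides by zero; IEEE 754 defines the
// result as +Infinity rather than raising an error:
const demult = [1, 1, 1].map(c => c / 0);
console.log(demult[0]); // Infinity

// Writing back to an LDR buffer clamps each channel into [0, 1]:
const clamp01 = v => Math.min(1, Math.max(0, v));
console.log(demult.map(clamp01)); // [1, 1, 1]
```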
I proposed the […]
Right now in the three.js GLSL shaders, if there is a bright specular highlight on a transparent material, the highlight is not rendered correctly.
For example, let's say that at the end of the shader, gl_FragColor = (8, 8, 8, 0.1). This should result in a bright highlight of, let's say, (0.8, 0.8, 0.8) being written to the frame buffer. But in fact, the values of gl_FragColor are clamped into the range [0, 1] before the value is passed to gl.blendFunc, thus even if you write gl_FragColor = (8, 8, 8, 0.1), it is actually equivalent to writing gl_FragColor = (1.0, 1.0, 1.0, 0.1), because the values will be clamped, which will result in a value of (0.1, 0.1, 0.1) being written to the frame buffer, much lower than the correct (0.8, 0.8, 0.8) value.
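A sketch of the arithmetic above, with the clamp applied in the two different orders:

```javascript
const clamp01 = v => Math.min(1, Math.max(0, v));
const rgb = [8, 8, 8]; // bright specular highlight
const alpha = 0.1;

// What currently happens: gl_FragColor is clamped to [0, 1] first,
// then the SRC_ALPHA blend factor is applied.
console.log(rgb.map(c => clamp01(c) * alpha)); // [0.1, 0.1, 0.1] -- too dark

// What pre-multiplying in the shader achieves: multiply first, so the
// intended contribution survives the clamp.
console.log(rgb.map(c => clamp01(c * alpha))); // [0.8, 0.8, 0.8] -- intended highlight
```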
I found a way to get around this artificial clamping of brightness on semi-transparent objects. The trick is to pre-multiply. Thus one would do this:
gl_FragColor = vec4( gl_FragColor.rgb * gl_FragColor.a, gl_FragColor.a );
And then change the blending modes for the material as follows:
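The blending settings did not survive in this copy of the issue; based on the description (standard blending with a premultiplied source), they were presumably along these lines, using three.js blending constants -- a sketch, not the exact original:

```javascript
material.blending      = THREE.CustomBlending;
material.blendEquation = THREE.AddEquation;
material.blendSrc      = THREE.OneFactor; // instead of SrcAlphaFactor
material.blendDst      = THREE.OneMinusSrcAlphaFactor;
```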
The above settings do the standard blending but assume a pre-multiplied gl_FragColor as input, and they retain the whole range of acceptable intensities.
Here is a comparison of the improved results possible by using pre-multiplied alpha in the gl_FragColor for high intensity semi-transparent materials -- this change is already live on Clara.io:
I am unsure if there are side effects to using pre-multiplied alpha. I suspect that because gl_FragColor.a is unchanged, there are few side effects.