
Alpha Hash and Alpha2Coverage Implementation #40364

Merged · 1 commit · Nov 3, 2020

Conversation

marstaik
Contributor

@marstaik marstaik commented Jul 14, 2020

This commit adds the following features to the Vulkan rendering pipeline:
Alpha Hash, AlphaToCoverage + AlphaToOne, and "mipped" antialiased edges.

These techniques are very helpful for creating nice-looking hair and foliage, where alpha channels are used the most.

Preview Video:

https://youtu.be/zQKkUNvAAJ4

Alpha Hashing

Alpha Hashing is a dithered alpha-testing method from NVIDIA; you can find more documentation for it here
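The core idea can be sketched as follows. This is a simplified CPU-side model for illustration only, not the exact hash from the NVIDIA paper: `hash2` below is a hypothetical stand-in using a common shader idiom rather than the paper's object-space hash.

```python
import math

def hash2(x: float, y: float) -> float:
    """Toy position -> [0, 1) hash, a stand-in for the paper's stable
    object-space hash (this sin-based formula is a common shader idiom,
    not the one from the paper)."""
    h = math.sin(x * 12.9898 + y * 78.233) * 43758.5453
    return h - math.floor(h)  # fract()

def alpha_hash_survives(alpha: float, x: float, y: float) -> bool:
    """Instead of testing alpha against a fixed 0.5 cutoff, test it against
    a per-position random threshold; a fragment with alpha 0.3 then survives
    roughly 30% of the time, dithering the transparency."""
    return alpha >= hash2(x, y)
```

Because the threshold depends only on position, the dither pattern stays put as long as the surface does, which is the stability property discussed further down in this thread.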

AlphaToCoverage and AlphaToOne

AlphaToCoverage is a technique that takes the fragment shader's alpha channel and ANDs it with the MSAA sample mask to produce additional areas for anti-aliasing.

AlphaToOne is an additional flag: after the alpha channel has been used for the MSAA sample mask, the alpha value is set to the maximum.

It is the combination of these two techniques that allows for good-looking anti-aliased alpha testing.

But why both? Why is AlphaToCoverage by itself not enough?

In the preview video above, you may notice that when switching from "Alpha Edge Blend" to "Alpha Edge Clip" (the respective names in the UI for AlphaToCoverage and AlphaToCoverage + AlphaToOne), using AlphaToCoverage by itself still results in some bleed/halo effects around the textures. This is because AlphaToCoverage alone does not clamp the alpha value; that is, the resulting alpha of the fragment shader still has to be blended somehow, and in general this is an issue we have with alpha blending. All AlphaToCoverage does by itself is export the alpha channel as an area for MSAA to act upon.

This is where AlphaToOne comes in: after adding the alpha to the MSAA sample mask, the alpha value is set to the maximum, so no blending occurs anymore.

The final result is fragment color output with no alpha channel, yet MSAA smooths the areas where the alpha channel was.
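As a mental model (not the actual hardware behaviour, which typically dithers which samples are covered), alpha-to-coverage can be thought of as converting alpha into a count of covered MSAA samples, with alpha-to-one then removing the need for blending. A minimal sketch, with hypothetical function names:

```python
def alpha_to_coverage_mask(alpha: float, num_samples: int = 4) -> int:
    """Cover roughly alpha * N of the N MSAA samples; returns a bitmask.
    Real GPUs dither which samples get covered; this model just fills
    the low bits."""
    alpha = max(0.0, min(1.0, alpha))
    covered = round(alpha * num_samples)
    return (1 << covered) - 1  # `covered` low bits set

def shade_fragment(alpha: float, alpha_to_one: bool, num_samples: int = 4):
    """Returns (sample mask, output alpha) for one fragment."""
    mask = alpha_to_coverage_mask(alpha, num_samples)
    # AlphaToOne: once alpha has been baked into the sample mask,
    # force output alpha to 1 so no further blending happens.
    out_alpha = 1.0 if alpha_to_one else alpha
    return mask, out_alpha
```

For example, `shade_fragment(0.5, True)` covers half the samples (mask `0b0011`) and outputs alpha 1.0; the MSAA resolve then does the smoothing that blending would otherwise have to do.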

Alpha Edge Sharpening

This blog post by Ben Golus goes through some techniques for sharpening the Anti-aliasing edge with some mipmapping techniques.

This technique has been implemented as a function called compute_alpha_antialiasing_edge in scene_high_end.glsl, and its usage is explained below.

The Material3D system uses this by default when AlphaAntialiasing is enabled.
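The core of the Golus-style sharpening is to remap alpha so that it crosses the cutoff over roughly one pixel, using the screen-space derivative of alpha. The sketch below is a simplification of that idea, not the actual compute_alpha_antialiasing_edge code (which also involves the texture coordinate and size); in GLSL the derivative would come from fwidth(alpha), so it is passed in explicitly here:

```python
def clamp01(x: float) -> float:
    return max(0.0, min(1.0, x))

def sharpen_alpha(alpha: float, edge: float, fwidth_alpha: float) -> float:
    """Remap alpha so it crosses 0.5 exactly where alpha == edge, with a
    transition width of about one pixel (fwidth_alpha).
    Rough GLSL equivalent:
      clamp((alpha - edge) / max(fwidth(alpha), 1e-4) + 0.5, 0.0, 1.0)"""
    return clamp01((alpha - edge) / max(fwidth_alpha, 1e-4) + 0.5)
```

Far from the edge the result saturates to fully opaque or fully transparent, so only a one-pixel-wide band of intermediate alpha is left for MSAA to smooth.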

Render Flags and Usage

AlphaToCoverage, AlphaToCoverage + AlphaToOne

In the spatial shader, the render flags alpha_to_coverage or alpha_to_coverage_and_one can be added to process the result of the shader's ALPHA with either AlphaToCoverage, or AlphaToCoverage + AlphaToOne.

When either of these is set, the blend_mode is overridden to BLEND_MODE_ALPHA_TO_COVERAGE, which is better suited for blending.

Alpha Scissor

  • If ALPHA_SCISSOR_THRESHOLD (float, [0,1]) is set in the shader, then alpha values less than the threshold will be discarded.

Alpha Hash

  • If ALPHA_HASH_SCALE (float, recommended range (0,2]) is set in the shader, alpha hashing will be used, and alpha values less than the hash are discarded. The scale simply affects the dithering effect of the alpha hash.

Alpha Edge (Needs Texture)

Alpha Edge requires two variables:

  • ALPHA_ANTIALIASING_EDGE float [0,1] - Affects the edge point of the edge sharpening. If ALPHA_SCISSOR_THRESHOLD is set, it is added to ALPHA_ANTIALIASING_EDGE and the result is clamped to [0,1].
  • ALPHA_TEXTURE_COORDINATE vec2 - The texture coordinate to use. Think uv_coordinate * alpha_texture_size.

Please feel free to ask questions and test it out!
I am looking forward to your feedback.

Thanks,
Marios S.

Bugsquad edit: This closes godotengine/godot-proposals#1273.

@marstaik marstaik force-pushed the alpha2coverage_up branch 2 times, most recently from be3d760 to e52460e on July 15, 2020 02:15
@s-ilent

s-ilent commented Jul 16, 2020

Regarding the hashing function, I had some questions. Why did you choose this technique?
Is there a particular benefit in this use case to its guarantee of stability? The main benefit listed in this paper is that its stability works well for TAA, but Godot does not support TAA.
This is just conjecture, but the hashing function seems like it is expensive.

Setting the stability aside, the true random nature of the hash function means that it looks visually noisy, which IMO is worse than simpler, cheaper methods that use a screen-space noise sample offset by a temporal factor.
For example, sampling blue noise from a precomputed lookup texture (1), or R-sequence (2), or one of the other methods listed by Oculus in their article on dithering (3).
The end results look smoother both in still and in motion, and temporal offsets (as in 3) hide the instability, even if it leads to some shimmering.

  1. http://momentsingraphics.de/BlueNoise.html
  2. http://extremelearning.com.au/unreasonable-effectiveness-of-quasirandom-sequences/
  3. https://developer.oculus.com/blog/tech-note-shader-snippets-for-efficient-2d-dithering/
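For reference, one of the cheap screen-space patterns in this family is interleaved gradient noise (from Jimenez's "Next Generation Post Processing in Call of Duty: Advanced Warfare" talk), which needs no lookup texture at all. A sketch, assuming pixel coordinates as input:

```python
import math

def fract(x: float) -> float:
    return x - math.floor(x)

def interleaved_gradient_noise(px: float, py: float) -> float:
    """Screen-space noise in [0, 1) from pixel coordinates alone;
    the constants are the ones commonly quoted for Jimenez's IGN."""
    return fract(52.9829189 * fract(0.06711056 * px + 0.00583715 * py))
```

A temporal variant offsets px/py each frame, trading stability under motion for a smoother time-averaged result.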

@marstaik
Contributor Author

Hi @s-ilent, regarding the dithering functions: I am not really an expert in the matter, nor have I invested much time in comparing different dithering functions, but I can still give some input based on what I have read in the papers.

I think first and foremost, I must apologize because the preview video does not really demonstrate what the hash dithering is for - and although I believe you already understand the intent behind it, I'll clarify for others here.

The Alpha Hash function (or any dithering in general) really isn't useful for the hair case I showed in the video - as the hair was opaque and faded to transparency at the edges.

Dithering is a technique to fake some form of OIT transparency where blending is required. We can argue that in the hair example, blending isn't really required as much, we just want smoothing of the edges. The hair strands themselves aren't transparent. Where dithering does shine is in situations where we want to emulate some form of true transparency throughout the mesh/texture plane/etc.

Slide 8 of the NVIDIA slides shows what we're trying to solve:
(image from the slide; notice how the entirety of the facial hair has alpha throughout, not just the edges)

Many of those techniques do look nice, and it could be nice to experiment with a few of them. However, it's hard to grasp from the paper how well they work for texture transparency; the examples I see are all technical visualizations of the noise or seem to deal with shadows. I'd love to see some real-case examples where alpha transparency is simulated outside of shadows.

As for the benefit of the hashing function:

Is there a particular benefit in this use case to its guarantee of stability? The main benefit listed in this paper is that its stability works well for TAA, but Godot does not support TAA.

The temporal offsets can still lead to shimmering as you said - which to some people, or for some use cases, is visually worse than the noise generated by the hashing.

I don't believe that its only benefit is its stability for TAA, but its stability in general: it looks consistent under camera movement.
Temporal offsets still amount to mapping 3D camera movement onto a 2D plane. For fast-moving games/scenes, the shimmer can be extremely distracting.

One idea I got from reading those papers was perhaps pre-dithering transparency masks per texture instead of storing the smooth alpha values.

There are also some concerns I don't believe I saw addressed, such as: how does it behave when zooming out or looking from a distance?
Since the precomputed noise is essentially a noise texture, I would think that as the camera distance increases, aliasing of some kind would start degrading the quality. Can the textures be mipped well? How do the transitions look?

I have a lot of questions for these techniques! I would love to see more research into them.

At the end of the day though, the dithering functions are actually one of the easier things to do custom implementations for!
If the hashing isn't good enough, roll your own! You can always pass the result to Alpha2Coverage afterwards to get it smoothed; it may even reduce temporal shimmer quite noticeably!

@reduz
Member

reduz commented Oct 18, 2020

Being completely honest, I have the feeling there might be better alternatives to this problem that are not being explored. While for the most part I think this is good, I also agree that AlphaHash is mostly a technique devised for TAA, which does not make much sense in a forward renderer like the one in Godot.

One alternative I would like to investigate myself for this is alpha slicing and sorting, which should work quite well for hair and other transparent objects (especially two-sided ones). On import, faces that intersect each other are cut/sliced and then, before drawing, the index buffer is sorted using a compute shader (this can be enabled when geometry is closer than a given threshold). Aided by a depth prepass, results should be as good as the reference image.

Member

@reduz reduz left a comment

I think this can still be merged after being rebased, even if we implement other alternatives for improving alpha draw order. The only problem I see is what I mentioned here.

Review comment on scene/resources/material.h (outdated, resolved)
@marstaik
Contributor Author

I can rebase this and fix it up, just give me a few days :)

@marstaik
Contributor Author

Yeah, I would not use Alpha Hash specifically for things like hair. However, it can be very good for simulating anti-aliasing for far-away objects, so I don't think it's useless.

Review comment on scene/resources/material.h (outdated, resolved)
Member

@clayjohn clayjohn left a comment

I'm accepting for now despite my comment about the hash function. If it turns out to be an issue we can easily drop in a new hash function at any time. Preferably one recommended from this recent paper. http://www.jcgt.org/published/0009/03/02/

@akien-mga
Member

akien-mga commented Nov 1, 2020

Could you squash the commits into one (or more if relevant, but fixups should be melded into the original commit that requires them)?

Member

@clayjohn clayjohn left a comment

Docs need to be added before this is merged. You can generate the skeleton by running the engine from the command line in the root directory of the Godot project with the argument --doctool.

@marstaik
Contributor Author

marstaik commented Nov 2, 2020

Squashed and doc comments added.

Two review comments on doc/classes/BaseMaterial3D.xml (outdated, resolved)
@marstaik
Contributor Author

marstaik commented Nov 3, 2020

Fixed typos in docs.

@akien-mga akien-mga merged commit 873d461 into godotengine:master Nov 3, 2020
@akien-mga
Member

Thanks!

@@ -482,6 +512,8 @@ class BaseMaterial3D : public Material {
TextureChannel ao_texture_channel;
TextureChannel refraction_texture_channel;

AlphaAntiAliasing alpha_antialiasing_mode;
Member

Uninitialized variable:

scene/resources/material.cpp:1481:6: runtime error: load of value 3200171710, which is not a valid value for type 'BaseMaterial3D::AlphaAntiAliasing'
SUMMARY: UndefinedBehaviorSanitizer: undefined-behavior scene/resources/material.cpp:1481:6 in

@Calinou
Member

Calinou commented Jul 23, 2021

@marstaik How does alpha hashing compare to interleaved gradient noise? Is it still relevant when we have interleaved gradient noise in master now?

Also, I noticed there are references to both alpha_scissor and alpha_scissor_threshold in scene_forward_clustered.glsl:

#if defined(ALPHA_SCISSOR_USED)
if (alpha < alpha_scissor) {
discard;
}
#endif // ALPHA_SCISSOR_USED

@Calinou
Member

Calinou commented Mar 20, 2023

@marstaik If you're still around, could you give some example values to use for alpha_antialiasing_edge? It's not clear which value should be used depending on the use case. I've noticed that using a value lower than alpha_scissor_threshold will cause the effective threshold to vary depending on distance from the camera (likely due to texture mipmaps). I assume this is intended to prevent vegetation from becoming too transparent at a distance.

In the future, couldn't the value of alpha_antialiasing_edge be automatically guessed based on the alpha scissor threshold?

@marstaik
Contributor Author

@Calinou I haven't touched this in a while, but you can think of the alpha edge value as the width of the transparency band around the object that is then marked for MSAA. A higher value should give you better visuals, but it also marks more pixels for the MSAA process, and thus costs more.

If you are familiar with a radial soft brush like in Photoshop, then think of the edge value as the band from the solid center to the faded-away edge that disappears. The best possible result would be to mark the entire range of faded pixels for MSAA, so they all get blended, but then the area of the circle that needs MSAA also goes up drastically, because it's an area (the circumference of the object times "the width" of the edge band).

The alpha scissor is a hard cutoff. If it's 0.5, then any pixels with alpha less than 0.5 are discarded. The alpha edge then starts working offset from that edge.

Back to the brush example: if I set the scissor to 0.5, then the outer half of the blended radius of the brush would completely disappear, as we scissor it off. We are then left with pixels of 100% to 50% opaqueness, or 0% to 50% transparency, depending on your viewpoint. The alpha edge value then only has those values left to play with to determine the band.

I would think it's incredibly difficult to determine automatically. There are different quality levels to consider (such as wanting very high-quality player hair vs. random foliage), but there are also input-texture issues to consider. The artist dictates how much "edge" there is to work with depending on how fast they decided to fade the values to zero; i.e., did they step the alpha down by 10 percent every pixel, meaning a gradient of 10 pixels from opaque to fully transparent, or by 1 percent, meaning 100 pixels of "edge"? That, in addition to personal preference, then determines how much you need to blend your edge to look acceptable.

@marstaik
Contributor Author

Also, with a few years of hindsight, it might be better to have a boolean toggle to make the alpha edge relative to the alpha scissor (and default it to on), so that it is scaled by the alpha scissor and much more intuitive.

i.e., if the scissor is 0.5 and there are only half as many pixels left, then instead of being additive we can remap the alpha edge [0,1] range onto [0, 0.5]
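A sketch of that remap, assuming the proposed "relative" mode is a simple linear rescale of the edge value onto the alpha range that survives the scissor (function names here are hypothetical, not from the engine):

```python
def clamp01(x: float) -> float:
    return max(0.0, min(1.0, x))

def threshold_additive(scissor: float, edge: float) -> float:
    """Current behaviour: edge is added to the scissor and clamped."""
    return clamp01(scissor + edge)

def threshold_relative(scissor: float, edge: float) -> float:
    """Proposed: rescale edge onto the remaining alpha range first,
    so edge = 1.0 always means 'the whole surviving band'."""
    return scissor + edge * (1.0 - scissor)
```

With scissor = 0.5, a relative edge of 1.0 contributes 0.5, matching the [0, 0.5] remap described above.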


Successfully merging this pull request may close these issues.

Add alpha-to-coverage support (MSAA for alpha-tested materials)