WebGLTile: Properly render semi-transparent tiles #14983
This uses the default GL_LESS depth function.
We still need to update the depth layer. This doesn't change normal rendering behavior, either.
This changes the WebGL tile renderer to render the most zoomed-in tiles first, using the depth buffer to clip lower zoom levels of tiles without blending any transparent tiles. Any tile that uses alpha is drawn last, with the highest possible Z, to allow blending with the lower layer of already-drawn tiles.
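The ordering described above can be sketched as a small helper (a minimal illustration, not the actual renderer code; the tile shape and function name are hypothetical): opaque tiles are sorted so the highest zoom level draws first and the depth test clips lower levels, while tiles using alpha are deferred to the end so they can blend over what is already drawn.

```javascript
// Hypothetical helper: order tiles for depth-clipped rendering.
// Opaque tiles: highest zoom first, so the depth test rejects fragments
// from lower zoom levels wherever a higher-zoom tile was already drawn.
// Semi-transparent tiles: drawn last, so they blend with the opaque result.
function orderTilesForDepthClipping(tiles) {
  const opaque = tiles.filter((t) => t.alpha >= 1);
  const translucent = tiles.filter((t) => t.alpha < 1);
  // Sort opaque tiles by zoom, descending (highest zoom drawn first).
  opaque.sort((a, b) => b.zoom - a.zoom);
  return opaque.concat(translucent);
}
```

For example, given one semi-transparent tile and three opaque tiles at zooms 3, 4, and 5, the opaque tiles come out ordered 5, 4, 3, followed by the semi-transparent tile.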
📦 Preview the website for this branch here: https://deploy-preview-14983--ol-site.netlify.app/
Thanks for digging into this, @puckipedia. I’m not at a full-sized keyboard right now, so I can’t tell if that rendering test failure is related. It should be possible to see the attached actual/expected result images (though I always struggle to find where GitHub hides the attachment links).
Looks like an actual regression, whoops! I didn't have Chrome locally and couldn't easily use the Puppeteer binary, so I ended up testing with Firefox, and I think I missed this test. Will look later.
Actual result (from https://github.com/openlayers/openlayers/suites/14913615489/artifacts/850052736): it seems to primarily be an opacity problem.
The vector tile renderer depends on being able to render at the same depth multiple times, for different vector tile layers.
Ah, I'd tested with vector tiles a bit, but forgot that they render multiple layers in one tile, and thus don't have the depth buffer reset in between. Adjusting to
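The comment above is cut off, so what exactly was adjusted isn't stated. One common way to let the same depth be drawn multiple times (as the vector tile renderer needs) is switching the depth comparison from `LESS` to `LEQUAL`; whether that is the change made here is an assumption. The difference can be modeled as a tiny pure function:

```javascript
// Models the fixed-function depth test for two comparison modes.
// With 'LESS', a second draw at the same depth fails the test, which
// breaks renderers that draw several layers at one depth; with
// 'LEQUAL' it passes. (Assumed fix, not confirmed by the thread.)
function depthTestPasses(func, fragmentDepth, storedDepth) {
  switch (func) {
    case 'LESS':
      return fragmentDepth < storedDepth;
    case 'LEQUAL':
      return fragmentDepth <= storedDepth;
    default:
      throw new Error('unsupported depth function: ' + func);
  }
}
```

So a repeated draw at depth 0.5 is rejected under `LESS` but accepted under `LEQUAL`, which matches the multi-layer vector tile requirement described above.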
Thanks for this very nice contribution, @puckipedia!
When using the WebGL raster tile renderer with semi-transparent tiles (e.g. overlay layers), I noticed that the tiles aren't properly clipped by tiles from other zoom levels, as happens with the canvas renderer. (Sample used: https://gist.github.com/puckipedia/a2a191a6684c5d9d238eed4f78133756)
Video of the situation before this PR
webgl-tile-before.mp4
Not only the zoom levels involved in the fade but also lower zoom levels are visible, causing unnecessary visual noise.
This pull request solves that in a somewhat heavy-handed way: it adds proper depth testing to the WebGL helper as well as to the WebGL tile renderer:
Video of the situation after this PR
webgl-tile-after.mp4
As you can see, compared to the original situation, there's always at most two tiles being blended.
The PR first introduces the depth buffer to the `PostProcessingPass` and `RenderTarget`, and allows enabling depth testing (with the default OpenGL settings) on the `WebGLHelper`. This tries to avoid any conflict with existing code using the helper, and as such is entirely opt-in; if depth isn't enabled, the depth renderbuffer is ignored entirely.

463b5b5 then adjusts the renderer to render from the highest zoom level to the lowest, making use of the depth buffer to clip lower zoom levels so they only render where higher-zoom tiles haven't been drawn. Afterwards, it renders all the tiles with an alpha lower than 1, which allows them to blend with the existing tiles. This is necessary to preserve proper transparency behavior.
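One way to picture the high-to-low-zoom depth assignment (a sketch under assumptions; the function and constant names here are illustrative, not the PR's actual code) is to map each zoom level to a clip-space depth in (0, 1], with higher zooms nearer, and reserve the nearest depth for the final pass that draws semi-transparent tiles with blending:

```javascript
// Hypothetical depth assignment: higher zoom levels get smaller
// (nearer) depth values, so the default LESS depth test lets them
// occlude lower zoom levels drawn afterwards.
function depthForZoom(zoom, maxZoom) {
  // zoom === maxZoom -> nearest opaque depth (just above 0)
  // zoom === 0       -> 1 (farthest)
  return (maxZoom - zoom + 1) / (maxZoom + 1);
}

// Nearest possible Z, used for the final blended pass over
// semi-transparent tiles, drawn after all opaque tiles.
const ALPHA_PASS_DEPTH = 0;
```

With `maxZoom = 5`, zoom 5 maps to 1/6 and zoom 0 maps to 1, so depth decreases monotonically as zoom increases, and `ALPHA_PASS_DEPTH` sits in front of every opaque tile.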