Transparent Objects Show Lower Scene #438
Comments
|
I'll wait for the example (please make it as simple as possible). |
|
Oops, accidentally closed the issue. Re-opened. |
|
OK, here we go. https://www.dropbox.com/s/0cm2ye0bst3tn4r/ShaderTransparencyIssues.blend?dl=0 The example blend file should work with any project that has 2D filters in the bdx/shaders/2d folder. Comment out the |
|
I'll preface this by saying that I don't really understand the underlying details of how OpenGL actually hashes this out, so I could be wrong: The blending functions, set via the BlendingAttribute, are
So, if Sacky is semi-transparent (0.5 alpha, for example), and the Platform is opaque (1.0 alpha), the alpha for the resulting pixels is:
That would explain why you can see the background scene through Sacky. If there was a way to specify |
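To make that arithmetic concrete, here's a small stand-alone sketch (hypothetical numbers, not code from the project) of how the standard glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) equation applies to the alpha channel too:

```java
// Sketch: the blend equation result = src * srcFactor + dst * dstFactor
// is applied to the alpha channel as well as the color channels.
public class AlphaBlendDemo {
    public static void main(String[] args) {
        float srcA = 0.5f; // semi-transparent Sacky
        float dstA = 1.0f; // opaque Platform already in the framebuffer
        // srcFactor = GL_SRC_ALPHA, dstFactor = GL_ONE_MINUS_SRC_ALPHA:
        float outA = srcA * srcA + dstA * (1.0f - srcA);
        // 0.75 -- the framebuffer pixel is itself partially transparent,
        // so the background scene shows through when it's composited.
        System.out.println(outA);
    }
}
```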
|
I asked on the forums, and it's dead over there. :< I guess the problems we keep running into are the hard problems nobody wants to / can solve, haha. Anyway, I think you've come to the same conclusion I have after doing some research, too - I think we need that separate blending function. I think we could do it ourselves, but it'd be a lot of extensions and code for a single problem. I wonder if a custom shader would be able to solve things? I don't know, though, because blending modes exist "outside" of shaders, I think? Like, you can't use a shader to blend an object against another? EDIT: I guess we can also submit a feature request to LibGDX's Github page and hope it gets implemented at some point? |
|
forums have become a desert recently... |
|
Yeah, I dunno. Our community's small, but I think we do well to answer whatever questions come up...? Maybe we have to try harder too, haha. In any case, maybe it's possible to do this by manually setting the blend mode ourselves for each object and not using the BlendingAttribute at all? I'll have to try it, but I think the renderer will set the blend mode only if a material has a BlendingAttribute. So, I think we'd have to handle setting the blending and alpha ourselves, and sort transparent objects back-to-front (because the renderer won't do it for us without the BlendingAttribute). |
|
If you made relevant threads in other places, it would probably be a good idea to provide a link here (so everyone can keep up). I kinda assumed that LibGDX forums would be dead, so I made a thread on the OpenGL forums. As you can see there, it seems premultiplying alpha in all the textures would work (with a different blend mode), but I'm not sure if doing that is a good idea, because now everyone needs to be aware of the fact that it's premultiplied, to avoid surprises (because typically, people assume color is true color, and alpha is not yet applied). I don't know if this solution would cause other problems, but I guess it's worth a try. Beyond that, we basically need glBlendFuncSeparate.
I don't think the problems themselves are inherently difficult - we get stuck because the LibGDX rendering system doesn't expose all the OpenGL functionality we want. Anyway, the person who wrote the libgdx 3D system is @xoppa, so I'm guessing he'd probably be the person to ask about this stuff.
As far as I know, yes, and it appears it's not something that you can replace (directly) with shaders: http://stackoverflow.com/questions/11633950/opengl-blend-modes-vs-shader-blending |
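For comparison, a quick sketch (my own numbers, purely illustrative) of why premultiplied alpha with glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) would leave the framebuffer's alpha channel correct:

```java
// Sketch: destination alpha after blending a 0.5-alpha source over an
// opaque destination, straight alpha vs. premultiplied alpha.
public class PremultDemo {
    public static void main(String[] args) {
        float srcA = 0.5f, dstA = 1.0f;
        // Straight alpha, glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA):
        float straight = srcA * srcA + dstA * (1f - srcA);
        // Premultiplied, glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
        // this is the Porter-Duff "over" alpha: a_out = a_src + a_dst * (1 - a_src).
        float premult = srcA + dstA * (1f - srcA);
        System.out.println(straight + " " + premult); // 0.75 1.0
    }
}
```

So the opaque pixels stay opaque in the framebuffer, at the cost of everyone having to remember the textures are premultiplied.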
|
I haven't seen the code. But if you want the alpha to be fully opaque when rendering to the screen, then make the alpha one when rendering to the screen. There's a reason you're rendering to a framebuffer first, so you probably already are using a custom shader for the spritebatch. Add this at the end of your fragment shader's main method:

if (gl_FragColor.a <= 0.1) // pick an alpha test threshold
    discard;
gl_FragColor.a = 1.0; |
|
Btw, please let me know any suggestions you have for the 3D api. Note, however, that libgdx exposes everything that OpenGL ES 2.x has to offer. There's nothing that libgdx or the 3D api is hiding. |
|
Hey, thanks for the quick response. We don't want the alpha to be fully opaque. I explain (in better detail) what we want, and the problem we're facing, in this thread. It seems that having a separate blending function for alpha would resolve the issue, but, as far as I can tell, there's no way to specify that via BlendingAttribute, is there? |
|
Sure, you can use it however you like. But BlendingAttribute only allows you to set one source and dest function, so it is probably more convenient to use a custom attribute for that. |
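For reference, a custom attribute along the lines described here might look something like this (a sketch against LibGDX's Attribute API; the class and field names are my own invention, and a custom shader would have to read it and call glBlendFuncSeparate accordingly):

```java
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.g3d.Attribute;

// Hypothetical attribute carrying a separate blend function for the
// alpha channel. Attributes are passive: this only stores the values.
public class SeparateBlendingAttribute extends Attribute {
    public static final String Alias = "separateBlending";
    public static final long Type = register(Alias);

    public int srcAlphaFunc;
    public int dstAlphaFunc;

    public SeparateBlendingAttribute(int srcAlphaFunc, int dstAlphaFunc) {
        super(Type);
        this.srcAlphaFunc = srcAlphaFunc;
        this.dstAlphaFunc = dstAlphaFunc;
    }

    public SeparateBlendingAttribute() {
        // Porter-Duff "over" for the alpha channel
        this(GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA);
    }

    @Override
    public Attribute copy() {
        return new SeparateBlendingAttribute(srcAlphaFunc, dstAlphaFunc);
    }

    @Override
    public int compareTo(Attribute o) {
        if (type != o.type) return (int) (type - o.type);
        SeparateBlendingAttribute other = (SeparateBlendingAttribute) o;
        if (srcAlphaFunc != other.srcAlphaFunc)
            return srcAlphaFunc - other.srcAlphaFunc;
        return dstAlphaFunc - other.dstAlphaFunc;
    }
}
```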
|
Can you please be a bit more specific? I don't know how to create a custom attribute that would do what we need. I looked at the implementation of the BlendingAttribute, but there's nothing in there which references glBlendFunc, so I assume that there's some other part of the system that actually calls that OpenGL function, and that we would need to change that to glBlendFuncSeparate ... ? If this is something that can be done without patching LibGDX, I would really appreciate it if you could specifically tell us what we actually need to do to get this working. Thanks. |
|
Never assume. Every single part of the libgdx 3D api is extendable by design. Have a look at: https://xoppa.github.io/blog/using-materials-with-libgdx/ and https://github.com/libgdx/libgdx/wiki/ModelBatch. You might want to start at https://xoppa.github.io/blog/basic-3d-using-libgdx/ and work your way through those tutorials if you're new to the libgdx 3D api. |
|
To be a bit more specific (and you'll also read this in those links): attributes are used to specify properties. They don't have any functionality (like calling a GL method, e.g.); they're passive. The shader typically implements the actual functionality that is required to render something. If you want to change the blend function, then you do that in the shader, depending on the attributes. Here's an example: https://github.com/libgdx/libgdx/blob/master/tests/gdx-tests/src/com/badlogic/gdx/tests/g3d/ShaderTest.java. But the tutorials and wiki probably give a better idea of how to do that. |
|
Here's the wiki on creating attributes btw: https://github.com/libgdx/libgdx/wiki/Material-and-environment. |
That's my point: We can't just extend BlendingAttribute, and I can't find anything in your links that clearly points out what exactly we need to change, and where, so that the system will use glBlendFuncSeparate instead of glBlendFunc. Looking at the DefaultShader source, it seems like we would also have to modify RenderContext.setBlending, in addition to using a custom shader, because that's what ultimately sets the blending function in OpenGL ... Right?
Ah, sure. Here it is. To be fair, it's not that LibGDX's forums are dead; it's just that almost nobody's responded in several days, so it just felt kinda like it, despite the large number of views, I guess. Fortunately, one person has responded and seems to have confirmed the issue, which is nice.
That's what I was looking at, as well. P.S. Thanks for stopping in, @xoppa. |
|
As you've read in the wiki of RenderContext: "Only a small subset of the GL calls is implemented, but you can extend it to add additional calls." If you'd like to have a setSeparateBlended function or something, then you can add that. Note, btw, that you don't have to use it if you don't like that. You can safely call glBlendFuncSeparate in your shader without having to modify anything. Anyways, make sure to include your SSCCE. It's a lot easier to talk if you show what you've tried and where you actually got stuck, instead of guessing and assuming. |
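As a sketch of what calling glBlendFuncSeparate from a shader could look like (my own guess at a minimal approach, not code from either project; a real version would read the functions from a custom attribute, and RenderContext's cached blend state may interact with this):

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.g3d.Renderable;
import com.badlogic.gdx.graphics.g3d.shaders.DefaultShader;

// Hypothetical shader that sets a separate blend function for the alpha
// channel before falling through to the default render path.
public class SeparateBlendShader extends DefaultShader {
    public SeparateBlendShader(Renderable renderable) {
        super(renderable);
    }

    @Override
    public void render(Renderable renderable) {
        // Color blends as usual; alpha accumulates with Porter-Duff "over",
        // so the framebuffer's alpha channel stays correct.
        Gdx.gl.glBlendFuncSeparate(
            GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA, // RGB
            GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA);      // alpha
        super.render(renderable);
    }
}
```

Note the caveat: since RenderContext caches blend state and DefaultShader binds the material inside render, the separate call may need to be reapplied after the context sets its own state, or routed through an extended RenderContext as discussed above.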
I didn't try anything yet, because I don't know what I actually need to do. The documentation (at least that which you referenced) doesn't enumerate which classes I'm supposed to extend, and which methods I'm supposed to override, in which parts of the system, in order to get the results I want. That's why I'm asking you to at least clarify that for us - Or, if this is really as trivial as you're making it sound, you could just show us the code that would allow us to blend alpha separately. That would save us a lot of time, and we would really appreciate that. |
|
Sorry, perhaps it is because I wrote the documentation that I don't see what you aren't seeing. Please let me know which part is not clear so I can clarify it. Every part of the 3D api is customizable. You can create your own RenderableProvider, Shader, ShaderProvider, RenderableSorter and TextureBinder. You can also extend all the defaults: Renderable, ModelInstance, DefaultShader, DefaultShaderProvider, DefaultRenderableSorter, RenderContext and DefaultTextureBinder, as well as the base classes (e.g. BaseShaderProvider, etc.). Btw, I'm sure I missed some classes, but as said, you can find those on the wiki. Likewise, you can create custom attributes or customize the material and environment as you need. The wiki does not enumerate which classes you should extend for your specific use-case, because the wiki is not written for your specific use-case. Instead it describes every part, so you can decide which part you want to extend. You said that you want to call an OpenGL method. Calling OpenGL methods is typically part of performing a render call, so that would be in your shader class. As said, if you provide an SSCCE, I can actually look at it. Not knowing how to solve your problem is not a reason to not show your problem to someone who wants to help you with it. If you don't want to create an SSCCE, then I don't think I can help you in more detail. I'm afraid that I don't have the time to create one for you. Have a look at: Btw, it looks like something that might be interesting for others as well. So if you provide an SSCCE, perhaps we can add it to the tests. |
|
OK, I've been tooling around with this for a while, but have yet to get it to work for some reason - just going the basic route of extending DefaultShader to make our own is simply not working. Here's the entirety of my custom shader file (named BDXDefaultShader):

package com.nilunder.bdx.utils;

import com.badlogic.gdx.graphics.g3d.Renderable;
import com.badlogic.gdx.graphics.g3d.shaders.DefaultShader;

public class BDXDefaultShader extends DefaultShader {

    public BDXDefaultShader(Renderable renderable) {
        super(renderable);
    }
}

When I try to use this custom shader in our ShaderProvider:

private static class BDXShaderProvider extends DefaultShaderProvider {

    @Override
    public Shader getShader(Renderable renderable) {
        if (matShaders.containsKey(renderable.material.id))
            return matShaders.get(renderable.material.id).getShader(renderable);
        //return super.getShader(renderable);
        return new BDXDefaultShader(renderable);
    }
}

I get an error, which is:

Any ideas? I have no idea what's going wrong - it seems like the BaseShader can't set the u_time uniform property because it's set to null? I dunno, I'm not overriding or "not doing" anything, so I don't see where the issue's arising? P.S. I personally had no idea what a "SSCCE" was until you mentioned it - I thought you were misspelling "source" at first. Please use that as an idea that shows you how much you need to break stuff down, haha. |
|
It looks like you're extending DefaultShaderProvider (which extends BaseShaderProvider), but aren't using its implementation. Then you might as well implement the ShaderProvider interface directly instead. Because of this, you need to initialize (and properly dispose) the shader yourself. See the wiki for an example showing how to extend DefaultShaderProvider. See the source of BaseShaderProvider for an example of how you can properly initialize and dispose the shader if you prefer not to use its functionality. |
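Concretely, the missing piece in the provider shown earlier appears to be the init() call that BaseShaderProvider normally performs when it creates a shader (a sketch, reusing the BDX class names from the earlier comment; the matShaders branch is omitted for brevity):

```java
import com.badlogic.gdx.graphics.g3d.Renderable;
import com.badlogic.gdx.graphics.g3d.Shader;
import com.badlogic.gdx.graphics.g3d.utils.DefaultShaderProvider;

// Sketch: create, initialize, and track the custom shader ourselves,
// since we bypass BaseShaderProvider's getShader implementation.
public class BDXShaderProvider extends DefaultShaderProvider {

    @Override
    public Shader getShader(Renderable renderable) {
        Shader shader = new BDXDefaultShader(renderable);
        // Without init(), uniforms like u_time are never registered,
        // which would explain the null-uniform error above.
        shader.init();
        // Add to BaseShaderProvider's list so dispose() cleans it up.
        shaders.add(shader);
        return shader;
    }
}
```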
|
OK, I think I got it. Thanks for your help, @xoppa! Now, an issue is that this works fine, but only for the "standard" alpha-enabled blend mode. Additive blending doesn't work, for example, against the background scene, though it works fine for the current scene. The blended object just shows up as an unblended, transparency-enabled object against the background. Here's what it looks like when an additive blending object shows up with full alpha transparency: As you can see, the portion of the sprite against the background scene (with the white stripes) shows up as just opaque, while the portion visible against the current scene shows up as additive blended. In the meantime, I'll submit a PR. |
|
Would this mean that the Profiler's background will lose transparency? |
|
No, it's fine. The profiler's background still has transparency. |
|
Well, this was merged, and the core problem is solved. I think the complete resolution would be to create our own BlendingAttribute and use it in the shader to blend apart from the built-in one (since the built-in one makes the shader use glBlendFunc). However, I believe this would mean manually sorting blended / transparent objects ourselves, so we'll stop here for now, I guess. Thanks for the help, @xoppa. I'll close this now. |

Hey, there. So this is a bug that I happened upon while solving another bug.
Essentially, transparent objects (or objects / sprites with transparent pixels) show the scene below when a shader is used on the scene.
As far as I can tell, it's not the shader, as I bypassed actually drawing the scene using the shader - it has something to do with drawing objects to the RenderBuffer and the blending mode on materials. I don't think it's how we draw the RenderBuffer to other RenderBuffers or the screen, as only transparent objects are the problem ones - opaque objects show up fine.
Perhaps I'll be able to get back to this to attach an example later.