Transparent Objects Show Lower Scene #438

Closed
SolarLune opened this issue Sep 28, 2015 · 28 comments

@SolarLune (Contributor) commented Sep 28, 2015

Hey, there. So this is a bug that I happened upon while solving another bug.

Essentially, transparent objects (or objects / sprites with transparent pixels) show the scene below when a shader is used on the scene.

As far as I can tell, it's not the shader, as I bypassed actually drawing the scene using the shader - it has something to do with drawing objects to the RenderBuffer and the blending mode on materials. I don't think it's how we draw the RenderBuffer to other RenderBuffers or the screen, as only transparent objects are the problem ones - opaque objects show up fine.

Perhaps I'll be able to get back to this to attach an example later.

@GoranM (Owner) commented Sep 29, 2015

I'll wait for the example (please make it as simple as possible).

@GoranM GoranM closed this Sep 29, 2015
@GoranM GoranM reopened this Sep 29, 2015
@GoranM (Owner) commented Sep 29, 2015

Oops, accidentally closed the issue.

Re-opened.

@SolarLune (Contributor, Author) commented Oct 1, 2015

OK, here we go.

https://www.dropbox.com/s/0cm2ye0bst3tn4r/ShaderTransparencyIssues.blend?dl=0

The example blend file should work with any project that has 2D filters in the bdx/shaders/2d folder. Comment out the scene.filters.add line and the object becomes easy to see. Leave it in, and the object is visually mixed with the other scene, and is far darker than it should be.

@SolarLune (Contributor, Author) commented Oct 6, 2015

Here's a screenshot showing the problem:

[screenshot: screen shot 10-05-15 at 10 43 pm]

I've screenshotted a tweaked version of the example. As you can see, the background scene and the background cube are visible through Sacky, even though you should only be able to see the ground or Sacky.

@GoranM (Owner) commented Oct 7, 2015

I'll preface this by saying that I don't really understand the underlying details of how OpenGL actually hashes this out, so I could be wrong:

The blending functions, set via the BlendingAttribute, are GL_SRC_ALPHA for source and GL_ONE_MINUS_SRC_ALPHA for destination. This means that the following calculation is applied to produce the resulting pixels in the frameBuffer (s = source, d = destination, r = result):

(sR*sA) + (dR*(1-sA)) = rR  
(sG*sA) + (dG*(1-sA)) = rG
(sB*sA) + (dB*(1-sA)) = rB
(sA*sA) + (dA*(1-sA)) = rA

So, if Sacky is semi-transparent (0.5 alpha, for example), and the Platform is opaque (1.0 alpha), the alpha for the resulting pixels is:

(0.5*0.5) + (1.0*(1.0-0.5)) = 0.75

That would explain why you can see the background scene through Sacky.
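The arithmetic above can be sanity-checked with a small plain-Java sketch (no GL context involved; the class and method names here are just for illustration, not part of BDX or libgdx):

```java
public class BlendCheck {

    // One channel of GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blending:
    // result = src * srcAlpha + dst * (1 - srcAlpha)
    static float blend(float src, float dst, float srcAlpha) {
        return src * srcAlpha + dst * (1f - srcAlpha);
    }

    public static void main(String[] args) {
        // Alpha channel: semi-transparent Sacky (0.5) over the opaque Platform (1.0)
        float rA = blend(0.5f, 1.0f, 0.5f);
        System.out.println(rA); // 0.75 -- the framebuffer pixel ends up semi-transparent
    }
}
```

Since 0.75 < 1.0, the framebuffer pixel is no longer opaque where Sacky overlaps the platform, which is exactly where the lower scene bleeds through.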

If there were a way to specify glBlendFuncSeparate via BlendingAttribute, I think GL_ZERO for the alpha source and GL_ONE for the alpha destination would basically resolve the issue, but it doesn't seem to be available.
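What glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ZERO, GL_ONE) would compute can be modelled in plain Java (a sketch only, no GL calls; names are illustrative):

```java
public class SeparateBlendCheck {

    // Color channels: still GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA
    static float blendColor(float src, float dst, float srcAlpha) {
        return src * srcAlpha + dst * (1f - srcAlpha);
    }

    // Alpha channel: GL_ZERO for source, GL_ONE for destination,
    // so the destination alpha is simply preserved.
    static float blendAlpha(float srcAlpha, float dstAlpha) {
        return srcAlpha * 0f + dstAlpha * 1f;
    }

    public static void main(String[] args) {
        // 0.5-alpha Sacky over the 1.0-alpha Platform: the framebuffer
        // alpha stays fully opaque, so the lower scene can't show through.
        System.out.println(blendAlpha(0.5f, 1.0f)); // 1.0
    }
}
```

The colors still blend normally; only the alpha written to the framebuffer changes, which is why this would fix the see-through without affecting how the object looks against its own scene.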

@SolarLune (Contributor, Author) commented Oct 8, 2015

I asked on the forums, and it's dead over there. :< I guess the problems we keep running into are the hard problems nobody wants to / can solve, haha.

Anyway, I think you've come to the same conclusion I have after doing some research, too - I think we need that separate blending function. I think we could do it ourselves, but it'd be a lot of extensions and code for a single problem.

I wonder if a custom shader would be able to solve things? I don't know, though, because blending modes exist "outside" of shaders, I think? Like, you can't use a shader to blend an object against another?

EDIT: I guess we can also submit a feature request to LibGDX's Github page and hope it gets implemented at some point?

@xGnoSiSx commented Oct 8, 2015

forums have become a desert recently...

@SolarLune (Contributor, Author) commented Oct 8, 2015

Yeah, I dunno. Our community's small, but I think we do well to answer whatever questions come up...? Maybe we have to try harder too, haha.

In any case, maybe it's possible to do this by manually setting the blend mode ourselves for each object and not using the BlendingAttribute at all? I'll have to try it, but I think the renderer only sets the blend mode if a material has a BlendingAttribute. So, I think we'd have to handle setting the blending and alpha ourselves, and sort transparent objects back-to-front (because the renderer won't do it for us without the BlendingAttribute).
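The back-to-front sorting mentioned above can be sketched in plain Java (the Obj class and its distToCamera field are hypothetical stand-ins, not BDX or libgdx types):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class TransparencySort {

    // Hypothetical stand-in for a scene object; only camera distance matters here.
    static class Obj {
        final String name;
        final float distToCamera;

        Obj(String name, float distToCamera) {
            this.name = name;
            this.distToCamera = distToCamera;
        }
    }

    // Transparent objects must be drawn back-to-front (farthest first),
    // so each one blends against everything already drawn behind it.
    static void sortBackToFront(List<Obj> objs) {
        objs.sort(Comparator.comparingDouble((Obj o) -> o.distToCamera).reversed());
    }

    public static void main(String[] args) {
        List<Obj> objs = new ArrayList<>();
        objs.add(new Obj("Sacky", 2f));
        objs.add(new Obj("BackgroundCube", 10f));
        objs.add(new Obj("Ground", 5f));
        sortBackToFront(objs);
        for (Obj o : objs)
            System.out.println(o.name); // BackgroundCube, Ground, Sacky
    }
}
```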

@GoranM (Owner) commented Oct 8, 2015

If you made relevant threads in other places, it would probably be a good idea to provide a link here (so everyone can keep up).

I kinda assumed that LibGDX forums would be dead, so I made a thread on the OpenGL forums. As you can see there, it seems premultiplying alpha in all the textures would work (with a different blend mode), but I'm not sure if doing that is a good idea, because now everyone needs to be aware of the fact that it's premultiplied, to avoid surprises (because typically, people assume color is true color, and alpha is not yet applied).

I don't know if this solution would cause other problems, but I guess it's worth a try.

Beyond that, we basically need glBlendFuncSeparate.

I guess the problems we keep running into are the hard problems nobody wants to / can solve

I don't think the problems themselves are inherently difficult - We get stuck because the LibGDX rendering system doesn't expose all the OpenGL functionality we want.

Anyway, the person who wrote the libgdx 3D system is @xoppa, so he's probably the person to ask about this stuff.

I don't know, though, because blending modes exist "outside" of shaders, I think?

As far as I know, yes, and it appears it's not something that you can replace (directly) with shaders: http://stackoverflow.com/questions/11633950/opengl-blend-modes-vs-shader-blending

@xoppa commented Oct 8, 2015

I haven't seen the code. But if you want the alpha to be fully opaque when rendering to the screen, then make the alpha one when rendering to the screen. There's a reason you're rendering to a framebuffer first, so you probably already are using a custom shader for the spritebatch. Add this at the end of your fragment shader's main method:

if (gl_FragColor.a <= 0.1) // pick an alpha test number
    discard;
gl_FragColor.a = 1.0;

@xoppa commented Oct 8, 2015

Btw, please let me know any suggestions you have for the 3D api. Note, however, that libgdx exposes everything that opengl es 2.x has to offer. There's nothing that libgdx or the 3d api is hiding.

@GoranM (Owner) commented Oct 8, 2015

Hey, thanks for the quick response.

We don't want the alpha to be fully opaque. I explain (in better detail) what we want, and the problem we're facing, in this thread.

It seems that having a separate blending function for alpha would resolve the issue, but, as far as I can tell, there's no way to specify that via BlendingAttribute, is there?

@xoppa commented Oct 8, 2015

Sure, you can use it however you like. But BlendingAttribute only allows you to set one source and dest function, so it is probably more convenient to use a custom attribute for that.

@GoranM (Owner) commented Oct 8, 2015

Can you please be a bit more specific?

I don't know how to create a custom attribute that would do what we need. I looked at the implementation of the BlendingAttribute, but there's nothing in there which references glBlendFunc, so I assume that there's some other part of the system that actually calls that OpenGL function, and that we would need to change that to glBlendFuncSeparate ... ?

If this is something that can be done without patching LibGDX, I would really appreciate it if you could specifically tell us what we actually need to do to get this working.

Thanks.

@xoppa commented Oct 8, 2015

Never assume. Every single part of the libgdx 3d api is extendable by design. Have a look at: https://xoppa.github.io/blog/using-materials-with-libgdx/ and https://github.com/libgdx/libgdx/wiki/ModelBatch. you might want to start at https://xoppa.github.io/blog/basic-3d-using-libgdx/ and work your way through those tutorials when you're new to the libgdx 3D api.

@xoppa commented Oct 8, 2015

To be a bit more specific (and you'll also read this in those links): attributes are only used to specify values. They don't have any functionality (like calling a gl method); they're passive. The shader typically implements the actual functionality that is required to render something. If you want to change the blend function, then you do that in the shader, depending on the attributes. Here's an example: https://github.com/libgdx/libgdx/blob/master/tests/gdx-tests/src/com/badlogic/gdx/tests/g3d/ShaderTest.java. But the tutorials and wiki probably give a better idea of how to do that.

@xoppa commented Oct 8, 2015

Here's the wiki on creating attributes btw: https://github.com/libgdx/libgdx/wiki/Material-and-environment.

@GoranM (Owner) commented Oct 8, 2015

They don't have any functionality (like calling gl method e.g.), they're passive.

That's my point: We can't just extend BlendingAttribute, and I can't find anything in your links that clearly points out what exactly we need to change, and where, so that the system will use glBlendFuncSeparate instead of glBlendFunc.

Looking at the DefaultShader source, it seems like we would also have to modify RenderingContext.setBlending, in addition to using a custom shader, because that's what ultimately sets the blending function in opengl ... Right?

@SolarLune (Contributor, Author) commented Oct 9, 2015

If you made relevant threads in other places, it would probably be a good idea to provide a link here (so everyone can keep up).

Ah, sure. Here it is.

To be fair, it's not that LibGDX's forums are dead; it's just that almost nobody's responded in several days, so it felt kind of like it, despite the large number of views. Fortunately, one person has responded and seems to have confirmed the issue, which is nice.

Looking at the DefaultShader source, it seems like we would also have to modify RenderingContext.setBlending, in addition to using a custom shader, because that's what ultimately sets the blending function in opengl ... Right?

That's what I was looking at, as well.

P.S. Thanks for stopping in, @xoppa.

@xoppa commented Oct 9, 2015

As you've read in the wiki of RenderContext: "Only a small subset of the GL calls is implemented, but you can extend it to add additional calls." If you'd like to have a setSeparateBlended function or something, then you can add that. Note, btw, that you don't have to use it if you don't like that. You can safely call glBlendFuncSeparate in your shader without having to modify anything.

Anyway, make sure to include your sscce. It's a lot easier to talk if you show what you've tried and where you actually got stuck, instead of guessing and assuming.

@GoranM (Owner) commented Oct 9, 2015

It's a lot easier to talk if you show what you've tried and where you actually got stuck, instead of guessing and assuming.

I didn't try anything yet, because I don't know what I actually need to do. The documentation (at least that which you referenced) doesn't enumerate which classes I'm supposed to extend, and which methods I'm supposed to override, in which parts of the system, in order to get the results I want.

That's why I'm asking you to at least clarify that for us - Or, if this is really as trivial as you're making it sound, you could just show us the code that would allow us to blend alpha separately.

That would save us a lot of time, and we would really appreciate that.

@xoppa commented Oct 9, 2015

Sorry, perhaps it's because I wrote the documentation that I don't see what you aren't seeing. Please let me know which part is not clear so I can clarify it.

Every part of the 3D api is customizable. You can create your own RenderableProvider, Shader, ShaderProvider, RenderableSorter and TextureBinder. You can also extend all the defaults: Renderable, ModelInstance, DefaultShader, DefaultShaderProvider, DefaultRenderableSorter, RenderContext and DefaultTextureBinder, as well as the base classes (e.g. BaseShaderProvider, etc.). Btw, I'm sure I missed some classes, but as said, you can find those on the wiki. Likewise, you can create custom attributes or customize the material and environment as you need.

The wiki does not enumerate which classes you should extend for your specific use-case, because the wiki is not written for your specific use-case. Instead it describes every part, so you can decide which part you want to extend.

You said that you want to call an opengl method. Calling opengl methods is typically part of performing a render call. So that would be in your shader class.

As said, if you provide a sscce, I can actually look at it. Not knowing how to solve your problem is not a reason to not show your problem to someone who wants to help you with it. If you don't want to create a sscce, then I don't think I can help you in more detail. I'm afraid that I don't have the time to create one for you. Have a look at:
http://sscce.org/
http://stackoverflow.com/help/mcve
https://github.com/libgdx/libgdx/wiki/Getting-Help#executable-example-code

Btw, it looks like something that might be interesting for others as well. So if you provide a sscce, perhaps we can add it to the tests.

@SolarLune (Contributor, Author) commented Oct 11, 2015

OK, I've been tooling around with this for a while, but have yet to get it to work for some reason; just going the basic route of extending DefaultShader to make our own is simply not working. Here's the entirety of my custom shader file (named BDXDefaultShader):

package com.nilunder.bdx.utils;

import com.badlogic.gdx.graphics.g3d.Renderable;
import com.badlogic.gdx.graphics.g3d.shaders.DefaultShader;

public class BDXDefaultShader extends DefaultShader {

    public BDXDefaultShader(Renderable renderable) {
        super(renderable);
    }

}

When I try to use this custom shader in our ShaderProvider:

private static class BDXShaderProvider extends DefaultShaderProvider {

    @Override
    public Shader getShader(Renderable renderable) {
        if (matShaders.containsKey(renderable.material.id))
            return matShaders.get(renderable.material.id).getShader(renderable);
        //return super.getShader(renderable);

        return new BDXDefaultShader(renderable);
    }
}

I get an error, which is:

Exception in thread "LWJGL Application" java.lang.NullPointerException
        at com.badlogic.gdx.graphics.g3d.shaders.BaseShader.has(BaseShader.java:268)
        at com.badlogic.gdx.graphics.g3d.shaders.DefaultShader.begin(DefaultShader.java:700)
        at com.badlogic.gdx.graphics.g3d.ModelBatch.flush(ModelBatch.java:211)
        at com.badlogic.gdx.graphics.g3d.ModelBatch.end(ModelBatch.java:224)
        at com.nilunder.bdx.Bdx.main(Bdx.java:197)
        at com.solarlune.sha.BdxApp.render(BdxApp.java:29)
        at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:214)
        at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:120)

Any ideas? I have no idea what's going wrong - it seems like the BaseShader can't set the u_time uniform property because it's set to null? I dunno, I'm not overriding or "not doing" anything, so I don't see where the issue's arising?

P.S. I personally had no idea what a "SSCCE" was until you mentioned it - I thought you were misspelling "source" at first. Please use that as an idea that shows you how much you need to break stuff down, haha.

@xoppa commented Oct 11, 2015

It looks like you're extending DefaultShaderProvider (which extends BaseShaderProvider), but aren't using its implementation. Then you might as well implement the ShaderProvider interface directly instead. Because of this, you need to initialize (and properly dispose) the shader yourself.

See the wiki for an example showing how to extend DefaultShaderProvider. See the source of BaseShaderProvider for an example on how you can properly initialize and dispose the shader if you prefer not to use its functionality.
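The lifecycle point above can be illustrated with minimal stand-in types. These are NOT the libgdx classes, just a plain-Java sketch of the same contract: a provider that bypasses BaseShaderProvider must call init() itself, otherwise the shader is used before it's been set up (which matches the NullPointerException seen above):

```java
public class ProviderSketch {

    // Stand-in for a shader; libgdx's BaseShader does its real setup in init().
    static class SketchShader {
        boolean initialized = false;

        void init() {
            initialized = true;
        }
    }

    // Returns a shader without initializing it; fails later, at render time.
    static class BadProvider {
        SketchShader getShader() {
            return new SketchShader();
        }
    }

    // Initializes the shader before handing it out, which is what
    // BaseShaderProvider would normally have done for us.
    static class GoodProvider {
        SketchShader getShader() {
            SketchShader s = new SketchShader();
            s.init();
            return s;
        }
    }

    public static void main(String[] args) {
        System.out.println(new BadProvider().getShader().initialized);  // false
        System.out.println(new GoodProvider().getShader().initialized); // true
    }
}
```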

@SolarLune (Contributor, Author) commented Oct 12, 2015

OK, I think I got it. Thanks for your help, @xoppa! Now the issue is that this works fine, but only for the "standard" alpha-enabled blend mode. Additive blending doesn't work, for example, against the background scene, though it works fine for the current scene. The blended object just shows up as an unblended, transparency-enabled object against the background.

Here's what it looks like for an additive blending object to show up with full alpha transparency:

[screenshot: transparencyproblem]

As you can see, the portion of the sprite against the background scene (with the white stripes) shows up as just opaque, while the portion visible against the current scene shows up as additive blended.

In the meantime, I'll submit a PR.

@rafcolson (Contributor) commented Oct 12, 2015

Would this mean that the Profiler's background will lose transparency?

@SolarLune (Contributor, Author) commented Oct 12, 2015

No, it's fine. The profiler's background still has transparency.

GoranM added a commit that referenced this issue Oct 17, 2015
Fixing blending to be transparent across scenes. (Bug #438)
@SolarLune (Contributor, Author) commented Oct 18, 2015

Well, this was merged, and the core problem is solved.

I think the complete resolution would be to create our own BlendingAttribute and use it in the shader to blend, apart from the built-in one (since the built-in one makes the shader use glBlendFunc). However, I believe this would require us to manually sort blended / transparent objects, so we'll stop here for now, I guess.

Thanks for the help, @xoppa.

I'll close this now.
