Scene: Respect original material settings in .overrideMaterial #14577

Open
gkjohnson opened this issue Jul 29, 2018 · 6 comments

Comments

@gkjohnson
Collaborator

gkjohnson commented Jul 29, 2018

When setting the override material for a scene, the maps and other settings on the original material are not used -- this applies to animations, mesh instances, and other uniforms and defines as well. This jsfiddle shows that the normal map is not applied when rendering the scene with a MeshNormalMaterial applied:

http://jsfiddle.net/wbhrd58c/

[screenshot: scene rendered with scene.overrideMaterial set to a MeshNormalMaterial -- the normal map is not applied]

This makes it difficult to render things like normal and depth buffers for screen effects. So when an overrideMaterial is used, the defines and uniforms of the original material should be used (if they exist on the override material). This would let an override material use the color, displacementMap, normalMap, textures, skinning settings, etc. available on the original.

This mechanic could be used to make post-processing passes more robust and correct, afford a depth prepass with correct vertex transformations, enable a deferred renderer that uses the original material properties without having to manually copy everything, and allow shadows that support displacement maps out of the box.

This is closer to how Unity's Shader Replacement provides this functionality:

> the camera renders the scene as it normally would. the objects still use their materials, but the actual shader that ends up being used is changed

In Unity, if the object's material doesn't define a specific uniform then the default shader uniform value is used. To allow for backwards compatibility, this could be included as an opt-in feature:

Scene.setOverrideMaterial( material : Material, overrideUniforms : Boolean );
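A minimal sketch of how the opt-in could be used, assuming the proposed setOverrideMaterial API existed (it does not today -- the method name and flag are just the suggestion above):

```js
// Hypothetical usage of the proposed opt-in API.
const normalOverride = new THREE.MeshNormalMaterial();

// Passing `true` would mean uniforms / defines present on each object's
// original material (normalMap, displacementMap, skinning, ...) are
// applied to the override before drawing.
scene.setOverrideMaterial( normalOverride, true );
renderer.render( scene, camera );

// Restore regular rendering.
scene.setOverrideMaterial( null );
```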
@WestLangley
Collaborator

WestLangley commented Jul 30, 2018

Right. scene.overrideMaterial overrides the material. What it doesn't do is honor mesh.customDepthMaterial, skinning, or morphs.

If you are requesting a feature enhancement, then this post is similar to:

#8676
#13858

@gkjohnson
Collaborator Author

gkjohnson commented Jul 30, 2018

Yeah, this is an enhancement -- I don't think anything unexpected is happening here.

Those do seem similar, and I think the right change here would address #8676, but #13858 would require some changes beyond what I'm imagining, depending on what the customDepthMaterial is doing. I can see how customDepthMaterial was created to get around this problem, though.

Ideally the override material would take on the uniform properties defined on the original material, including animation, map, and other arrays and values if they exist on the override material. Unity's replacement shaders behave like this.
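As a rough illustration of that behavior (a hand-written sketch, not an existing three.js API), the renderer could copy any property defined on both materials before drawing each object:

```js
// Sketch: copy settings the override material also understands, so it
// picks up maps, color, etc. from the original. Property names are
// illustrative; a real implementation would also need to handle
// ShaderMaterial uniforms and defines.
function applyOriginalSettings( overrideMaterial, originalMaterial ) {

	for ( const key of [ 'map', 'normalMap', 'displacementMap', 'color' ] ) {

		if ( key in overrideMaterial && key in originalMaterial && originalMaterial[ key ] !== undefined ) {

			overrideMaterial[ key ] = originalMaterial[ key ];

		}

	}

	overrideMaterial.needsUpdate = true;

}
```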

@Mugen87 changed the title from "Scene.OverrideMaterial does not respect original material settings" to "Scene: Respect original material settings in .overrideMaterial" on Jul 31, 2018
@Oletus
Contributor

Oletus commented May 19, 2020

I'd like to reopen discussion on this issue -- after running into it in a current project and making an ugly partial workaround, it would be nice to see a proper solution.

What I'd propose is allowing something like an "override shader key" string to be set on the renderer, and then each material could have shader code or maybe defines corresponding to different keys. The renderer would choose the shader to use for each material based on the current override shader key. This kind of system would keep the open extensibility of the current system, instead of defining a fixed set of shaders like depth/normal.
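A very rough sketch of what that could look like (purely hypothetical -- neither shaderVariants nor overrideShaderKey exist in three.js; the names just illustrate the idea):

```js
// Hypothetical "override shader key" system.
const material = new THREE.MeshStandardMaterial();

// Each material could carry shader variants keyed by a string.
material.shaderVariants = {
	normal: { fragmentShader: '/* GLSL that outputs view-space normals */' },
	depth: { defines: { DEPTH_PACKING: THREE.RGBADepthPacking } },
};

// The renderer would pick the variant matching its current key and fall
// back to the material's regular shader when no variant is defined.
renderer.overrideShaderKey = 'normal';
renderer.render( scene, camera );

renderer.overrideShaderKey = null; // back to regular rendering
```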

One interesting direction for further development would be allowing rendering into multiple render targets simultaneously, so that something like a deferred renderer could be implemented with better performance. This doesn't need to be a priority at first, but something to consider so that we don't make a solution that's completely incompatible with the idea. The override shader key system could support this so that materials could also have shader variants able to output to multiple buffers.

Thoughts?

@gkjohnson
Collaborator Author

I've put together a utility to allow for replacing shaders in a scene more easily. You can see the code here and a demo here. My use case just involved using a different shader while retaining the same uniforms, so you could render raw values of some of the models (roughness, normals, etc.). It allows for writing a callback so you can use different shaders for different materials and copy the relevant uniforms -- i.e. use a different material for a mesh vs. an instanced line geometry. By default all uniforms are copied.

Basically you call replace to replace all materials in the scene, which caches the original materials, you render, and then you call reset to restore the scene to the original materials. You should be able to extend it (or something like it) to achieve your key-based shader replacement @Oletus.
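For reference, a condensed sketch of that replace / reset pattern (the linked utility is more complete -- this only shows the mechanism, with uniform copying omitted):

```js
// Swap every mesh material for an override, remembering the originals.
const originalMaterials = new Map();

function replace( scene, createOverride ) {

	scene.traverse( object => {

		if ( object.isMesh ) {

			originalMaterials.set( object, object.material );
			object.material = createOverride( object.material );

		}

	} );

}

// Restore the cached materials.
function reset() {

	originalMaterials.forEach( ( material, object ) => {

		object.material = material;

	} );
	originalMaterials.clear();

}

// Usage: render a normal buffer, then restore the scene.
replace( scene, () => new THREE.MeshNormalMaterial() );
renderer.render( scene, camera );
reset();
```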

I did run into a couple of frustrations / roadblocks when working on this, though. Basically the built-in materials' (Standard, Basic, Phong, etc.) lack of exposed uniforms, fragment shader code, vertex shader code, and defines makes it difficult or impossible to render the same material with a couple of changes (same vertex shader but a different fragment shader, for example, or simply copying all uniforms to a new custom material). It would be nice to be able to access these fields the same way we can for ShaderMaterial -- it's not really clear to me if there's a reason why we can't. Maybe that should be a new issue.

When / if we ever get a "depth pass" or "normal pass" solution, I would hope it's written in such a way that it can be used generally for these types of use cases. It's really important for a lot of screen effects and for deferred rendering to be implemented properly.

@Oletus
Contributor

Oletus commented May 21, 2020

@gkjohnson The solution you linked seems to require extra traversals of the scene. I'd imagine that kind of solution is quite costly for performance if we consider the use case of rendering depth/normal buffers as part of a post-processing or deferred rendering pipeline on every frame. I think whatever we come up with here should be written with performance in mind first and foremost. Did you measure what kind of CPU performance impact your solution has compared to the current method of rendering a normal buffer (without the effect of normal maps)?

@gkjohnson
Collaborator Author

> The solution you linked seems to require extra traversals of the scene. I'd imagine that kind of a solution is quite costly for performance, if we consider the use case of rendering depth/normal buffers as a part of post-processing or a deferred rendering pipeline on every frame.

That was a rough overview of how it works. There are a couple of things you can do to speed it up -- such as only caching during the first replacement, rendering all subsequent passes, and then resetting afterward:

- replace material with normals material and cache original
- render scene

- replace material with roughness material and don't cache
- render scene

- reset to original cached materials

Keeping a shallow array of all meshes with materials that you want to replace (even if that means all of them) can help speed things up because you can avoid recursing and iterating over non-renderable nodes. You can generate that array once at the beginning of every frame or update it when you add or remove something from the scene. I use a hacked-in version of #16934 to track when items are added and removed more easily for this purpose.

You can disable scene.autoUpdate on subsequent renders, too, to avoid matrix updates.
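Putting those pieces together, the per-frame flow could look roughly like this (a sketch assuming a variant of the replace / reset helpers above that takes a flat meshList and a cache flag, plus normal / roughness override materials and render targets set up elsewhere):

```js
// Cache originals only on the first replacement of the frame.
replace( meshList, normalMaterial, { cache: true } );
renderer.setRenderTarget( normalTarget );
renderer.render( scene, camera );

// Matrices are already up to date after the first render.
scene.autoUpdate = false;

// Subsequent passes swap materials without re-caching.
replace( meshList, roughnessMaterial, { cache: false } );
renderer.setRenderTarget( roughnessTarget );
renderer.render( scene, camera );

// Restore the original materials and settings for the final pass.
reset( meshList );
scene.autoUpdate = true;
renderer.setRenderTarget( null );
renderer.render( scene, camera );
```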

> Did you measure what kind of CPU performance impact does your solution have compared to the current method of rendering a normal buffer (without the effect of normal maps)?

I haven't gone that deep into it, no. It's not the ideal solution, but until there's something built in, this is what I've looked into. The post-processing effects in the examples folder (and shadow map passes!) could really benefit from proper passes like this, though.

Generally I would like to see tools and features added to three.js that let us build our own flexibility around the library, such as exposed shaders, uniforms, and defines for the built-in materials, and #16934. In this case there are also benefits within the library itself from including this functionality.
