
How can I render a 3d model "shadelessly" (with no lighting)? #84

Closed
angshine opened this issue Feb 23, 2020 · 6 comments
Labels: question (Further information is requested)

@angshine

❓ How can I render a 3d model "shadelessly" (with no lighting)?

Is there a way to project a 3d model to a 2d image whose pixel colors are simply interpolated from the original colors assigned per vertex / per face in the 3d model? In other words, can I render the 3d model without any lighting?

@nikhilaravi
Contributor

nikhilaravi commented Feb 23, 2020

@angshine Sure this is possible. You can create a custom shader to do this (also see shader.py for examples).

A shader can be composed in different ways and you can choose the steps involved e.g. texturing, lighting, blending. For your use case - it seems you only need to interpolate the vertex colors.

Do you need only the forward pass for rendering? Or do you plan to do a backward pass as well? This will affect the choice of blending function.

If you only need the forward pass, then for each pixel you can assign the color of the closest face, e.g. using the hard_rgb_blend function. If you want to propagate gradients from the pixels back to the mesh properties, you will probably need the softmax_rgb_blend function instead: it creates a soft blend over the top K closest faces per pixel (refer to the PyTorch3D docs for more information).
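To make the hard-vs-soft distinction concrete, here is a toy NumPy sketch for a single pixel covered by K = 3 faces. This is not the PyTorch3D implementation (softmax_rgb_blend also uses signed screen-space distances and background blending); sigma is a made-up temperature parameter for illustration only:

```python
import numpy as np

# One pixel covered by K = 3 faces, each with an RGB color and a depth (toy data).
colors = np.array([[1.0, 0.0, 0.0],   # closest face: red
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
zbuf = np.array([0.5, 1.0, 2.0])      # index 0 is the nearest face

# Hard blend: the pixel simply takes the color of the closest face.
hard = colors[np.argmin(zbuf)]

# Soft blend: a softmax over (negative) depth gives every face some weight,
# so gradients can flow to all K faces rather than only the winner.
sigma = 0.5                            # temperature; smaller -> closer to hard blend
w = np.exp(-zbuf / sigma)
w = w / w.sum()
soft = (w[:, None] * colors).sum(axis=0)
```

With a hard blend the pixel is purely red; with the soft blend it is dominated by red but retains a differentiable contribution from the two occluded faces.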

Here is a simple Shader which only does vertex color interpolation with hard blending.

import torch
import torch.nn as nn
from pytorch3d.renderer.blending import hard_rgb_blend
# import path in early PyTorch3D versions (this helper was later removed):
from pytorch3d.renderer.mesh.texturing import interpolate_vertex_colors

class SimpleShader(nn.Module):
    def __init__(self, device="cpu"):
        super().__init__()

    def forward(self, fragments, meshes, **kwargs) -> torch.Tensor:
        pixel_colors = interpolate_vertex_colors(fragments, meshes)
        images = hard_rgb_blend(pixel_colors, fragments)
        return images  # (N, H, W, 4) RGBA image

To get started with creating a renderer try the tutorials on rendering a textured mesh.

@nikhilaravi nikhilaravi self-assigned this Feb 23, 2020
@nikhilaravi nikhilaravi added the question Further information is requested label Feb 24, 2020
@angshine
Author

@nikhilaravi Thanks for the detailed explanation!
If the mesh is loaded using pytorch3d.io.load_objs_as_meshes(), mesh.textures is initialized with a texture map rather than verts_rgb. In that case I implemented the shader as follows (replacing interpolate_vertex_colors with interpolate_texture_map):

import torch
import torch.nn as nn
from pytorch3d.renderer.blending import hard_rgb_blend
# import path in early PyTorch3D versions (this helper was later removed):
from pytorch3d.renderer.mesh.texturing import interpolate_texture_map

class SimpleShader(nn.Module):
    def __init__(self, device="cpu"):
        super().__init__()

    def forward(self, fragments, meshes, **kwargs) -> torch.Tensor:
        pixel_colors = interpolate_texture_map(fragments, meshes)
        images = hard_rgb_blend(pixel_colors, fragments)
        return images  # (N, H, W, 4) RGBA image

So far this shader works well, but I wonder whether these two implementations lead to the same rendering result?

@nikhilaravi
Contributor

@angshine that looks correct for use with texture maps instead of vertex rgb colors.
What do you mean by the 'same rendering result'?

@nikhilaravi
Contributor

@angshine do you need more help with this issue or can it be closed?

@nikhilaravi
Contributor

@angshine I am closing this issue. If you have more questions feel free to reopen it.

@GentleDell

GentleDell commented Dec 28, 2020

It seems that interpolate_vertex_colors() and interpolate_texture_map() have been removed in newer versions. I tested the following piece of code and it works for me.

import torch
import torch.nn as nn
from pytorch3d.renderer.blending import hard_rgb_blend, BlendParams

class SimpleShader(nn.Module):
    def __init__(self, device="cpu", blend_params=None):
        super().__init__()
        self.blend_params = blend_params if blend_params is not None else BlendParams()

    def forward(self, fragments, meshes, **kwargs) -> torch.Tensor:
        blend_params = kwargs.get("blend_params", self.blend_params)
        texels = meshes.sample_textures(fragments)
        images = hard_rgb_blend(texels, fragments, blend_params)
        return images  # (N, H, W, 4) RGBA image

3 participants