
Unlit Materials #1017

Merged: 7 commits merged from khr-materials-unlit into ue4-main on Jan 23, 2023

Conversation

joseph-kaile (Collaborator)

(screenshots: normal1, normal2)

@cesium-concierge

Thanks for the pull request @joseph-kaile!

Reviewers, don't forget to make sure that:

static void computeFakeNormals(
    const TArray<uint32_t>& indices,
    TArray<FStaticMeshBuildVertex>& vertices) {
  // Compute flat normals
Member

Incorrect comment.


v0.TangentX = v1.TangentX = v2.TangentX = TMeshVector3(0.0f);
v0.TangentY = v1.TangentY = v2.TangentY = TMeshVector3(0.0f);
v0.TangentZ = v1.TangentZ = v2.TangentZ = TMeshVector3(0.0f, 0.0f, 1.0f);
Member

I think maybe using +Z as the normal isn't quite right. It's fine when the tileset is located near the georeference origin. But what if the origin is on one side of the globe, and the tileset is on the other? Then the tileset would only be lit when the sun is below the horizon, which is likely to surprise users.

So I think instead of +Z, the normal at each point should be set to the ellipsoid surface normal at that point, transformed to the mesh's coordinate system.

Basically you need to get each vertex position, transform it to ECEF using the transform passed into this method, and then call Ellipsoid::WGS84.geodeticSurfaceNormal to get the surface normal at that point. Then that surface normal has to be transformed from ECEF back to Unreal coordinates using the inverse of that passed-in transform (don't forget to construct the vector to transform as glm::dvec4(ecefNormal, 0.0) before multiplying it with the inverse of the transform, because it's a direction not a position).
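For concreteness, a minimal sketch of that approach, assuming `transform` maps mesh (model) coordinates to ECEF and reusing the plugin's `TMeshVector3` alias; the function name and glue code here are illustrative, not necessarily what ends up in this PR:

```cpp
#include <CesiumGeospatial/Ellipsoid.h>
#include <glm/glm.hpp>

// Sketch: set each vertex normal to the WGS84 ellipsoid surface normal at
// that vertex, expressed in the mesh's own coordinate system.
static void computeEllipsoidNormals(
    const glm::dmat4& transform, // mesh coordinates -> ECEF
    TArray<FStaticMeshBuildVertex>& vertices) {
  const glm::dmat4 inverseTransform = glm::inverse(transform);

  for (FStaticMeshBuildVertex& vertex : vertices) {
    // Vertex position, transformed from mesh coordinates to ECEF.
    const glm::dvec3 position(
        vertex.Position.X,
        vertex.Position.Y,
        vertex.Position.Z);
    const glm::dvec3 ecefPosition =
        glm::dvec3(transform * glm::dvec4(position, 1.0));

    // Ellipsoid surface normal at that ECEF position.
    const glm::dvec3 ecefNormal =
        CesiumGeospatial::Ellipsoid::WGS84.geodeticSurfaceNormal(ecefPosition);

    // Back to mesh coordinates; w = 0.0 because it's a direction, not a position.
    const glm::dvec3 normal = glm::normalize(
        glm::dvec3(inverseTransform * glm::dvec4(ecefNormal, 0.0)));

    vertex.TangentX = TMeshVector3(0.0f);
    vertex.TangentY = TMeshVector3(0.0f);
    vertex.TangentZ = TMeshVector3(
        static_cast<float>(normal.x),
        static_cast<float>(normal.y),
        static_cast<float>(normal.z));
  }
}
```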

Collaborator Author

OK, I believe I implemented the above in code. I'm not sure how to visualize the normals, but the code is straightforward enough. If I invert the normals, the buildings get dark. Also, I disabled the dynamic shadows:
(screenshot: result)

CHANGES.md Outdated

##### Fixes :wrench:

- Disable normals for unlit materials, which caused photogrammetry to appear too dark.
Member

Please mention the extension name here.

@nithinp7 (Contributor) commented Jan 2, 2023

@joseph-kaile Sorry if I'm missing context, but a couple of comments:

  • Smooth normals fix the weird, triangle-faceted lighting issues in photogrammetry, and they seem like a more elegant solution than fake normals, right?
    • Technically that is against the glTF spec - the spec dictates that clients should generate flat normals when normals are missing - but IMO smooth normals look better than flat normals for photogrammetry.
    • The only time flat normals really need to be generated is when tangents are needed but missing, e.g. to apply a normal map. When tangents are missing, normal maps are interpreted relative to MikkTSpace tangents, which requires flat normals. All that said though, normal maps are never used for unlit photogrammetry (currently).
    • So for example, we could potentially just generate smooth normals if the unlit extension is present, if this is the route to go down (see the sketch after this list).
  • But alternatively, why don't we just use an actually unlit material shading model instead of faking normals when the unlit extension is present? I don't necessarily love this option, as it would look even more unrealistic - but it seems to be how photogrammetry was intended to be rendered. Plus that's what we do in Cesium for Unity and CesiumJS.
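If smooth normal generation were the route taken, a minimal sketch might look like the following, assuming the same `indices`/`vertices` layout as `computeFakeNormals` above (illustrative only, not code from this PR):

```cpp
// Sketch: accumulate unnormalized face normals per vertex, then normalize.
// The unnormalized cross product weights each face by its area; a consistent
// triangle winding order across the mesh is assumed.
static void computeSmoothNormals(
    const TArray<uint32_t>& indices,
    TArray<FStaticMeshBuildVertex>& vertices) {
  TArray<TMeshVector3> accumulated;
  accumulated.Init(TMeshVector3(0.0f), vertices.Num());

  for (int32 i = 0; i + 2 < indices.Num(); i += 3) {
    const FStaticMeshBuildVertex& v0 = vertices[indices[i]];
    const FStaticMeshBuildVertex& v1 = vertices[indices[i + 1]];
    const FStaticMeshBuildVertex& v2 = vertices[indices[i + 2]];

    const TMeshVector3 faceNormal = TMeshVector3::CrossProduct(
        v1.Position - v0.Position,
        v2.Position - v0.Position);

    accumulated[indices[i]] += faceNormal;
    accumulated[indices[i + 1]] += faceNormal;
    accumulated[indices[i + 2]] += faceNormal;
  }

  for (int32 i = 0; i < vertices.Num(); ++i) {
    vertices[i].TangentX = TMeshVector3(0.0f);
    vertices[i].TangentY = TMeshVector3(0.0f);
    vertices[i].TangentZ = accumulated[i].GetSafeNormal();
  }
}
```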

computeFlatNormals(indices, StaticMeshBuildVertices);
if (material.hasExtension<ExtensionKhrMaterialsUnlit>()) {
  glm::dvec3 ecefCenter = glm::dvec3(
      transform * CesiumTransforms::unrealToOrFromCesium *
Member

I don't think the unrealToOrFromCesium terms here and below are needed. I tested by using Buffer Visualization -> World Normal and comparing the color of Melbourne with Cesium World Terrain. Cesium World Terrain will have some undulations in its normals, while Melbourne will be a solid color, but the overall color of them should be the same because the normals should point basically straight up in both cases. With the unrealToOrFromCesium in place, they were quite different, but once I removed it they were the same. This makes sense because there's no need to use the Unreal coordinate system at all. The origin is expressed in model coordinates, then we transform to ECEF, compute the normal, and transform the ECEF normal back to model coordinates.
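To illustrate the path described above (a sketch with placeholder names, not the PR's exact code), assuming `transform` maps model coordinates directly to ECEF:

```cpp
// Sketch: no unrealToOrFromCesium factor anywhere in the chain.
const glm::dvec3 ecefCenter =
    glm::dvec3(transform * glm::dvec4(0.0, 0.0, 0.0, 1.0)); // model origin -> ECEF
const glm::dvec3 ecefNormal =
    CesiumGeospatial::Ellipsoid::WGS84.geodeticSurfaceNormal(ecefCenter);
const glm::dvec3 modelNormal = glm::normalize(
    glm::dvec3(glm::inverse(transform) * glm::dvec4(ecefNormal, 0.0))); // ECEF -> model
```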

There seems to be something weird with normals in Cesium for Unreal in general, though. Everything is good near the georeference origin, but on the back side of the globe something is wonky. For example, in this screenshot, both CWT and Melbourne are dark even though the sun is high in the sky:
(screenshot)
This doesn't necessarily need to be fixed as part of this PR, but maybe you have an idea what's wrong? I'm at a bit of a loss at the moment.

Collaborator Author

That makes sense; I assumed there was code I couldn't find that automatically transformed the coordinates in the position view accessor.

I did see something like that. I think when the georeference was far enough away, everything was sky blue. I just assumed it was a floating-point precision issue: because it's so far from the origin, things get messed up.

Member

Could be precision related, but in theory the directional light should just be a direction, so the large coordinate values shouldn't matter too much. But who knows. I'll write an issue.

Member

Oh! I think it's just this issue again:
#527
And there's a workaround listed in there.

@kring (Member) commented Jan 23, 2023

Thanks @joseph-kaile!

@kring kring merged commit 27adc4c into ue4-main Jan 23, 2023
@kring kring deleted the khr-materials-unlit branch January 23, 2023 00:40
@kring kring mentioned this pull request Jun 28, 2023