What's the correct way to transfer the lighting results of Pass N to Pass N+1?
From looking at the sample, it seems like the results of each pass are converted to a plain old texture, and then the next pass uses that texture as emissive data. But that seems wrong. It seems like you wouldn't want to convert the HDR lightmap info to an actual texture until the very end of all your passes/bounces. You'd want to keep the original floating-point, 32-bits-per-channel array you pass to lmSetTargetLightmap() around for use in the subsequent pass, rather than rely on a clamped/exposed/processed 8-bits-per-channel texture.
So am I just doing it wrong?