HDR and HDRJPG environment maps produce different lighting #28071
Conversation
Good. The rotation obfuscates rendering quality, artifacts, and errors. I'd make the change permanent. And you will no longer need a render loop.
True, but a rotation, or animation in general, makes the examples visually more appealing and engaging. Maybe a checkbox that allows disabling the rotation could be a compromise?
Removing the render loop is annoying... It makes the example more fragile...
@elalish You were right! So I don't think the converter is the problem. My next suspect is PMREM...
You can open both images in different tabs to compare.
Ooh, interesting! I have a thought - my PMREM uses lat-long blurs where the axis changes each time to hide artifacts. When the texture is larger, it'll go through more axis shifts, but critically, they might be in a different order per roughness level. That might be a simple fix. I need to take a look.
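The axis-ordering concern can be illustrated with a toy model (this is not the actual `PMREMGenerator` logic, just a hypothetical sketch): if the blur axis simply alternated at every halving step, then whether two differently sized sources use the same axis once they reach a shared resolution would depend on the parity of the extra steps the larger source has taken.

```javascript
// Toy model (NOT the real three.js PMREM code): assume the blur axis
// alternates between latitudinal and longitudinal at each halving step,
// and record which axis is used at each intermediate resolution.
function axisSchedule(startSize, minSize = 16) {
  const schedule = new Map();
  let size = startSize;
  let step = 0;
  while (size >= minSize) {
    schedule.set(size, step % 2 === 0 ? 'lat' : 'long');
    size = size / 2;
    step++;
  }
  return schedule;
}

const from4k = axisSchedule(4096);
const from2k = axisSchedule(2048);
const from1k = axisSchedule(1024);

// 4096 -> 1024 is two halvings (even offset), so in this toy model the
// axes happen to agree once both reach 1024...
console.log(from4k.get(1024), from1k.get(1024)); // 'lat' 'lat'
// ...but 4096 -> 2048 is one halving (odd offset), so they disagree there.
console.log(from4k.get(2048), from2k.get(2048)); // 'long' 'lat'
```

In other words, in a naive alternating scheme the blur sequences of two sources only line up when the size ratio is an even power of two; any more involved axis schedule can break the correspondence entirely.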
Hmm, okay, that helps a little, but it doesn't fix the overall brightness difference. With the above change, we know that the mip-level blurs of the PMREM are identical for every layer once we get down to where the smaller environment starts. Therefore the differences can only be in the first two PMREM blurs (going from 4k to 1k) or in whatever external tool converted the original 4k into a 1k equirect.

Assuming the external reduction is done by simple mipmap-style pixel averaging, that can introduce error itself, since equirect pixels don't actually represent a constant area on the sphere. Imagine an HDR image with a single bright pixel and all others black. If 16 pixels are averaged together to form one pixel in a smaller image, that result pixel has the same value regardless of which of the 16 was bright. However, if the bright pixel was closer to a pole, it represents less light, because it is physically smaller.

Of course my weird PMREM lat/long blur might well be doing something similar or even worse. Still, it's interesting that basic image reduction may introduce error as well. Thoughts @WestLangley?
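The single-bright-pixel argument can be checked numerically. Below is a standalone sketch (not three.js code; the 4×4 block, the image height, and the pixel value 100 are made-up illustration values): a row of an equirect image sits at a fixed latitude, and a pixel's solid angle is proportional to the cosine of that latitude, so a solid-angle-weighted average depends on where the bright pixel sits while a naive average does not.

```javascript
// Sketch: naive vs solid-angle-weighted averaging of a 4x4 block of
// equirectangular pixels, where exactly one pixel has value 100.
const height = 16; // hypothetical full image height in pixels

// Latitude (radians, measured from the equator) of pixel row `y`.
function latitude(y) {
  return ((y + 0.5) / height - 0.5) * Math.PI;
}

// Naive mipmap-style average: same result wherever the bright pixel is.
function naiveAverage() {
  return (100 + 15 * 0) / 16; // always 6.25
}

// Weighted average of a 4x4 block starting at row y0, with the bright
// pixel in row (y0 + brightRowOffset). Weight per row ~ cos(latitude),
// i.e. proportional to the solid angle the row's pixels cover.
function weightedAverage(y0, brightRowOffset) {
  let sum = 0;
  let weightSum = 0;
  for (let dy = 0; dy < 4; dy++) {
    const w = Math.cos(latitude(y0 + dy));
    for (let dx = 0; dx < 4; dx++) {
      const value = dy === brightRowOffset && dx === 0 ? 100 : 0;
      sum += value * w;
      weightSum += w;
    }
  }
  return sum / weightSum;
}

const nearEquator = weightedAverage(6, 0); // bright pixel near the equator
const nearPole = weightedAverage(0, 0);    // bright pixel near a pole

console.log(naiveAverage()); // 6.25 in both cases
// The weighted result is close to 6.25 near the equator but far smaller
// near the pole, where the bright pixel covers much less of the sphere.
console.log(nearEquator, nearPole);
```

So a plain box-filter reduction overweights energy near the poles relative to a solid-angle-aware one, which is consistent with the two environments producing different overall brightness.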
For testing purposes...
https://rawcdn.githack.com/mrdoob/three.js/1d0fbff6c3425d37eee16b874b2eea63fa853e64/examples/webgl_loader_texture_hdrjpg.html