The hdr-compute-peak changes the brightness unnaturally. #5960
The problem in your clip is the "SAMSUNG" logo at the top right. It's stupidly bright, way brighter than the rest of the clip. Clearly some mastering engineer thought it a good idea to blast the user with super-highlights in the hope that it would somehow register as readable text. The moment the SAMSUNG logo fades in, everything else completely pales in comparison - which is why the picture gets so dark, as a result of the simulated eye adaptation. And since the fade is so quick and follows an unusual curve, it also triggers a scene-change detection somewhere in the middle. Normally, small localized changes like this shouldn't trigger a scene change, but the brightness difference in this case is so dramatic that it drowns out the rest of the scene, which is enough to hit the threshold value.
In summary, I don't think this is going to be a problem for typical media; but if you have suggestions for how to improve the algorithm, I'm all ears. Maybe we should prioritize regions of the screen based on their proximity to the center?
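The center-weighting idea could be sketched roughly like this (an illustrative Python sketch, not mpv's actual shader code; the Gaussian width `sigma` is an arbitrary choice made up for this example):

```python
import math

def center_weighted_peak(luma, sigma=0.35):
    """Sketch: peak detection that down-weights samples far from the
    frame center with a Gaussian falloff, so an isolated ultra-bright
    logo in a corner no longer dominates the measured scene peak.
    `luma` is a list of rows of per-pixel luminance values."""
    h, w = len(luma), len(luma[0])
    peak = 0.0
    for y, row in enumerate(luma):
        for x, v in enumerate(row):
            # Normalized distance from the frame center in each axis.
            dx = x / w - 0.5
            dy = y / h - 0.5
            weight = math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma))
            peak = max(peak, v * weight)
    return peak
```

With a uniform frame this returns the true peak; a single very bright corner pixel is discounted by its distance from the center rather than defining the peak outright.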
I have the same problem with hdr-compute-peak. I get blown-out brightness in just about all of my HDR movies, most notably in Interstellar. In the scenes in the ice caps and snowy plains you can see it, but at 2:34:10 the whole picture is blown out. Below are screen caps with it enabled and disabled. So now I keep it disabled. I'm attaching the log with hdr-compute-peak enabled.
@Doofussy2 Might be a problem with the translation of compute shaders to D3D11. Can you test with OpenGL or Vulkan?
I have the same problem with D3D11, while OpenGL and Vulkan work fine.
I can test later, but the screenshots are from using nvdec-copy.
I was talking about gpu-api=d3d11; hwdec is irrelevant for me.
Yeah, I believe that's what haasn was talking about; I just wanted to make sure that nvdec didn't affect it. But I think it's not part of it. Either way, I will test this as soon as I'm able.
I can confirm that with OpenGL or Vulkan, the picture is correct.
cc @rossy
It also happens in this scene with Vulkan and OpenGL, without the logo.
@Doofussy2 @sat4all1 This seems like it could be a SPIRV-Cross regression. Can you see if it works in this test build (from #5804)? That has an older build of SPIRV-Cross, which fixes the issue for me. I'll try to find which change actually broke this.
That works just fine, @rossy
This might require a new issue report, but as it's so closely related I thought I'd mention it. I'm comparing HDR playback between mpv and my MiBox3 (which signals the display to use the HDR metadata). To keep as much consistency as possible, I tested using the same HDMI input (which then uses all the same display settings) and the same media. There is a noticeable difference in luminance: with the MiBox, the picture is much more vivid and bright. So I experimented with vf=format=sig-peak. That does make a notable improvement... but also causes problems. First, any value higher than 0.0 seems to have the same effect; I went as low as 0.0001 and as high as 0.9, and the difference was imperceptible to me. Second, even at 0.0001, the same scene I posted from Interstellar is blown out. But the picture is improved in most other scenes (I tried several movies), both with Vulkan and OpenGL, and with hdr-compute-peak off and on. The result is the same.
@Doofussy2 It sounds like you want
Thanks @haasn, but sadly neither of those makes a difference.
This is what I tried: gpu-api=opengl vf=format=sig-peak=0.1. Once you put in the sig-peak, nothing else seems to have an effect.
Oh, I think you meant use it instead of sig-peak! Ha! Sorry, still learning (and enjoying the experience). Yes, using hdr-compute-peak does brighten the image. Thank you @haasn
FWIW, all that
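The exchange above contrasts a fixed sig-peak override with dynamic peak detection. A toy sketch of that difference (illustrative only, not mpv's actual code; the numbers are made up for the example):

```python
def frame_peak(frame):
    """Measure the frame's actual brightest sample (in multiples of
    SDR reference white), rather than trusting static metadata."""
    return max(max(row) for row in frame)

def tone_map(sample, sig_peak):
    """Toy clip-style tone map: normalize against the signal peak."""
    return min(sample / sig_peak, 1.0)

# A dim frame whose true peak (4x reference white) is far below the
# mastering metadata's nominal peak (say 10000 nits, roughly 49.3x):
frame = [[0.5, 2.0], [1.0, 4.0]]

static = tone_map(4.0, 49.3)                # metadata peak: stays dark
dynamic = tone_map(4.0, frame_peak(frame))  # measured peak: full range
```

With the static metadata value the dim frame is mapped very low; with a measured per-frame peak, the same frame uses the full output range, which is the brightening effect described above.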
Yeah, I kind of found that out (the hard way). Thanks again @haasn. I'm still hoping someone will look into the deinterlacing issue that I posted. That's a big problem for me.
@rossy thanks
@Doofussy2 @sat4all1 Well, the SPIRV-Cross issue has been fixed, so hdr-compute-peak should work in future mpv builds. Thanks for testing. Note: that issue was a bug specific to Windows and D3D11, so not the same as @laichiaheng's original problem.
thank you!
This is resolved in the latest build.
@lachs0r this problem is present in 0.29, but fixed in @shinchiro's builds.
@Doofussy2: Try the latest build.
Yup, that works, @lachs0r, well done! Thank you!
Tentatively marking this as "not a bug" since the original issue (the unnatural brightness change in that scene) is specific to the clip (and clips like it), and we have no good way of working around it. (Any such way would probably be more accurately termed a feature request.)
If there's an average involved somewhere, could a different method (e.g. median) be used? Or, basically, some other way to statistically filter out really far outliers. Or maybe a separate buffer (could be lower-res) which averages brightness over time, and above a certain threshold is used as a mask for which samples to ignore in the peak computation. That way, a bright spot that moves around isn't going to "heat up" the buffer, but a spot that stays exactly the same will.
Maybe, but a median seems rather difficult to implement (efficiently).
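One cheaper alternative to a true median (a hedged sketch, not something the thread settles on): bin the samples into a coarse histogram, which maps naturally onto compute shaders via atomic adds, and read off a high percentile, so a handful of outlier pixels can't define the scene peak. The bin count and percentile here are arbitrary illustration values:

```python
def approx_percentile_peak(samples, q=0.999, bins=256, max_nits=10000.0):
    """Estimate an outlier-robust peak from a coarse histogram.
    Returns the upper edge of the bin containing the q-th percentile,
    so a few ultra-bright pixels above it are ignored."""
    hist = [0] * bins
    for v in samples:
        idx = min(int(v / max_nits * bins), bins - 1)
        hist[idx] += 1
    cutoff = q * len(samples)
    seen = 0
    for idx, count in enumerate(hist):
        seen += count
        if seen >= cutoff:
            return (idx + 1) / bins * max_nits
    return max_nits
```

Unlike a true median, this needs only one pass plus a tiny fixed-size reduction, and it degrades gracefully: with no outliers, the result tracks the real peak's bin.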
Interesting idea. Actually, perhaps an approach like this could replace the current storage-buffer-based averaging altogether. Basically, we would be downsampling to a small storage image (say, one sample per workgroup) using additive blending. The main question would be how to update the running average, but maybe instead of performing a true running average over the past N frames we could just decay the previous buffer contents by either a constant value or a given coefficient? Might be faster even in the non-local case, too. I'll have a play with it after the holidays.
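The decayed running average described above (a constant-coefficient blend instead of a true N-frame window) might look like this in scalar form; the decay coefficient and the scene-change threshold are made-up illustration values, and the scene-change reset mirrors the detection idea discussed earlier in the thread:

```python
def update_peak(prev_avg, frame_peak, decay=0.95, scene_threshold=4.0):
    """Sketch of a decayed running average of the measured peak.
    Instead of averaging the past N frames, blend the new measurement
    into the previous value with a constant coefficient.  A large jump
    in either direction is treated as a scene change and resets the
    average so adaptation is instant rather than gradual."""
    if prev_avg is None:
        return frame_peak
    ratio = frame_peak / prev_avg
    if ratio > scene_threshold or ratio < 1.0 / scene_threshold:
        return frame_peak  # scene change: adapt instantly
    return decay * prev_avg + (1.0 - decay) * frame_peak
```

The appeal is that only one previous value per region needs to be stored, and the decay coefficient directly controls how fast the simulated eye adaptation reacts.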
mpv version and platform
CPU: i5-4460
GPU: AMD RX460
Linux kernel: 4.17.3-1-MANJARO
Reproduction steps
Play an HDR 4K 60 fps video.
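For reference, a command line along these lines should reproduce the setup (a sketch, not taken from the report: the option names exist in mpv of this era, but the exact values and the filename are assumptions; check your build's manual):

```shell
# Hypothetical reproduction command. --gpu-api and --hdr-compute-peak
# are real mpv options; the filename is a placeholder.
mpv --vo=gpu --gpu-api=d3d11 --hdr-compute-peak=yes samsung-travel-with-my-pet-hdr-demo.mkv
```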
Expected behavior
The brightness changes naturally.
Actual behavior
The brightness changes very suddenly, and the range seems too wide: sometimes the picture is vivid, sometimes it's too dim, and both happen within the same scene.
Log file
https://0x0.st/spJL.log
How it looks: https://0x0.st/spJO.mkv
Sample files
Samsung Travel With My Pet HDR UHD 4K Demo