
The hdr-compute-peak changes the brightness unnaturally. #5960

Closed
laichiaheng opened this issue Jul 2, 2018 · 31 comments

Comments

@laichiaheng

laichiaheng commented Jul 2, 2018

CPU: i5-4460
GPU: AMD RX460
Linux kernel: 4.17.3-1-MANJARO

mpv version and platform

mpv 0.28.0-624-g861c10268d Copyright © 2000-2018 mpv/MPlayer/mplayer2 projects
 built on UNKNOWN
ffmpeg library versions:
   libavutil       56.18.102
   libavcodec      58.20.104
   libavformat     58.17.101
   libswscale      5.2.100
   libavfilter     7.25.100
   libswresample   3.2.100
ffmpeg version: N-91369-g4ac88ba548

Reproduction steps

Play an HDR 4K 60 fps video.

Expected behavior

The brightness changes naturally.

Actual behavior

The brightness changes very suddenly, and the range seems too wide: sometimes the picture is vivid, sometimes it's too dim, and both happen within the same scene.

Log file

https://0x0.st/spJL.log

What it looks like: https://0x0.st/spJO.mkv

Sample files

Samsung Travel With My Pet HDR UHD 4K Demo

@laichiaheng laichiaheng changed the title The hdr-compute-peak makes the video suddenly darker and suddenly brighter. The hdr-compute-peak makes changes the brightness unaturally. Jul 2, 2018
@laichiaheng laichiaheng changed the title The hdr-compute-peak makes changes the brightness unaturally. The hdr-compute-peak changes the brightness unaturally. Jul 2, 2018
@haasn
Member

haasn commented Jul 2, 2018

The problem in your clip is the "SAMSUNG" logo at the top right. It's stupidly bright, way brighter than the rest of the clip. Clearly some mastering engineer thought it a good idea to blast the user with super-highlights in the hope that it would somehow be registered as readable text.

The moment the SAMSUNG logo fades in, everything else just completely pales in comparison - which is why it gets so dark, as a result of the simulated eye adaptation. And since the fading is so quick and follows an unusual curve, it also triggers a scene change detection somewhere in the middle. Normally, small localized changes like this shouldn't trigger a scene change, but since the brightness difference in this case is so dramatic that it drowns out the rest of the scene, it's enough to hit the threshold value...
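The adaptation-plus-scene-cut behaviour described here can be sketched roughly as follows (hypothetical Python, not mpv's actual compute-shader code; the threshold constant is an assumption):

```python
# Hypothetical sketch of peak detection with simulated eye adaptation: the
# detected peak is smoothed over a window of recent frames, but a large jump
# in the frame's average brightness is treated as a scene change and resets
# the smoothing -- which is what the fast logo fade-in triggers here.

SCENE_THRESHOLD = 0.2  # assumed threshold on the change in average brightness

class PeakDetector:
    def __init__(self):
        self.avg_history = []   # recent per-frame average brightness
        self.peak_history = []  # recent per-frame peak brightness

    def update(self, frame_avg, frame_peak):
        if self.avg_history and abs(frame_avg - self.avg_history[-1]) > SCENE_THRESHOLD:
            # Sudden global change: assume a scene cut and drop the history,
            # so the tone-mapping peak snaps to the new scene immediately.
            self.avg_history.clear()
            self.peak_history.clear()
        self.avg_history.append(frame_avg)
        self.peak_history.append(frame_peak)
        # Smoothed peak used for tone mapping: mean over the recent window.
        return sum(self.peak_history) / len(self.peak_history)
```

A very bright localized element (like the logo) inflates `frame_avg` enough to cross the threshold, even though most of the scene is unchanged.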

@haasn
Member

haasn commented Jul 2, 2018

In summary, I don't think this is going to be a problem for typical media; but if you have suggestions of how to improve the algorithm, I'd be all ears. Maybe we should prioritize regions of the screen based on their proximity to the center?
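One way the center-proximity idea might look (a hypothetical sketch, not anything mpv implements; the weighting falloff is an assumption):

```python
# Weight each pixel's contribution to the frame statistics by its distance
# from the frame center, so a bright logo in a corner influences the detected
# brightness far less than bright content in the middle of the picture.
import math

def center_weighted_average(pixels, width, height):
    """pixels: flat list of brightness values, row-major."""
    total = 0.0
    weight_sum = 0.0
    cx, cy = (width - 1) / 2, (height - 1) / 2
    # Normalize distances by the half-diagonal so weights are resolution-independent.
    half_diag = math.hypot(cx, cy) or 1.0
    for i, value in enumerate(pixels):
        x, y = i % width, i // width
        dist = math.hypot(x - cx, y - cy) / half_diag
        w = 1.0 - 0.9 * dist  # corners keep 10% weight; coefficient is arbitrary
        total += value * w
        weight_sum += w
    return total / weight_sum
```

With this weighting, a highlight in the corner (like the SAMSUNG logo) would raise the frame average far less than the same highlight in the center.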

@Doofussy2

I have the same problem with hdr-compute-peak. I get blown out brightness in just about all of my HDR movies. Most notably in Interstellar. The scenes in the ice caps and snowy plains, you can see it, but at 2:34:10, the whole picture is blown out. Below are screen caps with it enabled and disabled.

mpv-shot0002

mpv-shot0001

So now I keep it disabled. I'm attaching the log with hdr-compute-peak enabled.

Portable mpv log.txt

@haasn
Member

haasn commented Jul 5, 2018

@Doofussy2 Might be a problem with the translation of compute shaders to D3D11. Can you test with OpenGL or Vulkan?

@sat4all1

sat4all1 commented Jul 5, 2018

I have the same problem with D3D11, while OpenGL and Vulkan work fine.

@Doofussy2

I can test later, but the screenshots are from using nvdec-copy.

@sat4all1

sat4all1 commented Jul 5, 2018

I was talking about gpu-api=d3d11; hwdec is irrelevant for me.

@Doofussy2

Yeah, I believe that's what haasn was talking about; I just wanted to make sure that nvdec didn't affect it. But I don't think it's part of it. Either way, I will test this as soon as I'm able.

@Doofussy2

I can confirm that using opengl or vulkan, the picture is correct.

@haasn
Member

haasn commented Jul 6, 2018

cc @rossy

@laichiaheng
Author

laichiaheng commented Jul 6, 2018

It also happens to this scene with Vulkan and OpenGL without the logo.
It's from "LG_2_DEMO_4K_L_H_05_Rays of Light"
https://0x0.st/sfz7.mkv

@rossy
Member

rossy commented Jul 7, 2018

@Doofussy2 @sat4all1 This seems like it could be a SPIRV-Cross regression. Can you see if it works in this test build (from #5804)? That has an older build of SPIRV-Cross, which fixes the issue for me. I'll try to find which change actually broke this.

@Doofussy2

That works just fine, @rossy

Portable mpv log 2.txt

@Doofussy2

This might require a new issue report, but as it's so closely related I thought I'd mention it. I'm comparing HDR playback between mpv and my MiBox3 (which signals the display to use the HDR metadata). To keep things as consistent as possible, I tested using the same HDMI input (and therefore the same display settings) and the same media. There is a noticeable difference in luminance: with the MiBox, the picture is much more vivid and bright.

So I experimented with vf=format=sig-peak. That makes a notable improvement, but also causes problems. First, any value higher than 0.0 seems to have the same effect; I went as low as 0.0001 and as high as 0.9, and the difference was imperceptible to me. Second, even at 0.0001, the same scene I posted from Interstellar is blown out. The picture is improved in most other scenes (I tried several movies), with both vulkan and opengl, and with hdr-compute-peak off or on; the result is the same.

snapshot_156

@Doofussy2

Doofussy2 commented Jul 7, 2018

To further illustrate what I'm referring to, here are some screenshots. The bottom one uses sig-peak=0.0001 and the top one uses just hdr-compute-peak. The bottom picture is closer to the playback on my MiBox.

mpv-shot0002

mpv-shot0001

@haasn
Member

haasn commented Jul 7, 2018

@Doofussy2 It sounds like you want --tone-mapping=clip. Try that. Or perhaps --tone-mapping=mobius if you don't like the clipped highlights?

@Doofussy2

Thanks @haasn , but sadly neither of those make a difference.

@Doofussy2

Doofussy2 commented Jul 7, 2018

This is what I tried

gpu-api=opengl
hwdec=nvdec-copy

vf=format=sig-peak=0.1
hdr-compute-peak
tone-mapping=mobius

Once you put in the sig-peak, nothing else seems to have an effect.

@Doofussy2

Oh I think you meant use it instead of sig-peak! Ha! Sorry, still learning (and enjoying the experience). Yes, using

hdr-compute-peak
tone-mapping=mobius

does brighten the image.

Thank you @haasn

@haasn
Member

haasn commented Jul 8, 2018

FWIW, all that vf line does is completely disable tone mapping (and peak detection).

@Doofussy2

Doofussy2 commented Jul 8, 2018

Yeah, I kinda found that out (the hard way). Thanks again @haasn

I'm still hoping someone will look into the deinterlacing issue that I posted. That's a big problem, for me.

@sat4all1

sat4all1 commented Jul 9, 2018

@rossy
The test build works OK for me too.

thanks

@rossy
Member

rossy commented Jul 9, 2018

@Doofussy2 @sat4all1 Well, the SPIRV-Cross issue has been fixed, so hdr-compute-peak should work in future mpv builds. Thanks for testing.

Note: That issue was a bug specific to Windows and D3D11, so not the same as @laichiaheng's original problem.

@sat4all1

sat4all1 commented Jul 9, 2018

thank you!

@Doofussy2

This is resolved in the latest build.

@Doofussy2

@lachs0r this problem is present in 0.29, but fixed in @shinchiro's builds

@mia-0
Member

mia-0 commented Jul 31, 2018

@Doofussy2: Try the latest build.

@Doofussy2

Yup, that works, @lachs0r, well done! Thank you!

@haasn
Member

haasn commented Dec 23, 2018

Tentatively marking this as "not a bug" since the original issue (the unnatural brightness change in that scene) is specific to the clip (and clips like it), and we have no good way of working around it. (Any such way would probably be more accurately termed a feature request)

@haasn haasn closed this as completed Dec 23, 2018
@CounterPillow
Contributor

> In summary, I don't think this is going to be a problem for typical media; but if you have suggestions of how to improve the algorithm, I'd be all ears. Maybe we should prioritize regions of the screen based on their proximity to the center?

If there's an average involved somewhere, could a different method (e.g. median) be used? Or basically some other way to statistically filter out really far outliers.

Or maybe a separate buffer (could be lower res) which averages brightness over time, and then above a certain threshold is used as a mask for which samples to ignore in the peak computation, that way a bright spot that moves around isn't going to "heat up" the buffer, but a spot that stays exactly the same will.
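That masking idea could be sketched like this (hypothetical Python over per-block statistics; the gain and threshold constants are assumptions, and this is not anything mpv currently does):

```python
# Keep a low-resolution "heat" buffer that accumulates per-block brightness
# over time; blocks that stay hot across many frames (e.g. a static logo) get
# masked out of the peak computation, while transient highlights still count.

HEAT_GAIN = 0.1       # how quickly a block heats up (assumed)
MASK_THRESHOLD = 0.5  # heat level above which a block is ignored (assumed)

def masked_peak(frame_blocks, heat):
    """frame_blocks/heat: parallel lists of per-block brightness and heat."""
    peak = 0.0
    for i, b in enumerate(frame_blocks):
        if heat[i] <= MASK_THRESHOLD:   # block not (yet) considered static
            peak = max(peak, b)
        # Exponential moving average of this block's brightness over time.
        heat[i] += HEAT_GAIN * (b - heat[i])
    return peak
```

A spot that moves around spreads its heat over many blocks and never crosses the threshold, while a spot that stays put eventually masks itself out.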

@haasn
Member

haasn commented Dec 24, 2018

> If there's an average involved somewhere, could a different method (e.g. median) be used? Or basically some other way to statistically filter out really far outliers.

Maybe, but a median seems rather difficult to implement (efficiently).

> Or maybe a separate buffer (could be lower res) which averages brightness over time, and then above a certain threshold is used as a mask for which samples to ignore in the peak computation, that way a bright spot that moves around isn't going to "heat up" the buffer, but a spot that stays exactly the same will.

Interesting idea. Actually, perhaps an approach like this could replace the current storage buffer-based averaging altogether. Basically, we would be downsampling to a small storage image (say one sample per workgroup), using additive blending. Main question would be how to update the running average, but maybe instead of performing a true running average over the past N frames we could instead just decay the previous buffer contents by either a constant value or a given coefficient? Might be faster even in the non-local case too.

I'll have a play with it after the holidays.
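The decay-based update described above amounts to an exponential moving average; a minimal sketch (illustrative only, with an assumed coefficient):

```python
# Instead of a true running average over the past N frames, scale the previous
# buffer contents by a constant coefficient each frame and blend in the new
# frame's contribution, so old frames fade out smoothly.

DECAY = 0.9  # assumed coefficient; higher = slower adaptation

def update_running_peak(prev, frame_peak, decay=DECAY):
    """Decay the previous estimate toward the current frame's peak."""
    return decay * prev + (1.0 - decay) * frame_peak
```

This needs only one state value per sample, which is why it could plausibly be cheaper than maintaining a full N-frame history in a storage buffer.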
