aliasing problem with HDR and certain settings #4631

Closed
aufkrawall opened this Issue Jul 19, 2017 · 26 comments

@aufkrawall

aufkrawall commented Jul 19, 2017

mpv version and platform

0.26.0 Windows and Linux

Reproduction steps

Set up these settings:
sigmoid-upscaling=yes
cscale=lanczos
dscale=lanczos

Expected behavior

Downscaled HDR video shouldn't look aliased in motion.

Actual behavior

With some structures, e.g. fine leaves, some very distinct aliasing becomes apparent in motion.

This is an interaction between HDR video (not sure whether it's Rec. 2020-related), sigmoid-upscaling, and certain chroma and image scalers used at the same time.
The problem does not occur with sigmoid-upscaling=no, or when using bicubic for dscale.

Log file

http://sprunge.us/BWOH

Sample files

https://mega.nz/#!5vQRhT5Y!uRnFr5f0JtGc9KPeVqqMf5DCPkKtfn9v31FIBMZwrnQ

@haasn

Member

haasn commented Jul 19, 2017

Can you try adding --profile=opengl-hq?

@haasn

Member

haasn commented Jul 19, 2017

Also, your log seems to identify that as BT.709 for some reason. Sounds like a bug, but I don't get it when using swdec, so I assume your hwdec might be corrupting the metadata?

@aufkrawall

aufkrawall commented Jul 19, 2017

It doesn't occur with --profile=opengl-hq because that profile uses mitchell for dscale, which, like bicubic, doesn't provoke the behavior.
It's not HW-decoding-related; I just need HW decoding to play the video fluidly. Since there are tons of dropped frames with software decoding, the issue is a bit harder to spot in motion, but it's definitely there.
In the wood scene it can likely also be seen in screenshots.

@haasn

Member

haasn commented Jul 19, 2017

It doesn't occur with --profile=opengl-hq because it uses mitchell for dscale and like bicubic,

That's just the default; you can override it.

@aufkrawall

aufkrawall commented Jul 19, 2017

Yes, the issue also occurs with a config which consists of just
profile=opengl-hq
dscale=lanczos

@haasn

Member

haasn commented Jul 23, 2017

Can you take a screenshot of the issue?

@aufkrawall

aufkrawall commented Jul 23, 2017

sigmoid on (note the aliasing corruption among the leaves in the background):
https://abload.de/img/sigmoidon0lsq7.png

sigmoid off (looks normal):
https://abload.de/img/sigmoidoffsgsyn.png

It's really obtrusive in motion, and using gamma light instead of sigmoid comes with nasty drawbacks for other algorithms like smooth motion.

@haasn

Member

haasn commented Jul 23, 2017

Ah, yes. I see it.

Hmm, we could probably avoid it by doing tone mapping before downscaling when using linear light downscaling.

@haasn

Member

haasn commented Jul 23, 2017

(Or maybe you could just use mitchell downscaling like opengl-hq recommends)

@aufkrawall

aufkrawall commented Jul 23, 2017

Mitchell is way too soft to my eyes. :(
I don't really like any of the scalers chosen for the hq profile; my actual config looks like this:
hwdec=no
opengl-backend=x11egl
sigmoid-upscaling
correct-downscaling
cscale=ewa_lanczossharp
cscale-antiring=1
dscale=spline64
dscale-antiring=0.2
scale=ewa_lanczossharp
scale-antiring=1
video-sync=display-resample
interpolation
tscale=oversample
deband
x11-bypass-compositor=never
ytdl-format="((bestvideo[vcodec^=vp9]/bestvideo)+(bestaudio[acodec=opus]/bestaudio[acodec=vorbis]/bestaudio[acodec=aac]/bestaudio))/best"

@haasn

Member

haasn commented Jul 23, 2017

Random thought but does this help?

diff --git a/video/out/opengl/video_shaders.c b/video/out/opengl/video_shaders.c
index e83973b4b8..07b3e2bcf6 100644
--- a/video/out/opengl/video_shaders.c
+++ b/video/out/opengl/video_shaders.c
@@ -528,7 +528,7 @@ static void pass_tone_map(struct gl_shader_cache *sc, float ref_peak,
     GLSLF("// HDR tone mapping\n");
 
     // To prevent discoloration, we tone map on the luminance only
-    GLSL(float luma = dot(src_luma, color.rgb);)
+    GLSL(float luma = max(0, dot(src_luma, color.rgb));)
     GLSL(float luma_orig = luma;)
 
     // Desaturate the color using a coefficient dependent on the brightness
@aufkrawall

aufkrawall commented Jul 23, 2017

Sorry, I'm quite a noob with this stuff.
Would it be possible to elaborate on the required steps a bit?

@haasn

Member

haasn commented Jul 23, 2017

  1. figure out how to compile mpv
  2. apply the patch (e.g. patch -p1 and then paste the patch)
  3. compile mpv
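
For reference, the three steps above look roughly like this on Linux (a sketch, not exact instructions — it assumes git, python, and mpv's build dependencies are already installed, and that the diff from the earlier comment was saved as fix.patch; mpv used the waf build system at the time):

```shell
# Step 1/3: get the sources and build tooling.
git clone https://github.com/mpv-player/mpv.git
cd mpv
# Step 2: apply the patch against the source tree.
patch -p1 < fix.patch
# Step 3: compile; the binary ends up in ./build/mpv.
./bootstrap.py          # fetches waf
./waf configure
./waf build
```
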
@haasn

Member

haasn commented Jul 24, 2017

Actually, thinking about it, I don't think that patch will help. Negative coefficients are virtually guaranteed to be clipped by this point anyway. The real problem stems from the interaction between super-bright values and negative coefficients, I think.

Basically if you have a filter kernel that uses negative lobes, any edges between very bright and very dark regions will receive extreme negative coefficients - enough to drop it all the way down to black, which results in the black pixels. So basically it's an extreme form of ringing.

Standard anti-ringing techniques could get rid of it (scale-antiring=1 or scale-clamp=0.8 for the more extreme variant). Alternatively we could try and teach the scaler to cap the negative contributions, which would be a less extreme version of “scale-clamp”.

I might try that if it's a big problem for you.
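
The negative-lobe mechanism can be sketched with a toy 1-D model (illustrative only — the kernel phase, the 10.0 "super-bright" value, and the weight-capping variant are assumptions, not mpv's actual scaler code):

```python
import numpy as np

def lanczos3(x):
    # Three-lobe Lanczos kernel: sinc(x) * sinc(x/3) inside |x| < 3.
    x = np.asarray(x, dtype=float)
    out = np.sinc(x) * np.sinc(x / 3)
    out[np.abs(x) >= 3] = 0.0
    return out

w = lanczos3(np.arange(-2.5, 3.0))   # taps halfway between two samples
w /= w.sum()                         # normalize to unity gain

# Hard edge between a super-bright HDR highlight and black, in linear light.
signal = np.array([10.0] * 8 + [0.0] * 8)

# The negative lobes pull samples near the edge far below zero; those
# pixels get clipped to black on screen. ([3:-3] trims boundary effects.)
ringing = np.convolve(signal, w, mode="same")[3:-3]
print(ringing.min())                 # well below 0

# Capping the negative weights (the idea behind "scale-clamp") removes it.
w_pos = np.clip(w, 0, None)
w_pos /= w_pos.sum()
clamped = np.convolve(signal, w_pos, mode="same")[3:-3]
print(clamped.min())                 # never below 0
```
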

@aufkrawall

aufkrawall commented Jul 24, 2017

dscale-clamp=0.8 mitigates the issue, but I'd still consider the result "broken" in motion.

madVR's SDR conversion can also show a nasty sharpness for fine detail with high contrast, but even with all of the postprocessing turned off to mitigate that, it doesn't suffer that "temporal aliasing".

It doesn't look to me like dscale=bicubic was simply "hiding" the issue because it's blurrier; it looks more like the issue doesn't exist with it at all.
Isn't it possible that there is a bug with certain scalers, HDR, and sigmoid/LL?

@haasn

Member

haasn commented Jul 24, 2017

Well, this particular issue seems like an extreme case of ringing, which is why none of the non-ringing scalers seem to trigger it.

Anyway, we could still try and limit the amount of ringing being done with HDR sources.

@aufkrawall

aufkrawall commented Jul 24, 2017

Beware, clumsy layman theory: or is it just not "correct" to do LL/sigmoid scaling with HDR? Wouldn't it be a simple workaround to add an option to do all scaling in gamma light with HDR content, except for smooth motion interpolation?

@haasn

Member

haasn commented Jul 24, 2017

sigmoid is only for upscaling, not downscaling. But yes, you could easily try doing gamma light downscaling for HDR content. But you said you run into other issues with that?

It also still has issues like not preserving brightness correctly etc.

@aufkrawall

aufkrawall commented Jul 24, 2017

According to the manpage, sigmoid implies linear light, and that also applies to downscaling if I got it right.
Thus the HDR aliasing issue also occurs when I change the config from sigmoid to plain linear light.

Maybe I should be more clear: when I don't use linear light downscaling, that weird aliasing issue with HDR is completely gone. It looks totally fine then.

But I got another issue with not using linear light, independent of HDR video:
When I don't use linear light scaling, I can observe more distinct flicker with tscale=oversample + interpolation.
I guess that is because the interpolated data is then in gamma light and thus darker than the original data?

I think a minor brightness degradation in a still image when scaling in gamma light for HDR is still several times better than the aliasing issue with linear light.

Hence my suggestion for an option that makes cscale, scale and dscale always happen in gamma light for HDR only, while tscale would still use linear/sigmoid light (if that's possible).

@haasn

Member

haasn commented Jul 24, 2017

I guess that is because the interpolated data is then in gamma light and thus darker than the original data?

Ah, yes, that is 100% the case. So maybe we should always do interpolation in linear light.
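
A toy blend shows the effect (assuming a pure power-2.2 curve for simplicity; real transfer functions like BT.1886 or PQ differ in detail):

```python
# Blend 50% between a black and a white pixel, in linear vs. gamma light.
gamma = 2.2
black, white = 0.0, 1.0              # linear-light values

# Blend in linear light, then encode for display: 0.5 ** (1/2.2), ~0.73.
linear_blend = ((black + white) / 2) ** (1 / gamma)

# Blend the already-encoded (gamma-light) values instead: exactly 0.5.
gamma_blend = (black ** (1 / gamma) + white ** (1 / gamma)) / 2

print(linear_blend, gamma_blend)     # the gamma-light blend is darker
```
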

@haasn

Member

haasn commented Jul 24, 2017

FWIW I sort of have a refactor planned in the back of my head that would allow us to keep track of the “current” image parameters a bit better. That refactor would definitely help here, since it would allow us to very easily insert the necessary linearization commands. But for now, I could do this as a sort of workaround:

@haasn

Member

haasn commented Jul 24, 2017

Actually what we could also do is enable sigmoidization for HDR downscaling as well, which also lessens the effect of ringing (for the same reason as it does for upscaling).
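
How sigmoidization tames the ringing can be sketched with the same toy 1-D model as before (the logistic mirrors the default sigmoid-center=0.75 / sigmoid-slope=6.5; the kernel phase and edge values are illustrative assumptions):

```python
import numpy as np

center, slope = 0.75, 6.5            # mpv's default sigmoid-center/-slope
offset = 1 / (1 + np.exp(slope * center))
scale = 1 / (1 + np.exp(slope * (center - 1))) - offset

def sigmoidize(x):
    # Into the (inverse-logistic) space the scaler runs in.
    return center - np.log(1 / (x * scale + offset) - 1) / slope

def unsigmoidize(y):
    # Back to linear light; the logistic's flat tails squash ring excursions.
    return (1 / (1 + np.exp(slope * (center - y))) - offset) / scale

def lanczos3(x):
    # Three-lobe Lanczos kernel: sinc(x) * sinc(x/3) inside |x| < 3.
    x = np.asarray(x, dtype=float)
    out = np.sinc(x) * np.sinc(x / 3)
    out[np.abs(x) >= 3] = 0.0
    return out

w = lanczos3(np.arange(-2.5, 3.0))   # taps halfway between two samples
w /= w.sum()

edge = np.array([0.9] * 8 + [0.1] * 8)   # bright-to-dark edge, linear light

# Filter the edge directly vs. through the sigmoid round-trip;
# [3:-3] discards positions affected by the array boundaries.
lin = np.convolve(edge, w, mode="same")[3:-3]
sig = unsigmoidize(np.convolve(sigmoidize(edge), w, mode="same"))[3:-3]

# Undershoot below the 0.1 level is much smaller on the sigmoid path.
print(0.1 - lin.min(), 0.1 - sig.min())
```
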

@haasn

Member

haasn commented Jul 24, 2017

This is what sigmoid looks like compared to gamma. It still rings a lot with lanczos, though:

[image: comparison]

@haasn

Member

haasn commented Jul 24, 2017

Unscaled:

[image: sigmoid vs gamma]

@haasn haasn closed this in 241d5eb Jul 24, 2017

@haasn

Member

haasn commented Jul 24, 2017

Seems like it makes a noticeable difference even for e.g. mitchell, it's just less pronounced than for lanczos.

I've merged the fix.

@aufkrawall

aufkrawall commented Jul 24, 2017

Thanks a lot, works as expected.
Btw, it's very nice how light on resources the SDR conversion is.

If only there were now a way to play HDR video fluidly on Linux with Nvidia without that broken cuda crap (or using the slow Intel IGP). :(
