How to play HDR videos on an HDR screen? #5521

Closed
bodayw opened this Issue Feb 13, 2018 · 50 comments

@bodayw

bodayw commented Feb 13, 2018

This might be a silly question, but the answer is not obvious to me.

So currently mpv handles HDR videos quite well: it does HDR tone mapping by default, assuming one is using an SDR screen. But I am wondering what the "correct" way is to play HDR videos on an HDR screen with mpv.

Below is what I have tried so far.

The HDR screen I have at hand is a Sony X900E. Given that the tone response curve in HDR mode is completely different from the SDR ones (which are somewhat close to power functions), I think it needs a different ICC profile, measured while the screen is in HDR mode. I used an i1 Display Pro with DisplayCal, with white level drift compensation enabled (not sure it's actually necessary here, though), switched the target "tone curve" under the "calibration" tab to "as measured", and proceeded with "profile only".

Then in mpv, I again use --icc-profile-auto to let mpv load the new profile, and since I already have an ICC profile, I don't need to specify --target-trc=pq. Then, if I understand correctly, I should use --tone-mapping=linear and leave --tone-mapping-param at its default of 1.0, to "disable" tone mapping, or rather, to let mpv actually map the original luma of the video to the display using the measured tone response curve in the ICC profile?
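
In other words, something like this (hypothetical file name):

    mpv --icc-profile-auto --tone-mapping=linear --tone-mapping-param=1.0 video.mkv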

I am not sure I'm doing it right, but with the above settings I could see a somewhat similar result compared with mpv playing the same file in SDR mode with default tone mapping (--tone-mapping=mobius in v0.28), and also similar to madVR passing the HDR metadata through to the screen.

I also have two other questions which are kind of related:

  1. AFAIK mpv still doesn't support sending HDR metadata to the screen, and if it does, what would be the difference compared with configuring it to output normal range but "upconverted" to match the HDR tone response curve, (hopefully) like the way I did above?

  2. In the mpv manual, it explains --tone-mapping as:

This is relevant for both HDR->SDR conversion as well as gamut reduction (e.g. playing back BT.2020 content on a standard gamut display).

which confused me because I thought gamut reduction was already handled by ICC color management, which, with the default --icc-intent of relative colorimetric, will just clip the out-of-range colors.

Also, is it correct that --tone-mapping won't do anything for a non-HDR video?

Thanks a lot!

@haasn

Member

haasn commented Feb 13, 2018

There are a couple of approaches here, each with their own limitations.

AFAIK mpv still doesn't support sending HDR metadata to the screen

Correct. The only way to get HDR passthrough currently is via third party programs.

and if it does, what would be the difference compared with configuring it to output normal range but "upconverted" to match the HDR tone response curve, (hopefully) like the way I did above?

Depends on how much magic your screen is doing. Also, in HDR mode, the screen might have a very unstable and difficult-to-measure response curve. For example, displays based on local dimming or other such dynamic contrast techniques will be almost impossible to accurately profile. Additionally, displaying large patches of very bright content might trigger built-in peak brightness limits, resulting in a very different curve than when looking at only a handful of pixels.

If I were to buy an HDR display, I would make sure it has a linear response and a high static contrast ratio (no dynamic contrast or local dimming techniques), and I would adjust the hardware brightness controls to max out at some fixed peak brightness, e.g. 400 cd/m² (a typical OLED peak), with a 10-bit or higher display connection. Then, I would use my GPU's CLUTs to switch between an “SDR mode” (where 1.0 maps to 100 cd/m²) and an “HDR mode” (where 1.0 maps to 400 cd/m²), so I can switch between the modes in software (e.g. when mpv is fullscreen). The use of a 10-bit display connection instead of an 8-bit one is enough to overcome the loss of 2 bits due to the 4x increase in peak brightness.
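
As a rough illustration of the CLUT idea, here is a toy Python sketch (assuming, hypothetically, a display that applies a plain gamma 2.2 to its input signal and has a fixed 400 cd/m² hardware peak):

    # Per-channel CLUT ramps for the "SDR"/"HDR" mode switch described above.
    PEAK, GAMMA = 400.0, 2.2   # assumed display peak (cd/m^2) and response

    def clut_ramp(white_nits, size=256):
        # Pick the signal scale so that an input of 1.0 comes out at
        # white_nits: PEAK * (scale * 1.0) ** GAMMA == white_nits
        scale = (white_nits / PEAK) ** (1.0 / GAMMA)
        return [scale * i / (size - 1) for i in range(size)]

    sdr_ramp = clut_ramp(100.0)  # "SDR mode": 1.0 -> 100 cd/m^2
    hdr_ramp = clut_ramp(400.0)  # "HDR mode": identity, 1.0 -> 400 cd/m^2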

Then, if I understand correctly, I should use --tone-mapping=linear and leave --tone-mapping-param at its default of 1.0, to "disable" tone mapping

This is one way of doing it, although the problem with this approach is that it depends on the source peak. If the source peak is, for example, 1000 cd/m², then your display would need to be calibrated to 1000 cd/m² for this to produce correct results.

Another approach would be to use --tone-mapping=clip and configure the tone mapping param based on your display's peak white point. This is essentially the same thing as --tone-mapping=linear except that the “value scale” is based on your display, rather than the source content.

Ideally, the best way to do it would be to use an actual tone mapping function like hable to tone map from e.g. 2000 cd/m² to your calibrated 400 cd/m² peak (or whatever), while allowing SDR content to pass through untouched. The reason this doesn't work in mpv currently is that mpv has no user-facing option to set the target-peak (i.e. the measured peak of the target display).

There are several ways we could go about improving this situation:

  1. Add --target-peak which lets the user configure their display's peak brightness explicitly. For SDR curves, this gives the measured white level. (For --target-trc=pq, this soft-caps the output brightness. For --target-trc=hlg, this could be used to influence the inverse OOTF calculation)

  2. Measure the white point level from the ICC profile's metadata somehow? At the very least, argyllCMS-generated ICC profiles don't seem to contain information about the absolute white point. Failing that, we'd need to add an option to set it explicitly (--icc-profile-white-level, or maybe re-use --target-peak for that?)

  3. I'm not sure if this is technically feasible, but you could also try coaxing/extending argyllCMS to generate floating point ICC profiles that contain out-of-range values (i.e., it would map white to Y=1, and super-whites to Y>1, e.g. Y=4). If this is done, we'd need to alter our 3DLUT generation code slightly in order to compensate. (We'd need to switch to float format, which is pretty straightforward) Then you'd be able to just define your reference white level during profile generation, rather than using “as-measured” normalization.

  4. Finally, there's also the option of introducing the concept of a --target-avg-brightness, which could serve a similar role to --target-peak in that you'd be able to use it to uniformly “make everything darker” in order to compensate for the fact that your reference white as calibrated in HDR mode is not at 1.0 (but at some lower value like 0.25 in my example).

I'm not sure which of these approaches is the best in the long run. There's also the issue that introducing e.g. --target-peak or ICC profile peak measurement brings with it the question of how we interpret that information.

which confused me because I thought gamut reduction was already handled by ICC color management,

To clarify: When using an ICC profile, the “target-gamut” and “target-trc” are ignored and locked to the input space of the 3DLUT. Additionally, the 3DLUT is always generated against the input file's color space, so as a result, the gamut mapping step is always a no-op. However, when not using an ICC profile, the target-prim may differ from the file, and hence gamut reduction may occur - which will also be tone-mapped by the tone mapping algorithm.

Also, is it correct that --tone-mapping won't do anything for a non-HDR video?

False, for the reason I just explained. Watching an SDR BT.2020 video on an SDR BT.709 monitor will also involve tone mapping when not using an ICC profile.

@kkkrackpot

Contributor

kkkrackpot commented Feb 13, 2018

The only way to get HDR passthrough currently is via third party programs.

Does HDR passthrough really improve anything when HDR content is sent to an HDR monitor (a modern OLED TV, for example)?
And what are those "third party programs"?
And is there any chance for mpv to get HDR passthrough working?

@haasn

Member

haasn commented Feb 13, 2018

Does HDR passthrough really improve anything when HDR content is sent to an HDR monitor (a modern OLED TV, for example)?

I think the general consensus for mpv is that our software implementations should be as good if not better than whatever crap device manufacturers etc. put into their hardware. If this is not the case, rather than putting a high priority on HDR passthrough, I would instead try and improve our algorithms.

That said, there is a case to be made for HDR passthrough because e.g. PQ is more suited for the human visual system at very high contrast ratios.

And is there any chance for mpv to get HDR passthrough working?

We're limited by platform availability. The only platform that supports native HDR passthrough right now is Windows, using DXGI swapchains in exclusive fullscreen mode. Vulkan supports it on some platforms, but that currently basically means “only Android”.

@rossy and @jeeb were both thinking about working on support for HDR output on Windows. On Linux I don't think it's going to be supported any time soon; and if it is, it will only be in fullscreen mode as well.

Native HDR compositing seems far on the horizon, and I honestly doubt whether it would be needed. Although, much like color management, we could shift the responsibility for this from the application to the compositor in general.

@kkkrackpot

Contributor

kkkrackpot commented Feb 13, 2018

@haasn Sorry for the stupid question, but do you mean that (in simplified theory, at least) mpv does the same thing with HDR in software as an HDR TV does in hardware? If so -- is there any sense at all in getting an HDR TV for HDR content?
BTW, do you plan to bring --target-peak back?

@bodayw

bodayw commented Feb 14, 2018

Thanks for the prompt and comprehensive answer @haasn !

Also, in HDR mode, the screen might have a very unstable and difficult-to-measure response curve.

That is very true.

For calibration in SDR mode, I have already noticed that my calibration always fails if I leave local dimming on during measurement. I had to turn it off during measurement and turn it back on afterwards (which of course makes the colors a bit off again, but I guess that's the trade-off that has to be made between color accuracy and a decent black level; even putting local dimming on "low" already makes things look so much better when viewing in a dark room, while max deltaE stays within 3).

And in HDR mode, the backlight of the screen is going to be variable anyway, making calibration a whole lot harder. There is also the problem that in DisplayCal's workflow it's hard to get the screen's peak brightness, because most current HDR TVs only achieve it in a small patch of the screen at a time. For example, I only got around 500 cd/m2 as the white level in my calibration, but according to the review on rtings.com, this model can go as high as 883 cd/m2 in a 2% window of the screen.

Nevertheless, I guess it should still be better than nothing, especially with some trial and error to compensate, plus the fact that most HDR TVs right now don't come close to 100% DCI-P3 gamut coverage, and I don't trust the TV to handle gamut mapping well on its own.

Another approach would be to use --tone-mapping=clip and configure the tone mapping param based on your display's peak white point. This is essentially the same thing as --tone-mapping=linear except that the “value scale” is based on your display, rather than the source content.

Sorry but I'm still confused here.

First, please excuse me for not being familiar with the technical standards, but what is the peak brightness value of the source that I should take for reference? Say the source is mastered in HDR10 at 1000 cd/m2: does that mean 1000 cd/m2 is encoded as 1.0 in the source, or is it variable, e.g. the actual peak brightness it happened to have after mastering, like a "random" number of 836 cd/m2, is encoded as 1.0?

My understanding is that the problem with --tone-mapping=linear and --tone-mapping-param=1 is that I will need to make sure the max brightness level of my screen matches the one in the source. Otherwise I will need to adjust --tone-mapping-param to "overshoot" (in case the peak brightness of my screen is lower) and meet the peak brightness level of the source. Or I can use --tone-mapping=clip, in which case --tone-mapping-param would be the multiplicative inverse of the value I would use for --tone-mapping=linear. Is that correct?

For example, my ICC profile indicates the white point is 500 cd/m2; assuming 1000 cd/m2 is encoded as 1.0 in the source, I should then either use --tone-mapping=linear and --tone-mapping-param=2, or --tone-mapping=clip and --tone-mapping-param=0.5.

But it seems to me that either way I might have to adjust the param often, because it always depends on the source peak brightness level (HDR10 content may be mastered to 1000-4000 cd/m2), or, even worse, I would have to adjust it for every single file if the encoded peak brightness varies from file to file (rather than always being 1000 cd/m2, 4000 cd/m2, etc.).

The reason this doesn't work in mpv currently is that mpv has no user-facing option to set the target-peak (i.e. the measured peak of the target display).

argyllCMS-generated ICC profiles don't seem to contain information about the absolute white point.

I remember there was an option like --target-peak when HDR support was initially implemented, but it got removed afterwards. I thought the reason was that mpv is able to detect the peak brightness of the display by looking into the ICC profile. There is a field in the ICC profile called luminance which gives the measured white level. What is the difference between this and the peak brightness level, or "absolute white point"?

Although I'm aware that the measured white level in my calibration is far from accurate (~500 cd/m2 vs. 880+ cd/m2 as I said above).

@bodayw

bodayw commented Feb 14, 2018

@kkkrackpot

is there any sense at all in getting an HDR TV for HDR content?

I'm not an expert so please correct me if I'm wrong, but I think the answer here is still yes.

HDR screens are simply more capable than SDR screens, in terms of having a higher peak brightness while maintaining a good contrast ratio in general.

The problem with current LCD HDR screens is that, as @haasn said above, the built-in backlight adjustment makes them almost impossible to calibrate, and it's basically not adjustable by the end user at all. You can turn it off of course, but that means turning off the very technology that made HDR possible on an LCD screen in the first place.

As for OLED screens, they don't have the problem of "lifting" the black level while getting brighter, as they don't rely on backlights. The problem with OLED screens is just that they cannot get as bright as LCD screens. That said, picture quality is still much better on an OLED screen, whether SDR or HDR.

@jeeb

Member

jeeb commented Feb 14, 2018

First, a recommendation: ITU-R Report BT.2390 has a surprisingly good explanation of things like the OETF, EOTF and OOTF. If you end up wondering what the keywords I mumbo-jumbo about are, most likely they are mentioned in that document. It also goes through things like common misconceptions, so it's generally a useful thing to scroll through if you are interested in what HDR video is.

Say the source is mastered in HDR10 at 1000 cd/m2: does that mean 1000 cd/m2 is encoded as 1.0 in the source, or is it variable, e.g. the actual peak brightness it happened to have after mastering, like a "random" number of 836 cd/m2, is encoded as 1.0?

In short, it should be what's there after grading and what's being encoded. AKA, what is the result after artistic adjustments.

So this (HDR10, metadata saying 1000 nits maximum luminance) is a case of:

  • the transfer function being PQ (SMPTE ST.2084),
  • the colorspace being BT.2020 (NCL),
  • the static maximum luminance metadata for the whole clip being 1000 nits.

The color-related metadata (esp. the transfer function) defines the range of values and which value for luminance matches what amount of nits. So you can have a range from, say, 0.001 nits to 10k nits (for example, these are values often brought up with regards to PQ).

Then the metadata tells you that the maximum coded luminance during the whole clip was 1000 nits, which is supposed to make tone mapping simpler for screens that cannot reproduce all the way up to the theoretical maximum coded luminosity of a given standard. In other words, it tells the thing rendering the content not to bother with possible values higher than those 1000 nits, as they should not appear.

Of course, this does not help with all the various possible scenes in a clip, as most scenes will likely not use the whole range of 0.001 to 1000 nits. This is why both screens and software video renderers most likely have some sort of dynamic calculation of the maximum luminosity in a given frame or set of frames, and adapt their tone mapping to the content.

@haasn

Member

haasn commented Feb 14, 2018

If so -- is there any sense at all in getting an HDR TV for HDR content?

Higher contrast ratio, support for non-mpv HDR devices and software (e.g. games, PS4, blu-ray players), and a potentially more efficient transfer curve.

For calibration in SDR mode, I have already noticed that my calibration always fails if I leave local dimming on during measurement.

Can't you get around this by making the test area fullscreen?

First, please excuse me for not being familiar with the technical standards, but what is the peak brightness value of the source that I should take for reference? Say the source is mastered in HDR10 at 1000 cd/m2: does that mean 1000 cd/m2 is encoded as 1.0 in the source, or is it variable, e.g. the actual peak brightness it happened to have after mastering, like a "random" number of 836 cd/m2, is encoded as 1.0?

PQ is based on an absolute scale, so a value of 1.0 (i.e. an integer value of 940 for 10-bit samples) is always equal to 10,000 cd/m², regardless of the source peak. So with a tagged peak of 2000 cd/m², this essentially means that the highest value in the source is 0.2 (after linearization).

HLG, on the other hand, is a parametrized family of transfer curves with a tunable peak. So a value of 1.0 always maps to the configured peak parameter. In theory, we could consult the tagged source peak from the mastering metadata and tune the HLG OOTF based on this value. But in practice, what we do (in order to get more consistent results) is just assume HLG is always tuned for 1000 cd/m², and treat it the same as PQ in terms of how the source peak is handled. We could change this logic if there's a good reason to add the extra complexity.
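
For reference, here is a small Python sketch of the HLG pieces being discussed (constants from ITU-R BT.2100; the fixed 1000 cd/m² tuning mentioned above gives a system gamma of exactly 1.2):

    import math

    # HLG constants from ITU-R BT.2100
    a, b, c = 0.17883277, 0.28466892, 0.55991073

    def hlg_inv_oetf(signal):
        # HLG signal in [0,1] -> relative scene-linear light in [0,1]
        if signal <= 0.5:
            return signal * signal / 3.0
        return (math.exp((signal - c) / a) + b) / 12.0

    def hlg_system_gamma(peak_nits=1000.0):
        # OOTF gamma as a function of the display peak
        return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)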

So these are the values coming in. Now what mpv does is it re-scales the values to match the value range of the target display. So for example, going from PQ (10000 cd/m² range) to an SDR target curve (100 cd/m² range), all values would get multiplied by 10000/100 = 100.

So take an input value of PQ 0.2 (= 2000 cd/m²). Multiplied by 100, this results in 20, which is 20 times the target range (1.0) and therefore clearly needs to be tone-mapped.
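
To make the arithmetic concrete, here is a minimal Python sketch of the PQ curve and the rescaling step (constants straight from SMPTE ST 2084; this is illustrative, not mpv's actual code):

    # PQ (SMPTE ST 2084) constants
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_eotf(signal):
        # PQ signal in [0,1] -> absolute luminance in cd/m^2
        p = signal ** (1 / m2)
        return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

    def pq_inv_eotf(nits):
        # absolute luminance in cd/m^2 -> PQ signal in [0,1]
        y = (nits / 10000) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    peak_signal = pq_inv_eotf(2000)         # PQ code value of a 2000-nit peak
    linear = pq_eotf(peak_signal) / 10000   # 0.2 after linearization
    print(linear * 10000 / 100)             # 20.0 on the SDR scale -> tone map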

In addition, when using HDR peak detection (on by default), if the detected scene average brightness exceeds a value of 0.25 (on the SDR scale), the scene gets linearly darkened as a whole to compensate.

For example, my ICC profile indicates the white point is 500 cd/m2; assuming 1000 cd/m2 is encoded as 1.0 in the source, I should then either use --tone-mapping=linear and --tone-mapping-param=2, or --tone-mapping=clip and --tone-mapping-param=0.5.

No, --tone-mapping=clip --tone-mapping-param=1. It's probably best to look at the code here:

linear: out = in * out_peak/in_peak * param
clip: out = in * param
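
A toy numeric comparison of the two (again just a sketch, with values normalized so 1.0 is the display's peak and a source spanning 10x the display's range):

    in_peak, out_peak = 10.0, 1.0   # e.g. 1000 cd/m^2 source, 100 cd/m^2 display

    def tm_linear(x, param=1.0):
        # never clips, but darkens the whole image by out_peak/in_peak
        return x * out_peak / in_peak * param

    def tm_clip(x, param=1.0):
        # preserves in-range brightness; values above out_peak just clip
        return min(x * param, out_peak)

    for x in (0.5, 1.0, 5.0, 10.0):
        print(x, tm_linear(x), tm_clip(x))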

But it seems to me that either way I might have to adjust the param often, because it always depends on the source peak brightness level (HDR10 content may be mastered to 1000-4000 cd/m2)

If you find yourself constantly adjusting the tone-mapping-param, you're probably using the wrong curve. If you always want to preserve the input brightness (at the risk of clipping), use --tone-mapping=clip. If you never want to clip (at the risk of everything being too dark), use --tone-mapping=linear. If you never want to clip and want things to remain perceptually bright, at the risk of distorting colors, use another tone mapping curve.

I remember there was an option like --target-peak when HDR support was initially implemented, but it got removed afterwards.

It got removed because I realized that the HDR standards imply a standard reference white of 100 cd/m², so there was no more reason to include guesswork. That said, it would be useful to re-introduce it if you explicitly want to allow an “HDR-in-SDR” mode like the one you're trying to set up. And the way it would be handled would be somewhat different.

There is a field in the ICC profile called luminance which gives the measured white level. What is the difference between this and the peak brightness level, or "absolute white point"?

Oh, this is absolutely true. I must have overlooked it. Okay, there's a big reason to try and incorporate it into our algorithms. I'll start work on this later-ish.

@haasn

Member

haasn commented Feb 14, 2018

Incidentally, I think our handling of HLG is wrong. Due to the stupid way in which mpv has to keep un-normalizing and re-normalizing the value range, we assume a normalization scale of 12.0 for HLG throughout - even after the OOTF, where this is no longer true (due to our assumption of a 1000 cd/m² peak, which implies a dynamic range of 10:1).

I have to change the way normalization is done in mpv to make more sense (basically switching back to an absolute scale), which should greatly help keep down the complexity of all this stuff.

@haasn

Member

haasn commented Feb 14, 2018

So, RFC: at what point (brightness level) do we consider an ICC profile to describe an actual HDR display, rather than just an SDR display that happens to be brighter or darker than normal (probably due to environmental considerations)?

For example, I have my display calibrated to about 65 cd/m² since I'm in a dim light environment. Obviously, in a case like this, we should assume 65 cd/m² is the reference white and treat it as we would 100 cd/m² normally. And others might have their displays calibrated to 200 cd/m² and still expect it to behave like SDR displays would.

Maybe we should pick some cutoff point like 400 cd/m² and assume 400 and above is HDR (100 cd/m² input maps to 100 cd/m² on the display, higher values are used for super-highlights), and everything below is SDR (100 cd/m² input maps to whatever the display's luminance is)?

I'm sort of worried about the potential confusion that could arise from this. But at the same time, I don't want to require users to set every option under the sun by hand in order to get working HDR output on a calibrated HDR monitor.

Maybe we should do a dual check where we require the peak to be at least 400 and require the measured contrast ratio to be at least 10000:1? Is that kind of value typical for a profiled HDR display?
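
In code form, the proposed dual check would amount to something like this (sketch only; the names are hypothetical):

    def profile_is_hdr(peak_nits, contrast_ratio):
        # Treat the profile as describing an HDR display only if it is both
        # bright enough and has enough (static) contrast.
        return peak_nits >= 400 and contrast_ratio >= 10000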

@kkkrackpot

Contributor

kkkrackpot commented Feb 14, 2018

@haasn Not sure if it's relevant, but AFAIK my SDR TV has ~310 cd/m². I can check the calibration report later if needed.
UPD: Checked the report; it seems to have only a 136-nit peak. No idea why I thought of 310...

haasn added a commit to haasn/mpv that referenced this issue Feb 14, 2018

vo_gpu: introduce --target-peak
This solves a number of problems simultaneously:

1. When outputting HLG, this allows tuning the OOTF based on the display
   characteristics.
2. When outputting PQ or other HDR curves, this allows soft-limiting the
   output brightness using the tone mapping algorithm.
3. When outputting SDR, this allows HDR-in-SDR style output, by
   controlling the output brightness directly.

Closes mpv-player#5521
@haasn

Member

haasn commented Feb 14, 2018

I did some preliminary work on this issue. @bodayw it should be possible to make mpv take into account your “SDR” display's true brightness and treat it as it would a native HDR display, by setting --target-peak=<nits> accordingly.

I also looked at what it would take to read this information out of the ICC profile, but it's a bit annoying, since we don't even open the ICC profile at all when reading the 3DLUT from a cache. So I'd either always need to open the profile even if we already have the 3DLUT cached, or I'd need to store the tagged white point alongside the cached 3DLUT. Not sure which is less annoying.
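
So with that branch, something like the following should work (hypothetical file name, with 500 standing in for the display's measured peak in cd/m²):

    mpv --icc-profile-auto --target-peak=500 video.mkv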

@bodayw

bodayw commented Feb 14, 2018

@jeeb

First, a recommendation.

Thanks! I'll definitely take a careful look.

@haasn

Can't you get around this by making the test area fullscreen?

Sounds like a good idea. I'll give it a try later. In theory one should always use the same settings during calibration and watching... especially since mpv now uses the contrast ratio from the ICC profile, and changing the local dimming setting would certainly change that value.

PQ is based on an absolute scale,

going from PQ (10000 cd/m² range) to an SDR target curve (100 cd/m² range), all values would get multiplied by 10000/100 = 100.

clip: out = in * param

I think it makes sense to me now. So in my case I would like to use --tone-mapping=clip and map 500 cd/m2 in the source to 1.0 on the SDR scale (which is the measured peak luminance of my display): 500 cd/m2 is 0.05 on the linearized PQ scale, times 100 gives 5, and for 5 * param = 1.0, param should be 0.2.

If you find yourself constantly adjusting the tone-mapping-param, you're probably using the wrong curve.

Right. But now that I know PQ source values are on an absolute scale and mpv always uses 100 cd/m2 as the SDR target range, I realize what I said above doesn't make any sense. Please ignore that.

RFC: at what point (brightness level) do we consider an ICC profile to describe an actual HDR display,

Having a dual check with the contrast ratio sounds much safer. Some LCD displays might be able to reach above 400 cd/m2 brightness, but without local backlight adjustment a 10000:1 CR would be far from possible.

Right now I can only speak for the one HDR TV that I have, which has a peak luminance of ~360 cd/m2 in SDR mode; with local dimming on, a 10000:1 CR is easily achievable (in either SDR or HDR mode).

@haasn

Member

haasn commented Feb 15, 2018

@bodayw Could you maybe upload your ICC profile of the HDR display, calibrated in whatever mode is possible?

@bodayw

bodayw commented Feb 15, 2018

@haasn Here are the ICC profiles I'm currently using:

https://0x0.st/sT1S.icm
https://0x0.st/sT1Q.icm

The first one was generated in HDR mode, where I left "tone curve" at "as measured", so it doesn't have calibration curves. The second one was generated in SDR mode, with D65, 100 cd/m2 and gamma 2.2 as the calibration target.

BTW, I managed to get the TV calibrated with local dimming turned on by using full screen patches. However, it always fails in HDR mode if I also turn on "X-tended contrast range". That's a proprietary algorithm developed by Sony, and I couldn't find any details on what it does, apart from "making bright regions brighter and dark regions darker".

Would you mind sharing some thoughts on display calibration? Why do you use 65 cd/m2 as the SDR white level? It seems cinemas (usually pitch black) require a white level of around 50 cd/m2 for the projector, and I too found 100 cd/m2 in a pitch black room way too bright. But now I put a light source behind the screen so that dim light diffuses across the wall (BT.2100 specifies 5 cd/m2 for that, but I don't have proper measurement tools at hand, so it's just at a level I feel OK with), and 100 cd/m2 looks comfortable enough.

Also, in the new commit you mentioned "calibrating the HDR screen in disguise at gamma 2.8". How is gamma 2.8 relevant here? As said above, I didn't set any calibration target in HDR mode, and the actual tone response curve measured looks nothing like a gamma-defined one (as you can see in the file).

@haasn

Member

haasn commented Feb 16, 2018

Why do you use 65 cd/m2 as the SDR white level?

It's the lowest my display goes without cutting into the dynamic range. (A setting of 0 on the brightness setting)

But now I put a light source behind the screen so that dim light diffuses across the wall (BT.2100 specifies 5 cd/m2 for that, but I don't have proper measurement tools at hand, so it's just at a level I feel OK with), and 100 cd/m2 looks comfortable enough.

I do have dim light diffused across the wall, completely uncalibrated. I still think 65 cd/m² is too bright, in particular when viewing anything other than dark colors. (My desktop is mostly black.)

Also, in the new commit you mentioned "calibrating the HDR screen in disguise at gamma 2.8". How is gamma 2.8 relevant here?

When not using an ICC profile, you have to calibrate to one of the actual curves defined by --target-trc. But you're right, the curve chosen doesn't really matter. I just figured that if you're going to be calibrating to a gamma curve, higher is better.

@kevmitch

Member

kevmitch commented Feb 17, 2018

https://www.youtube.com/watch?v=Ac7G7xOG2Ag

@bodayw have you had a chance to test @haasn's PR?

@bodayw

bodayw commented Feb 18, 2018

Sorry, but I haven't, since I couldn't get mpv compiled on my machine... I was following the guide with MSYS2, but it looks like some required files are still missing, and I don't have time to look into this further.

I would be happy to try when I have time again, or if someone can provide an mpv binary of that branch; otherwise I also don't mind waiting until this gets included in a future release.

@zc62

Contributor

zc62 commented Feb 19, 2018

@bodayw The guide is actually written for the future (it needs the ffmpeg 3.5 release to enter the MSYS2 repository), so it won't work if you follow it closely. You need to compile ffmpeg git master.

See #5237 (comment)

I think you can test haasn's PR with Vulkan, so don't worry about compiling with d3d11 yet, which requires a little more effort.

haasn added a commit to haasn/mpv that referenced this issue Feb 20, 2018

vo_gpu: introduce --target-peak

@jeeb jeeb closed this in 441e384 Feb 20, 2018

@Bananamax

Bananamax commented Feb 26, 2018

Greetings. So officially mpv doesn't support HDR and doesn't activate HDR on an HDR TV. Will that happen soon, or not at all? What other option is there to force HDR on for the TV and have it look like it's working? ^^

@jeeb

Member

jeeb commented Feb 26, 2018

mpv does support HDR content, but since the APIs are limited on most operating systems, we cannot support switching the GPU output mode to 10-bit PQ (which is what "HDR over HDMI/DP" currently is).

Windows is the only OS where this is possible, with some Win10+ APIs (or nvidia's proprietary crap), which I should be looking into soon (example code from MS is available here).

@haasn

Member

haasn commented Feb 26, 2018

It may be possible soon with --gpu-context=drm. I'll work on it as soon as I get some time.

@kkkrackpot

Contributor

kkkrackpot commented Feb 26, 2018

@haasn you mean Windows only?

@wm4

Contributor

wm4 commented Feb 26, 2018

DRM is a Linux thing. And yes, it needs to be implemented separately for every OS and windowing system.

@Bananamax

Bananamax commented Feb 26, 2018

It's fine on Windows; I would appreciate it a lot :)

@haasn

Member

haasn commented Feb 26, 2018

Apparently nvidia has been working on getting HDR support into X.org. For Wayland, I have no idea; I'm not knowledgeable in the area of Wayland compositors or colorspace support in general.

(Which is sort of sad. I wish I were able to offer advice during protocol design or something, so that we end up with the best possible outcome.)

@wm4

Contributor

wm4 commented Feb 26, 2018

I'd expect Wayland does or will reflect the DRM API, as seems to be the case for YCbCr Wayland surfaces. But I don't know either.

@haasn

Member

haasn commented Feb 26, 2018

I did outline my thoughts about how a color-managed compositor might work in this e-mail on the argyllcms mailing list:

I think ultimately, a compositor would be the best component in charge
of establishing a consistent output, using software. For example, in my
ideal world, the compositor might do the following:

  1. Accept a linear FP16 BT.709 buffer from client A, tagged with peak 12.0
    and gamut BT.709
  2. Accept a 10-bit PQ BT.2020 buffer from client B, tagged with peak 6.0
    and gamut DCI-P3
  3. Accept an 8-bit BT.1886 BT.709 buffer from client C (SDR)

Then it would go ahead and mix those into the same backbuffer, by
transforming them to some reference colorspace (e.g. linear fp16
BT.2020, or scRGB or whatever) while simultaneously tone-mapping to the
display peak and gamut if necessary. This can be handled by a
content-specific 3DLUT, which can be generated based on an ICC profile
or set of ICC profiles.

For example, say the actual display only handles a peak of 4.0 and
AdobeRGB gamut - then buffers A and B will need to be tone-mapped since
their tagged peak/gamut exceeds the display's capabilities.

Finally, the GPU would, as part of scanout, encode this to whatever
signal the display expects (e.g. 10-bit BT.2020 PQ) with constant HDR10
metadata and calibration curves that are generated by ArgyllCMS in order
to make the end-to-end response from the fp16 backbuffer to the light
on-screen as linear as possible. (Or alternatively: make the output of
the scRGB backbuffer closely follow the desired curve, matching the one
used by the compositor during compositing)

It's also possible that the use of an intermediate fp16 backbuffer
could be avoided as well, by directly scanning out from separate planes
and baking the overall conversion using the GPU's matrices and LUTs.
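
As a toy illustration of the mixing step described above (runnable Python; the peaks are relative to reference white, and the tone map is a stand-in clip rather than a real implementation):

    # Clients hand over linear-light buffers tagged with their peak,
    # where 1.0 = reference (SDR) white.
    DISPLAY_PEAK = 4.0   # hypothetical display: up to 4x reference white

    def to_reference(pixels, tagged_peak):
        if tagged_peak > DISPLAY_PEAK:
            # Stand-in tone mapping: clip at the display's range
            return [min(p, DISPLAY_PEAK) for p in pixels]
        return pixels

    client_a = ([0.5, 12.0], 12.0)  # linear FP16-style buffer, peak 12.0
    client_b = ([0.2, 6.0], 6.0)    # decoded from 10-bit PQ, peak 6.0
    client_c = ([0.8, 1.0], 1.0)    # plain SDR

    backbuffer = [to_reference(px, peak)
                  for px, peak in (client_a, client_b, client_c)]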

@Bananamax

Bananamax commented Feb 27, 2018

Sounds reasonable

@Bananamax

Bananamax commented Mar 3, 2018

Any wonderful progress? :)

@Bananamax

Bananamax commented Mar 13, 2018

We need the mpv goodness combined with real HDR metadata passthrough; it would mean we are in heaven!

@Bananamax

Bananamax commented Apr 11, 2018

Any progress on the HDR on an HDR TV?

@haasn

Member

haasn commented Apr 12, 2018

No, and asking over and over again isn't going to change that. The status quo remains unchanged. libdrm doesn't support HDR, neither do X.org nor Wayland. Only D3D11 does, but the D3D11 devs seem to be too busy with other things.

@ctlaltdefeat

ctlaltdefeat commented May 18, 2018

Is it possible to sum up the situation for non-experts in the domain, now that both "HDR TVs" and sources with HDR metadata are becoming ubiquitous?

In other words, for typical circumstances, what special configuration needs to be done (if any)? And roughly how does mpv's implementation work, and in which cases?

Thanks

@Hrxn

Hrxn commented May 18, 2018

The short answer: it does not work at all (yet).

@ctlaltdefeat

ctlaltdefeat commented May 18, 2018

@Hrxn I was referring to one of @haasn 's posts here where he writes "I think the general consensus for mpv is that our software implementations should be as good if not better than whatever crap device manufacturers etc. put into their hardware."

@jeeb

Member

jeeb commented May 18, 2018

mpv can handle HDR content and convert/tone-map it to target primaries and a transfer function with a max screen brightness target. The lack of actual HDR output is mostly due to:

  1. Non-Windows systems not having the interfaces to tell the GPU to set the connection to BT.2020 and PQ. For Windows I have done the initial steps in #5804, but I have lacked the time to continue working on it in addition to $dayjob. When I get the time I will try to push it forward.
  2. The renderer's tone mapping can actually produce very good results, and I wouldn't be surprised if it were better than what various manufacturers do in their screens. What @haasn did: he has a high-quality 400-nit monitor (which also has an "HDR" mode, albeit one he seems to have various issues with), and he can tone map to 400 nits and show that on screen.

@ctlaltdefeat

ctlaltdefeat commented May 18, 2018

So basically: if I have an HDR TV then the characteristics of the hardware together with mpv's algorithms mean that I would get an "HDR-like" effect, despite not having "actual HDR"?

@bodayw

bodayw commented May 18, 2018

So basically: if I have an HDR TV then the characteristics of the hardware together with mpv's algorithms mean that I would get an "HDR-like" effect, despite not having "actual HDR"?

Yes. However, getting the characteristics of the screen right could be tricky, as @haasn mentioned above and in the mailing list link he posted. But this would be a problem for HDR videos played the "normal" way as well (where you have even less control).

And yes, for now you'll have to manually switch your screen to "HDR mode" and set --target-peak in mpv.

@haasn

Member

haasn commented May 18, 2018

I mean, in theory, if you have a high contrast display with a high peak brightness, you can use whatever curve or monitor configuration you want, as long as you 1) inform mpv about this and 2) set up target-peak correctly. You could even control the GPU's VCGT curves in order to make the display approximate an SDR BT.1886 response even in HDR mode, at the cost of bit depth (so only do this on 10- or ideally 12-bit connections). Then you can switch between HDR and SDR by just loading a different ICC profile in software.
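
A hedged sketch of that VCGT trick (Python; it assumes the display is driven in PQ mode and that a plain gamma 2.4 with a 100 cd/m² white is close enough to BT.1886):

    # Build a 1D VCGT-style ramp mapping SDR signal levels to the PQ code
    # values that reproduce a gamma-2.4, 100 cd/m^2 response on a PQ display.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_inv_eotf(nits):
        # absolute luminance in cd/m^2 -> PQ signal in [0,1]
        y = (nits / 10000) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    def sdr_in_pq_ramp(size=1024):
        ramp = []
        for i in range(size):
            x = i / (size - 1)
            nits = 100.0 * x ** 2.4          # approximate BT.1886 (zero black)
            ramp.append(pq_inv_eotf(nits))   # PQ code value sent to the display
        return ramp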

I tried doing this with Dell's implementation, but unfortunately their FALD algorithm is so terrible that it completely kills the gamut, and also causes distortions in general depending on the average brightness level. Really, what we'd need for a full solution here is to take the surrounding brightness level into account when performing color adaptation, to compensate for the fact that the result will be over- or under-estimated by the dynamic backlighting in that zone. This could in theory be done using compute shaders, but it would be excruciatingly difficult to implement, and would also require several blur passes in order to simulate the overlapped nature of the backlighting regions.

Or we could just collectively agree to stop considering displays with dynamic backlighting zones as HDR. At best, they're “fake HDR” or “marketing HDR”. I think that's the best way forward. The only true HDR tech currently on the market is OLED, and it's not yet available for desktop displays...

@bodayw

bodayw commented May 18, 2018

Actually, I've already given up on watching HDR videos in HDR mode on my LED-backlit TV. It's not just all the cumbersomeness of calibration, tweaking options, switching modes, etc.; more importantly, so far I couldn't see any obvious visual improvement over SDR with mpv tone-mapping...

@ctlaltdefeat

ctlaltdefeat commented May 19, 2018

Thanks for the prompt responses guys.

@Bananamax

Bananamax commented May 19, 2018

That's exactly what I wanna do: HDR video through mpv to my OLED! And activating the HDR mode manually doesn't work with shitty Win 10. Everything is wrong if you activate it manually.

@wm4

Contributor

wm4 commented May 19, 2018

It's not just all the cumbersomeness of calibration, tweaking options, switching modes, etc.; more importantly, so far I couldn't see any obvious visual improvement over SDR with mpv tone-mapping...

Well, in most cases, the display will do exactly the same thing (tone mapping) with its DSP in native HDR mode. The only thing that could matter is that the display may not expose its full brightness/color range in sRGB mode.

If native HDR mode has an advantage, my guess is that it depends on the display hardware, and which colorspaces the display makes accessible to outside applications.

And of course the aforementioned "marketing HDR" will always suck.

@bodayw

bodayw commented May 19, 2018

The only thing that could matter is that the display may not expose its full brightness/color range in sRGB mode.
If native HDR mode has an advantage, my guess is that it depends on the display hardware, and which colorspaces the display makes accessible to outside applications.

Color space is not a problem on my "marketing HDR" TV (Sony X900E). It has a setting to switch between several different color gamuts, including BT.2020. Of course the panel doesn't come close to covering full BT.2020 (actually not even full DCI-P3), but you can just remove the limit by selecting BT.2020 and have the screen at its native gamut before you start calibrating it. This setting is separate from the HDR stuff, so you can still get the full color gamut in SDR mode.

Full brightness is obviously not available in SDR mode. But my point is that the added details in HDR videos are too obscure to be noticeable in most cases when compared with tone-mapped SDR; at least, that's my conclusion from the handful of HDR movies I've watched on my "marketing HDR" TV.

@hethhhhh

hethhhhh commented Jun 18, 2018

There are projectors that have HDR support, and it looks good with MPC; I hope we will soon be able to use mpv instead <3

@Bananamax

Bananamax commented Jul 25, 2018

Still nothing yet

@Bananamax

Bananamax commented Oct 4, 2018

Nothing

@jeeb

Member

jeeb commented Oct 4, 2018

#5804 clearly shows that it's possible and all. It even has a Windows build!

Now, I would absolutely love just hacking on my pet projects when I actually have the brain capacity to do something like that after $dayjob, but unfortunately shitty things like debugging issues with modules I do not even use and reviewing bug fixes take precedence, because I am an idiot asshole like that.

Please do not add useless comments here unless you want to see this thread locked (seemingly this one was already closed by someone). Thank you, and I hope for your co-operation. There are ways of actually contributing to people's willingness to work on things, and you have seemingly done the exact opposite.

@Bananamax

Bananamax commented Nov 7, 2018

Another month has passed

@mpv-player mpv-player locked and limited conversation to collaborators Nov 7, 2018
