
HDR video / SMPTE-ST-2084 support #2572

Closed
lvml opened this issue Dec 8, 2015 · 34 comments

Comments

@lvml

lvml commented Dec 8, 2015

I noticed that recent commits addressed various "colorspace conversion via LUTs" topics, and that made me wonder whether mpv could be enabled to allow for (automatic or semi-automatic) replay of video material using the electro-optical transfer functions defined in SMPTE-ST-2084.

SMPTE-ST-2084 is supported by HDMI 2.0a, already supported by contemporary high-end TV sets, part of the upcoming UHD-Bluray disc standard and Dolby ITU-6C recommendation, and supported by x265 --transfer smpte-st-2084 (useful when encoding material from cameras with high dynamic range sensors - such as "full frame DSLRs" which easily exceed the dynamic range representable with bt.709).

I understand that one can parametrize mpv to use ffmpeg filters to apply a LUT. I tried this once (using (maxval-minval)*pow((max(0\,(pow((val/(maxval-minval))\,.01268331351565596512)-0.8359375))/(18.8515625-18.6875*pow((val/(maxval-minval))\,.01268331351565596512)))\,6.27739454981539438107) as the LUT formula to yield the luminance), but that approach is unsatisfying: not only does it slow down playback considerably, but one will certainly want to adjust the mapping from the absolute luminance scale of SMPTE-ST-2084 (0 to 10000 cd/m^2) to the luminance range of the output device (likely much smaller).

The lower-hanging goal would be to enable mpv to

  • detect the use of SMPTE-ST-2084 by looking at the SEIs in the stream
  • replay SMPTE-ST-2084 encoded video on non-HDR screens, using some reasonable, GPU-based mapping LUT

A more difficult goal (probably depending on GPU vendors making HDMI 2.0a support available in their hardware/drivers) would be to enable mpv to replay SMPTE-ST-2084 encoded content on actually HDR capable screens.

(I do realize that this feature idea might certainly not be the most urgent one, but I think it's a good idea to have it at least on file / for discussion.)

Here are some links to related material:

@ghost

ghost commented Dec 8, 2015

Probably not very hard. Do you have a link to the specification?

@haasn
Member

haasn commented Dec 9, 2015

If ffmpeg exports this tag then it should be doable to at least handle it properly in the renderer (i.e. map it to the screen space according to whatever parameters are required).

But to do that, I need to understand how the transfer curve works. Is it simple?

@lvml
Author

lvml commented Dec 9, 2015

The official standard text seems to be for sale here; you can also find a document containing the relevant part of the specification here.

BTW: AMD announced HDR / HDMI 2.0a support for GPUs coming in 2016; that will make it much more likely that gameplay videos will be recorded using SMPTE-ST-2084.

@haasn
Member

haasn commented Dec 9, 2015

(Well, I'm not buying any standard and I can't open any docx)

@lvml
Author

lvml commented Dec 9, 2015

(Well, the free / open source LibreOffice and OpenOffice can open the .docx just fine. If you don't want to install either of those, you can use www.openoffice-online.com from your browser for free: just click "login", then the "Demo Only" button, then "Open a Document...", paste the above .docx URL into the "Filename" field of the dialog, and click "Open". This allows browsing the document online in any HTML5-capable browser.)

@ArchangeGabriel

(One might not be able to open docx by ideology, see https://www.gnu.org/philosophy/no-word-attachments.html for instance)

@lvml
Author

lvml commented Dec 10, 2015

To answer the question whether the transfer curve is simple: yes, the two functions at https://github.com/quantizationbit/CTLs/blob/master/PQ.ctl#L31 implement it (except for the clamping the standard recommends for the 10-bit code values 0 to 3 and 1020 to 1023).

Regarding .docx: I'm probably as disgusted by MS' proprietary formats as every other reasonable person, but I guess the author of that document won't provide us with a different one. You can look at a plain text version of the document in the Google cache, but that version is not that pleasant to read.
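For reference, the two PQ curves that CTL file implements can be sketched in a few lines of Python. The constants are the ones published in ST 2084; the function names here are my own:

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard
M1 = 2610 / 16384        # ~0.1593017578125
M2 = 2523 / 4096 * 128   # ~78.84375
C1 = 3424 / 4096         # ~0.8359375
C2 = 2413 / 4096 * 32    # ~18.8515625
C3 = 2392 / 4096 * 32    # ~18.6875

def pq_eotf(e):
    """Nonlinear PQ signal e in [0, 1] -> absolute luminance in cd/m^2."""
    ep = e ** (1 / M2)
    return 10000.0 * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)

def pq_inv_eotf(y):
    """Absolute luminance y in [0, 10000] cd/m^2 -> PQ signal in [0, 1]."""
    yn = (y / 10000.0) ** M1
    return ((C1 + C2 * yn) / (1 + C3 * yn)) ** M2
```

The curve is exact at the endpoints (a signal of 1.0 decodes to 10000 cd/m^2, 0.0 to black), and the two functions are numerical inverses of each other.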

haasn added a commit to haasn/mp that referenced this issue May 15, 2016
Currently, this relies on the user manually entering their display
brightness (since we have no way to detect this at runtime or from ICC
metadata). The default value of 250 was picked by looking at ~10 reviews
on tftcentral.co.uk and realizing they all come with around 250 cd/m^2
out of the box. (In addition, ITU-R Rec. BT.2022 supports this)

Since there is no metadata in FFmpeg to indicate usage of this TRC, the
only way to actually play HDR content currently is to set
``--vf=format=gamma=st2084``. (It could be guessed based on SEI, but
this is not implemented yet)

Incidentally, since SEI is ignored, it's currently assumed that all
content is scaled to 10,000 cd/m^2 (and hard-clipped where out of
range). I don't see this assumption changing much, though.

As an unfortunate consequence of the fact that we don't know the display
brightness, mixed with the fact that LittleCMS' parametric tone curves
are not flexible enough to support PQ, we have to build the 3DLUT
against gamma 2.2 if it's used. This might be a good thing, though,
considering the PQ source space is probably not fantastic for
interpolation either way.

Partially addresses mpv-player#2572.
haasn added a commit to haasn/mp that referenced this issue May 15, 2016
@haasn
Member

haasn commented May 15, 2016

The implementation I have chosen will just clip to the display's reproducible range, which has to be supplied by the user. (I don't see any way to remove this requirement.)

In practice I recommend actually overshooting your display's brightness by a bit, especially when your display is really dim like mine (≈ 60 cd/m²). The effect of overshooting is basically just linearly decreasing the brightness of the entire image. (Or you could use the “contrast” controls, which are a constant offset before application of PQ - this adjustment will be perceptually uniform thanks to PQ)

Basically, right now, the best way to play HDR content on an SDR display is to play with the contrast/brightness controls at runtime until you can see all the features you want. (We get to be our own video engineers, wheeee!)

Maybe mpv could implement some sort of “auto-leveling” mechanism (like HDR in video games) to pick a good scale factor based on the source?

@haasn
Member

haasn commented May 15, 2016

I think the way I will expose this to the user is by presenting a choice option like hdr-tonemapping=<clip|softclip|scale|whatever|magic|...> with varying implementations.

e.g. clip would provide 1:1 mapping, scale would perform a linear scale from the detected peak down to the absolute peak, etc.

We could add fancier tone mapping algorithms, including local or global ones.
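The difference between those modes can be sketched on absolute luminance values (illustrative only: the option names above are a proposal, not a final API, and `tone_map` is a hypothetical helper):

```python
def tone_map(luminance, display_peak, mode="clip"):
    """Map absolute scene luminance (cd/m^2) into the display's [0, display_peak] range."""
    if mode == "clip":
        # 1:1 mapping, hard-clipped at the display's peak
        return min(luminance, display_peak)
    if mode == "scale":
        # linear compression of the full 10000 cd/m^2 PQ range
        return luminance * (display_peak / 10000.0)
    raise ValueError("unknown mode: %s" % mode)
```

With a 250 cd/m^2 display, "clip" leaves everything below 250 cd/m^2 untouched and discards highlight detail above it, while "scale" preserves highlights but darkens the whole image by a factor of 40.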

@lvml
Author

lvml commented May 15, 2016

Regarding the display's reproducible range: having the user supply it is certainly fine for the moment, especially since with some displays, "energy saving" options influence the maximum luminance that can be displayed.

If it was possible to fetch the display name as transferred in the EDID (I don't know whether that's possible), then one could think of collecting presets for popular displays from the community, in the form of some preset-config file.

Regarding the hdr-tonemapping - the option sounds like a good idea. I'm not sure which functions the current hardware players utilize when they map SMPTE-2084 luminance values into bt.709 values, but according to experience reports both the Samsung and the Panasonic UHD BluRay player do a decent job at this - they have to do that mapping at least whenever an UHD BluRay disc is to be played via HDMI 2.0 (and not "2.0a"). From the (layman) descriptions I can read in experience reports on these players, it sounds like some form of sigmoid function is used, with a broad linear part and a small range with a less steep slope in the very dark and very bright end.

@lvml
Author

lvml commented May 15, 2016

I just compiled mpv from the source as of haasn@0fb307c - but I am somewhat confused about whether the conversion is available with --vf=format=gamma=st2084 alone, as described in haasn@2de08a7

Using --vf=format=gamma=st2084 alone doesn't seem to make a difference - does it only work with "-vo gl"?

@haasn
Member

haasn commented May 15, 2016

It only works with -vo=opengl. By default, mpv does not perform gamma mapping at all. To enable it, you have to pick some output curve to map to, e.g. -vo=opengl-hq:target-trc=gamma2.2 (or load an ICC profile for your display).

You should probably also set target-brightness to the peak brightness of your monitor. (Values higher than this will be clipped)

@lvml
Author

lvml commented May 15, 2016

Ok, then it will take at least until tomorrow before I can test the feature - the only computer I have with me right now doesn't do well with "-vo gl" (old nVidia GPU, current commercial nVidia drivers no longer supporting it, current open source drivers unstable with acceleration... just the usual nVidia trouble).

haasn added a commit to haasn/mp that referenced this issue May 16, 2016
@lvml
Author

lvml commented May 16, 2016

I now had the opportunity to test haasn@e047cc0 with some example HDR files. But so far, I've not found a set of parameters that works for more than one of the demo files - can you give an example of which file looked good for you, and with what parameters?
Maybe the mastering information contained in the SEIs is the only way to not require different parameters for every file?

@haasn
Member

haasn commented May 17, 2016

Some more notes:

  1. --vf=format=gamma=st2084 is no longer required, since autodetection is now possible.
  2. You can choose the tone mapping algorithm used by mpv with hdr-tone-mapping

The default mode of operation is to hard-clip all values above the peak brightness of your display, which creates a bloom-like effect (and unfortunately also distorts color accuracy; I want to attempt doing the tone mapping in L*a*b* or similar spaces in the future).

clipping

If you want to sacrifice in-range accuracy and contrast in return for not clipping, you could try e.g. hdr-tone-mapping=simple:

simple mapping

What you could also do is adjust your “contrast” controls (in mpv, keys 1/2 by default) to reduce the dynamic range of the video, and your “brightness” controls (keys 3/4) to fine-tune which brightness region you want to view in detail.

No adjustment: (this is on 63 cd/m² target-brightness iirc)
image
-30 contrast:
image
-30 contrast, -10 brightness:
image
-30 contrast, +17 brightness:
image

Much like HDR in photography and video games, high dynamic range here means we have details in every part of the spectrum, but on our standard range monitors that means we can only ever see a limited part of that spectrum at a time (or squish the entire thing via tone mapping).

Video games etc. have solved this problem by doing things like adjusting the tone mapping dynamically as you move from indoor to outdoor scenes, or using more sophisticated algorithms that consider local contrast in order to preserve the details in every brightness range. I will try implementing some more of these tone mapping algorithms in the future, since it's fun to play around with them, but for now this is all we've got.

@haasn
Member

haasn commented May 17, 2016

Note that the effect is even more pronounced if you use a lower monitor brightness. If I set my display to brightness 100 (350 cd/m²), the result with naive clipping is almost passable:

350 cd/m²

But if I set it to my normal setting of brightness 0 (63 cd/m²), the result with naive clipping is very bad:

63 cd/m²

In a sense, this result codifies the limitations of our existing standard range technology.

@haasn
Member

haasn commented May 17, 2016

And finally, for the sake of comparison (although I will probably move this out to a wiki article sooner or later), the other two hdr-tone-mapping settings:

gamma (default param)
gamma

linear (default param)
linear

linear (param = 8.0)
linear 8x

@lvml
Author

lvml commented May 17, 2016

@haasn: First of all, thanks a lot for your great contribution on HDR video! - I tried now with 26b6d74, and things look really, really good!

I've experimented a little sitting in front of my laptop, which has a very good IPS panel, and next to it my actually HDR capable LG OLED TV. I set my (uncalibrated) laptop display with xbacklight =100 to its brightest setting, and then tried to match the looks when playing the demo files "Exodus", "Life of Pi" and "Sony 4k HDR camp" on the laptop display as closely as possible to the display on my TV (which played the demo files from USB).

So far, the parameters I found to most closely (and remarkably closely!) resemble the look of the "original" HDR display on the LG were:
-vo opengl:target-brightness=600:target-prim=bt.709:hdr-tone-mapping=simple:tone-mapping-param=0.7

Without the target-prim=bt.709, colors from the (bt.2020 using) file looked unnaturally desaturated.

The target-brightness=600 is certainly more than the display can do - but it seems to set an "appropriate" clipping point.

hdr-tone-mapping=simple looks best to me when used with tone-mapping-param=0.7; the default value 0.5 leaves too little contrast in the most relevant parts of the picture.

hdr-tone-mapping=clip also looks good, but loses a few too many details in the highlights.

hdr-tone-mapping=linear and hdr-tone-mapping=gamma didn't look good to me, regardless of the mapping-param value - they produce "way too gray" images, and I didn't want to play around with the "contrast" setting.

BTW: If you want to experiment with other HDR tonemapping functions, please consider the CIECAM02 Lab based tonemapping as implemented e.g. in https://github.com/Beep6581/RawTherapee - that method gives me marvellous results when I need to squeeze high-dynamic-contrast still images into the bt.709 colorspace. (I'm usually using the parameters strength=0.5 gamma=1.0 edge_stopping=1.4 scale=0.3 reweighting_iterates=0 with this method.)

@haasn
Member

haasn commented May 18, 2016

Without the target-prim=bt.709, colors from the (bt.2020 using) file looked unnaturally desaturated.

Yes, the default behavior of mpv is to not adapt gamuts unless the user requests it. (Since more people seem to complain about that than understand it.) Maybe we could change this in the future for BT.2020 files? (Using the same reasoning as tone-mapping HDR by default.)

I've experimented a little sitting in front of my laptop, which has a very good IPS panel, and next to it my actually HDR capable LG OLED TV.

Oh good, so we have somebody with an actual HDR TV so we can run real-world comparisons! It would help a lot if you could somehow measure the brightness of your display and set that as the true display-brightness and then see if clipping looks identical to the HDR TV (in the non-clipped regions of the image). Because in theory, it should. But in practice, I don't trust either the laptop display or the HDR TV particularly much. :p

Note that “overshooting” the target-brightness is a “safe” operation: all it does is effectively make the image darker (linearly), which can help it match your true display brightness better. Using hdr-tone-mapping=linear has a similar effect; in your case of 600 you would set tone-mapping-param=16.6 in order to get a similar output brightness.
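The 16.6 figure falls out of the ratio between the PQ reference peak and the chosen target brightness (a hypothetical helper, just to show the arithmetic):

```python
PQ_PEAK = 10000.0  # cd/m^2, the ST 2084 reference peak

def linear_tone_mapping_param(target_brightness):
    # Overshooting target-brightness to some value b darkens the image by a
    # constant linear factor; matching that with hdr-tone-mapping=linear
    # means setting the param to PQ_PEAK / b.
    return PQ_PEAK / target_brightness
```

For target-brightness=600 this gives 10000 / 600, i.e. roughly the 16.6 mentioned above.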

BTW: If you want to experiment with other HDR tonemapping functions, please consider the CIECAM02 Lab based tonemapping as implemented e.g. in https://github.com/Beep6581/RawTherapee

I'll have a look, thanks for pointing it out.

@lvml
Author

lvml commented May 18, 2016

I was assuming that mpv does already try to automatically adapt colorspaces - mpv's predecessor MPlayer did that starting from 2009, and the "defaults" description at https://github.com/mpv-player/mpv/blame/master/DOCS/man/vf.rst#L218 led me to believe that mpv has probably continued this tradition.

The visible differences between bt.601 and bt.709 when confused are much smaller than between bt.709 and bt.2020 - so I'd say that whenever the information about which colorspace a file was encoded for is available, automatic conversion should be enabled if the display colorspace is a different one.

Regarding the measurement of my display luminance: I do not have the hardware for this kind of measurement, plus it's extremely difficult to measure "one value" for LG's OLED displays - maximum luminance depends heavily on the size of the "white" area, some magazine measured 417 cd/m2, small white areas are "brighter", bigger white areas are "less bright". I'm pretty sure LG does apply "more than just clipping" in HDR mode, if only because they use different "whites" depending on the amount of light to be emitted by the display as a whole.

Regarding "Using hdr-tone-mapping=linear has a similar effect, in your case of 600 you would be setting the tone-mapping-param=16.6 in order to get a similar output brightness" - yes, just tested that, looks as good as the above mentioned "simple" setting.

I guess things will become more complicated if "Dolby Vision" HDR material was to be displayed, as "Dolby Vision" uses different parameters "per scene", so it might not look good if such material was replayed without interpreting new SEI information whenever it appears in the data. But that's not something I'd target for now, my current quest is one for an open-source workflow for people to produce their own HDR videos, and most people will certainly be totally fine with static HDR parameters.

@lvml
Author

lvml commented May 18, 2016

(I have created issue #3157 for a feature request very similar to the SMPTE-2084 transfer function support, but for V-Log - which would support a HDR video creation workflow "from the other end", right after recording.)

@haasn
Member

haasn commented May 19, 2016

I was assuming that mpv does already try to automatically adapt colorspaces - mpv's predecessor MPlayer did that starting from 2009, and the "defaults" description at https://github.com/mpv-player/mpv/blame/master/DOCS/man/vf.rst#L218 led me to believe that mpv has probably continued this tradition.

I think you are confusing two separate things here. This is a function for auto-detecting the YUV matrix, a tradition mpv has continued. (In fact, mpv also tries to auto-detect primaries and gamma curves - two concepts MPlayer is ignorant of.)

mpv does use the auto-detected YUV matrix when converting from YUV to RGB, which is a mandatory step during playback. However, mpv does not use the auto-detected primaries or gamma curve for anything, since colorspace or gamma conversion is not strictly necessary. (Except in the case of HDR, XYZ or linear light input, in which case it will just pick some sane fallback.)

The visible differences between bt.601 and bt.709 when confused are much smaller than between bt.709 and bt.2020 - so I'd say that whenever the information about which colorspace a file was encoded for is available, automatic conversion should be enabled if the display colorspace is a different one.

I agree in principle, and it might be time to revisit this stance.

@lvml
Author

lvml commented May 19, 2016

Ah, got it.

BTW: I mentioned earlier in this issue that AMD announced HDR / HDMI 2.0a support for GPUs coming in 2016; meanwhile nVidia has also announced that their new GTX 1070 and GTX 1080 graphics cards will support HDR, and they specifically advertise that their game video capture software "Ansel" will allow recording HDR videos. Given that Youtube has also announced HDR support, I am confident that the amount of publicly available HDR videos will increase quickly, and soon.

@lvml
Author

lvml commented Jun 8, 2016

(I noticed there's now a new default tone mapping function "hable" for HDR material in mpv - looks good!)
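For the curious: "hable" refers to John Hable's filmic curve from video-game tone mapping. A rough sketch using his published Uncharted 2 constants (the peak normalization below is one plausible choice, not necessarily exactly what mpv does):

```python
def hable(x):
    # Uncharted 2 filmic tone-mapping curve (John Hable's published constants)
    A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
    return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

def hable_tone_map(signal, signal_peak):
    # Normalize so the source peak maps exactly to 1.0 on the display
    return hable(signal) / hable(signal_peak)
```

The curve has a "shoulder" that compresses highlights smoothly instead of hard-clipping them, which is why it tends to hold up better on bright HDR material than plain clipping.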

@lvml
Author

lvml commented Nov 17, 2016

Youtube now hosts HDR videos encoded using VP9, in the .webm container. mpv can already replay them, when instructed about the colorspace and EOTF:

mpv -vf format=colormatrix=bt.2020-cl:primaries=bt.2020:gamma=st2084 -vo opengl-hq --ytdl-format=337 'https://youtu.be/9WRW3f8ZGWQ'

But it would be great if mpv could pick up the information that the video is using BT.2020 and SMPTE-2084 automatically from the container - this document seems to describe where to find this information: https://www.webmproject.org/docs/container/

Colour
Preliminary implementation of the Colour element, a child of the Video element (Segment → Tracks → TrackEntry → Video → Colour). See illustration and further description below. ...

MatrixCoefficients (ID [55][B1], unsigned): The matrix coefficients of the video, used to derive luma and chroma values from red, green, and blue color primaries. The values and meanings are adopted from Table 4 of ISO/IEC 23001-8:2013/DCOR1. (0: GBR, 1: BT709, 2: Unspecified, 3: Reserved, 4: FCC, 5: BT470BG, 6: SMPTE 170M, 7: SMPTE 240M, 8: YCoCg, 9: BT2020 Non-constant Luminance, 10: BT2020 Constant Luminance)

TransferCharacteristics (ID [55][BA], unsigned): The transfer characteristics of the video. The values and meanings for 1-15 are adopted from Table 3 of ISO/IEC 23001-8:2013/DCOR1; 16-18 are proposed values. (0: Reserved, 1: ITU-R BT.709, 2: Unspecified, 3: Reserved, 4: Gamma 2.2 curve, 5: Gamma 2.8 curve, 6: SMPTE 170M, 7: SMPTE 240M, 8: Linear, 9: Log, 10: Log Sqrt, 11: IEC 61966-2-4, 12: ITU-R BT.1361 Extended Colour Gamut, 13: IEC 61966-2-1, 14: ITU-R BT.2020 10 bit, 15: ITU-R BT.2020 12 bit, 16: SMPTE ST 2084, 17: SMPTE ST 428-1, 18: ARIB STD-B67 (HLG))
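Picking those container values up could look roughly like this (an illustrative sketch using a small subset of the table above; the string identifiers loosely follow mpv-style names, and the helper itself is hypothetical, not mpv's actual code):

```python
# Small subset of the WebM/Matroska Colour element values quoted above
MATRIX_COEFFICIENTS = {1: "bt.709", 9: "bt.2020-ncl", 10: "bt.2020-cl"}
TRANSFER_CHARACTERISTICS = {8: "linear", 16: "st2084", 18: "hlg"}

def describe_colour(matrix_id, trc_id):
    """Translate container IDs into colormatrix/gamma-style names."""
    matrix = MATRIX_COEFFICIENTS.get(matrix_id, "unknown")
    trc = TRANSFER_CHARACTERISTICS.get(trc_id, "unknown")
    return matrix + "/" + trc
```

For the Youtube HDR files discussed here, MatrixCoefficients 9 with TransferCharacteristics 16 would translate to "bt.2020-ncl/st2084".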

@haasn
Member

haasn commented Nov 18, 2016

But it would be great if mpv could pick up the information that the video is using BT.2020 and SMPTE-2084 automatically from the container - this document seems to describe where to find this information: https://www.webmproject.org/docs/container/

It can. 81ceb7b c676c31

By the way, your vf line is wrong. It should be bt.2020-ncl, not bt.2020-cl. Constant luminance is basically unused, and furthermore makes very little sense with HDR.

@ghost

ghost commented Nov 18, 2016

Should already work in git master as of a few days ago.

Also I think this can be closed.

@ghost ghost closed this as completed Nov 18, 2016
@lvml
Author

lvml commented Nov 19, 2016

@haasn: Thanks for pointing this out - great to hear this has already been implemented!

However, I just compiled mpv from the git master head, and the automatic recognition does not seem to have the desired effect for Youtube HDR videos like the example I linked above.

mpv -v --ytdl-format=337 'https://youtu.be/9WRW3f8ZGWQ' gives me a very desaturated, wrong image, and the terminal output looks like this:

[mkv] |  + Track number: 1
[mkv] |  + Track type: Video
[mkv] |  + Video track
[mkv] |   + Pixel width: 3840
[mkv] |   + Pixel height: 2160
[mkv] |    + Matrix: bt.709
[mkv] |    + Primaries: bt.2020
[mkv] |    + Gamma: auto
[mkv] |    + Levels: limited
[mkv] |    + HDR peak: 1000.000000
[mkv] |  + Codec ID: V_VP9
...
[vd] Decoder format: 3840x2160 [0:1] yuv420p10 bt.709/auto/auto/limited CL=unknown
[vd] Using container aspect ratio.
[vf] Video filter chain:
[vf]   [in] 3840x2160 yuv420p10 bt.709/bt.2020/bt.1886/limited SP=1000.000000 CL=unknown
[vf]   [out] 3840x2160 yuv420p10 bt.709/bt.2020/bt.1886/limited SP=1000.000000 CL=unknown

If instead, I use mpv -v --ytdl-format=337 'https://youtu.be/9WRW3f8ZGWQ' -vf format=colormatrix=bt.2020-ncl:primaries=bt.2020:gamma=st2084, then the picture looks marvellous (and correct), the terminal output looks like this:

[mkv] |  + Track number: 1
[mkv] |  + Track type: Video
[mkv] |  + Video track
[mkv] |   + Pixel width: 3840
[mkv] |   + Pixel height: 2160
[mkv] |    + Matrix: bt.709
[mkv] |    + Primaries: bt.2020
[mkv] |    + Gamma: auto
[mkv] |    + Levels: limited
[mkv] |    + HDR peak: 1000.000000
[mkv] |  + Codec ID: V_VP9
...
[vf] Opening video filter: [format colormatrix=bt.2020-ncl primaries=bt.2020 gamma=st2084]
[format] Setting option 'colormatrix' = 'bt.2020-ncl' (flags = 0)
[format] Setting option 'primaries' = 'bt.2020' (flags = 0)
[format] Setting option 'gamma' = 'st2084' (flags = 0)
[cplayer] Starting playback...
[vo/opengl/x11] Disabling screensaver.
[vd] Decoder format: 3840x2160 [0:1] yuv420p10 bt.709/auto/auto/limited CL=unknown
[vd] Using container aspect ratio.
[vf] Video filter chain:
[vf]   [in] 3840x2160 yuv420p10 bt.709/bt.2020/bt.1886/limited SP=1000.000000 CL=unknown
[vf]   [format] "format.00" 3840x2160 yuv420p10 bt.2020-ncl/bt.2020/st2084/limited NP=10000.000000 SP=1000.000000 CL=unknown
[vf]   [out] 3840x2160 yuv420p10 bt.2020-ncl/bt.2020/st2084/limited NP=10000.000000 SP=1000.000000 CL=unknown

Any idea what went wrong, here?

(The same symptom occurs with other Youtube HDR videos, as collected in their HDR launch playlist.)

(If you'd prefer a separate issue opened for this topic, sure can do - I just didn't want to lose context.)

@haasn
Member

haasn commented Nov 19, 2016

@lvml Could it be that your ffmpeg version is too old to support ST2084?

It works on my end, but the fact that it comes up as auto on your end is suspicious and indicates that your copy of the appropriate libavutil header doesn't include the ST2084 curve. Also, why on earth is that file tagged with the “bt.709” matrix? Shouldn't bt.2020 content be using the bt.2020-ncl matrix? I blame Youtube.

@wm4 this might be a reason to implement our own conversion function after all

@lvml
Author

lvml commented Nov 19, 2016

@haasn: I was trying with ffmpeg compiled from commit 69449da436169e7facaa6d1f3bcbc41cf6ce2754, Date: Mon Sep 26 20:25:59 2016 +0200. Not the latest, but not that old. A grep in the ffmpeg sources I used finds:

./libavutil/pixdesc.c:    [AVCOL_TRC_SMPTEST2084] = "smpte2084",
./libavutil/pixfmt.h:    AVCOL_TRC_SMPTEST2084  = 16, ///< SMPTE ST 2084 for 10-, 12-, 14- and 16-bit systems
./libavutil/color_utils.c:static double avpriv_trc_smpte_st2084(double Lc)
./libavutil/color_utils.c:        case AVCOL_TRC_SMPTEST2084:
./libavutil/color_utils.c:            func = avpriv_trc_smpte_st2084;

I can try with the latest ffmpeg git head tomorrow, if that promises to help.

@ghost

ghost commented Nov 19, 2016

@wm4 this might be a reason to implement our own conversion function after all

Might be a good idea to just use the h264 values in mpv too.

@lvml
Author

lvml commented Nov 19, 2016

@haasn: Regarding the strange contradiction between the codec and the container attributes regarding colorspace: I would assume that the original encoder (in the case of the above link, some Atomos hardware field recorder storing the material in ProRes) simply did not know the correct colorspace, and that this information was only attached to the container later - possibly after the conversion to VP9 by Youtube.

@brunogm0
Contributor

brunogm0 commented Jan 8, 2017

Hi,
Reading a lot of the standards, I've found that the design rationale of Rec.2020 is to cover the Pointer gamut with real primaries (99.98% in practice). But xvYCC also covers 100% of Pointer, and is bigger than DCI-P3, of which UHD-1 requires 90% coverage. Also, HDMI 1.3 already has Deep Color at 10, 12 and 16 bit, and most plasmas from 2007 onward have drivers with 13-bit (6144 gradations) capability (10-bit CCFL/LED LCDs also support xvYCC).

So a retro-compatibility mode, where "HDR" content or native xvYCC content ("Mastered in 4K" Blu-rays, youtu.be/CkPDCbaUOBc) can be enabled, would improve things for most of us who don't have new 4K HDR panels.
PS: There is a bunch of confusion caused by renaming standard gamuts or deleting documents that mention xvYCC. For example, "Dr. Pointer's gamut" as used for HDR UHD and Dolby 2084 PQ material used to be called the "Munsell surface color" or optimal gamut.

As proof there is this:
web.archive.org/web/20090829145618/http://www.sony.net:80/SonyInfo technology/technology/theme/xvycc_01.html
https://www.researchgate.net/publication/292147952_Extended_colour_space_for_capturing_devices (table 2)
"Strange they omit xvYCC, but the DCI-P3 comparison is there." tftcentral.co.uk/articles/pointers_gamut.htm

@kkkrackpot
Contributor

kkkrackpot commented Feb 1, 2017

@haasn

(although I will probably move this out to a wiki article sooner or later)

Didn't you change your mind?
Can you please do a little wiki on handling HDR stuff in mpv?

This issue was closed.