HDR video / SMPTE-ST-2084 support #2572
Probably not very hard. Do you have a link to the specification? |
If ffmpeg exports this tag then it should be doable to at least handle it properly in the renderer (i.e. map it to the screen space according to whatever parameters are required). But to do that, I need to understand how the transfer curve works. Is it simple? |
The official standard text seems to be for sale here; you can also find a document containing the relevant part of the specification here. BTW: AMD announced HDR / HDMI 2.0a support for GPUs coming in 2016, which will make it much more likely that gameplay videos will be recorded using SMPTE-ST-2084. |
(Well, I'm not buying any standard and I can't open any docx) |
(Well, the free / open source LibreOffice and OpenOffice can open the .docx just fine. If you don't want to install either of those, you can use www.openoffice-online.com from your browser for free - just click "login", then the "Demo Only" button, then "Open a Document...", paste the above .docx URL into the "Filename" field of the dialog and click "Open". This allows for browsing the document online in any HTML5-capable browser.) |
(One might not be able to open docx by ideology, see https://www.gnu.org/philosophy/no-word-attachments.html for instance) |
To answer the question whether the transfer curve is simple: yes, the two functions at https://github.com/quantizationbit/CTLs/blob/master/PQ.ctl#L31 implement it (except for the clamping the standard recommends for the 10-bit code values 0 to 3 and 1020 to 1023). Regarding .docx: I'm probably as disgusted by MS' proprietary formats as every other reasonable person, but I guess the author of that document won't provide us with a different one. You can look at a plain-text version of the document in the Google cache, but that version is not very pleasant to read. |
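For readers who don't want to dig through the CTL file, the PQ curve is indeed simple. Below is a minimal Python sketch of the SMPTE-ST-2084 EOTF and its inverse, using the constants from the standard (the function names are mine, and the 10-bit code-value clamping mentioned above is omitted); luminance is normalized so that 1.0 corresponds to 10,000 cd/m²:

```python
# PQ (SMPTE-ST-2084) constants, written as the exact fractions from the standard
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a non-linear PQ signal in [0, 1] to linear luminance in [0, 1],
    where 1.0 corresponds to 10,000 cd/m^2."""
    p = signal ** (1.0 / M2)
    return (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def pq_inverse_eotf(lum: float) -> float:
    """Map linear luminance in [0, 1] back to a non-linear PQ signal in [0, 1]."""
    p = lum ** M1
    return ((C1 + C2 * p) / (1.0 + C3 * p)) ** M2
```

A full-scale signal of 1.0 decodes to exactly 1.0 (i.e. 10,000 cd/m²), and the two functions round-trip, which is an easy sanity check for any implementation.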
Currently, this relies on the user manually entering their display brightness (since we have no way to detect this at runtime or from ICC metadata). The default value of 250 was picked by looking at ~10 reviews on tftcentral.co.uk and realizing they all come with around 250 cd/m^2 out of the box. (In addition, ITU-R Rec. BT.2022 supports this.) Since there is no metadata in FFmpeg to indicate usage of this TRC, the only way to actually play HDR content currently is to set ``--vf=format=gamma=st2084``. (It could be guessed based on SEI, but this is not implemented yet.) Incidentally, since SEI is ignored, it's currently assumed that all content is scaled to 10,000 cd/m^2 (and hard-clipped where out of range). I don't see this assumption changing much, though. As an unfortunate consequence of the fact that we don't know the display brightness, mixed with the fact that LittleCMS' parametric tone curves are not flexible enough to support PQ, we have to build the 3DLUT against gamma 2.2 if it's used. This might be a good thing, though, considering the PQ source space is probably not fantastic for interpolation either way. Partially addresses mpv-player#2572.
The implementation I have chosen will just clip to the display's reproducible range, which has to be supplied by the user. (I don't see any way to remove this requirement.) In practice I recommend actually overshooting your display's brightness by a bit, especially when your display is really dim like mine (≈ 60 cd/m²). The effect of overshooting is basically just linearly decreasing the brightness of the entire image. (Or you could use the “contrast” controls, which apply a constant offset before application of PQ - this adjustment will be perceptually uniform thanks to PQ.) Basically, right now, the best way to play HDR content on an SDR display is to play with the contrast/brightness controls during runtime until you can see all the features you want. (We get to be our own video engineers, wheeee!) Maybe mpv could implement some sort of “auto-leveling” mechanism (like HDR in video games) to pick a good scale factor based on the source? |
I think the way I will expose this to the user is by presenting a choice of options: e.g. clip would provide a 1:1 mapping, scale would perform a linear scale from the detected peak down to the absolute peak, etc. We could add fancier tone-mapping algorithms later, including local or global ones. |
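For illustration, the two simple strategies described above (clip and linear scale) can be sketched in a few lines of Python. This is a hypothetical sketch of the idea, not mpv's actual implementation; luminances are in cd/m², and `src_peak` / `dst_peak` are assumed names for the source and display peak brightness:

```python
def tonemap_clip(lum: float, dst_peak: float) -> float:
    """1:1 mapping: luminance beyond the display's peak is hard-clipped."""
    return min(lum, dst_peak)

def tonemap_scale(lum: float, src_peak: float, dst_peak: float) -> float:
    """Linear scale: compress the entire source range into the display range."""
    return lum * (dst_peak / src_peak)
```

With clip, everything below the display peak is reproduced exactly and everything above is lost; with scale, nothing clips, but the whole image darkens by the factor dst_peak/src_peak.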
Regarding the display's reproducible range: having the user supply it is certainly fine for the moment, especially since with some displays, "energy saving" options will have an influence on the maximum luminance that can be displayed. If it were possible to fetch the display name as transferred in the EDID (I don't know whether that's possible), then one could think of collecting presets for popular displays from the community, in the form of some preset-config file. Regarding the hdr-tonemapping option: it sounds like a good idea. I'm not sure which functions the current hardware players utilize when they map SMPTE-2084 luminance values into bt.709 values, but according to experience reports, both the Samsung and the Panasonic UHD BluRay players do a decent job at this - they have to do that mapping at least whenever a UHD BluRay disc is to be played via HDMI 2.0 (and not "2.0a"). From the (layman) descriptions I can read in experience reports on these players, it sounds like some form of sigmoid function is used, with a broad linear part and a small range with a less steep slope at the very dark and very bright ends. |
I just compiled mpv from the source as of haasn@0fb307c - but I am somewhat confused about whether the conversion is available with Using |
It only works with --vo=opengl. You should probably also set target-brightness. |
Ok, then it will take at least until tomorrow before I can test the feature - the only computer I currently have access to doesn't do well with "-vo gl" (old nVidia GPU, current commercial nVidia drivers no longer supporting it, current open-source drivers unstable with acceleration... just the usual nVidia trouble). |
I now had the opportunity to test haasn@e047cc0 with some example HDR files. But so far, I've not found a set of parameters that works for more than one of the demo files - can you give an example which file did look good for you with what parameters? |
Note that the effect is even more pronounced if you use a lower monitor brightness. If I set my display to brightness 100 (350 cd/m²), the result with naive clipping is almost passable. But if I set it to my normal setting of brightness 0 (63 cd/m²), the result with naive clipping is very bad. In a sense, this result codifies the limitations of our existing standard-range technology. |
@haasn: First of all, thanks a lot for your great contribution on HDR video! - I tried now with 26b6d74, and things look really, really good! I've experimented a little, sitting in front of my laptop, which has a very good IPS panel, and next to it my actually HDR-capable LG OLED TV. I set my (uncalibrated) laptop display with So far, the parameters I found to most closely (and remarkably closely!) resemble the look of the "original" HDR display on the LG were: Without the The
BTW: If you want to experiment with other HDR tonemapping functions, please consider the CIECAM02 Lab based tonemapping as implemented e.g. in https://github.com/Beep6581/RawTherapee - that method gives me marvellous results when I need to squeeze high-dynamic-contrast still images into the bt.709 colorspace. (I'm usually using the parameters |
Yes, the default behavior of mpv is to not adapt gamuts unless the user requests it. (Since more people seem to complain about that than people seem to understand it) Maybe we could change this in the future for BT.2020 files? (using the same reasoning as tone mapping HDR by default)
Oh good, so we have somebody with an actual HDR TV, so we can run real-world comparisons! It would help a lot if you could somehow measure the brightness of your display and set that as the true target-brightness. Note that “overshooting” the target-brightness is a “safe” operation; all it does is effectively make the image darker (linearly), which can help it match your true display brightness better. Using hdr-tone-mapping=linear has a similar effect; in your case of 600, you would be setting tone-mapping-param=16.6 in order to get a similar output brightness.
I'll have a look, thanks for pointing it out. |
I was assuming that mpv does already try to automatically adapt colorspaces - mpv's predecessor MPlayer did that starting from 2009, and the "defaults" description at https://github.com/mpv-player/mpv/blame/master/DOCS/man/vf.rst#L218 led me to believe that mpv had probably continued this tradition. The visible differences between bt.601 and bt.709, when confused, are much smaller than between bt.709 and bt.2020 - so I'd say that if the information about which colorspace a file was encoded for is available at all, an automatic conversion should be enabled whenever the display colorspace is a different one.

Regarding the measurement of my display luminance: I do not have the hardware for this kind of measurement, plus it's extremely difficult to measure "one value" for LG's OLED displays - maximum luminance depends heavily on the size of the "white" area; some magazine measured 417 cd/m², small white areas are "brighter", bigger white areas are "less bright". I'm pretty sure LG applies "more than just clipping" in HDR mode, if only because they use different "whites" depending on the amount of light to be emitted by the display as a whole.

Regarding "Using hdr-tone-mapping=linear has a similar effect, in your case of 600 you would be setting the tone-mapping-param=16.6 in order to get a similar output brightness" - yes, just tested that, and it looks as good as the above-mentioned "simple" setting.

I guess things will become more complicated if "Dolby Vision" HDR material is to be displayed, as "Dolby Vision" uses different parameters "per scene", so it might not look good if such material were replayed without interpreting new SEI information whenever it appears in the data. But that's not something I'd target for now; my current quest is one for an open-source workflow for people to produce their own HDR videos, and most people will certainly be totally fine with static HDR parameters. |
(I have created issue #3157 for a feature request very similar to the SMPTE-2084 transfer function support, but for V-Log - which would support a HDR video creation workflow "from the other end", right after recording.) |
I think you are confusing two separate things here. This is a function for auto-detecting the YUV matrix, which is a trend continued by mpv. (In fact, mpv also tries to auto-detect primaries and gamma curves - two concepts of which MPlayer is ignorant.) mpv does use the auto-detected YUV matrix when converting from YUV to RGB, which is a mandatory step during playback. However, mpv does not use the auto-detected primaries or gamma curve for anything, since colorspace or gamma conversion is not strictly necessary. (Except in the case of HDR, XYZ or linear-light input, in which case it will just pick some sane fallback.)
I agree in principle, and it might be time to revisit this stance. |
Ah, got it. BTW: I mentioned earlier in this issue that AMD announced HDR / HDMI 2.0a support for GPUs coming in 2016; meanwhile nVidia has also announced that their new GTX 1070 and GTX 1080 graphics cards will support HDR, and they specifically advertise that their game video capture software "Ansel" will allow recording HDR videos - and given that Youtube has also announced HDR support, I am confident that the amount of publicly available HDR videos will increase quickly, and soon. |
(I noticed there's now a new default tone mapping function "hable" for HDR material in mpv - looks good!) |
Youtube now hosts HDR videos encoded using VP9, in the .webm container. mpv can already replay them when instructed about the colorspace and EOTF:
But it would be great if mpv could pick up the information that the video is using BT.2020 and SMPTE-2084 automatically from the container - this document seems to describe where to find this information: https://www.webmproject.org/docs/container/
|
By the way, your |
Should already work in git master as of a few days ago. Also I think this can be closed. |
@haasn: Thanks for pointing this out - great to hear this has already been implemented! However, I just compiled mpv from the git master head, and the automatic recognition does not seem to have the desired effect for Youtube HDR videos like the example I linked above.
If instead, I use
Any idea what went wrong here? (The same symptom occurs with other Youtube HDR videos, as collected in their HDR launch playlist.) (If you'd prefer a separate issue opened for this topic, sure, can do - I just didn't want to lose context.)
@lvml Could it be that your ffmpeg version is too old to support ST2084? It works on my end, but the fact that it comes up as @wm4 this might be a reason to implement our own conversion function after all |
@haasn: I was trying with ffmpeg compiled from commit 69449da436169e7facaa6d1f3bcbc41cf6ce2754, Date: Mon Sep 26 20:25:59 2016 +0200. Not the latest, but not that old. A grep in the ffmpeg sources I used finds:
I can try with the latest ffmpeg git head tomorrow, if that promises to help. |
Might be a good idea to just use the h264 values in mpv too. |
@haasn: Regarding the strange contradiction between the codec and the container attributes regarding colorspace: I would assume that the original encoder (in the case of the above link, some Atomos hardware field recorder storing the material in ProRes) simply did not know the correct colorspace, and this information was probably only later attached to the container of the encoded video stream - possibly after the conversion to VP9 by Youtube.
Hi, So a retro-compatibility mode where "HDR" content or native xvYCC content (as proof there is this:
Didn't you change your mind? |
I noticed that recent commits addressed various "colorspace conversion via LUTs" topics, and that made me wonder whether mpv could be enabled to allow for (automatic or semi-automatic) replay of video material using the electro-optical transfer function defined in SMPTE-ST-2084.

SMPTE-ST-2084 is supported by HDMI 2.0a, already supported by contemporary high-end TV sets, part of the upcoming UHD-Bluray disc standard and Dolby ITU-6C recommendation, and supported by

x265 --transfer smpte-st-2084

(useful when encoding material from cameras with high-dynamic-range sensors - such as "full frame DSLRs", which easily exceed the dynamic range representable with bt.709).

I understand that one can parametrize mpv in a way to utilize ffmpeg filters to apply a LUT; I tried this once (using

(maxval-minval)*pow((max(0\,(pow((val/(maxval-minval))\,.01268331351565596512)-0.8359375))/(18.8515625-18.6875*pow((val/(maxval-minval))\,.01268331351565596512)))\,6.27739454981539438107)

as the LUT formula to yield the luminance), but that seriously sucks, not only because it slows down replay a lot, but also because one will certainly want to adjust the mapping from the absolute luminosity scale of SMPTE-ST-2084 (0 to 10,000 cd/m²) to the luminosity of the output device (likely much smaller).

The lower-hanging goal would be to enable mpv to
A more difficult goal (probably depending on GPU vendors making HDMI 2.0a support available in their hardware/drivers) would be to enable mpv to replay SMPTE-ST-2084 encoded content on actually HDR capable screens.
(I do realize that this feature idea might certainly not be the most urgent one, but I think it's a good idea to have it at least on file / for discussion.)
Here are some links to related material: