Many people are, understandably, confused about brightness levels in content creation and consumption - both for SDR and for HDR content. Even people who do content creation as their job sometimes get it really wrong.
Assuming that all monitors do anything specific at all would be a folly? No, there are no assumptions there; the sRGB spec has no ambiguity when it comes to the transfer function of the display.
That a certain percentage of displays don’t behave as expected is annoying, but doesn’t really change anything (beyond allowing the user to change the assumed transfer function in SDR mode).
The video https://www.youtube.com/watch?v=NzhUzeNUBuM goes more in depth, but it is very true to say that “some displays decode with the inverse OETF and some don’t”; this issue has been plaguing displays for decades now.
There are no assumptions there, the sRGB spec has no ambiguity when it comes to the transfer function of the display.
You are 100% right in saying “the reference display is gamma 2.2”. However, we can only wish that this were what displays actually do; Color.org themselves got this wrong (https://www.color.org/srgb.pdf), and it leads people astray.
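To make the difference concrete, here is a quick Python sketch (my own toy code, not from any spec implementation) of the two decode interpretations: the piece-wise curve from the sRGB spec (the inverse OETF) and a pure 2.2 power law. They agree at black and white but diverge noticeably in the shadows.

```python
def srgb_piecewise_eotf(v):
    """Decode a non-linear sRGB value (0..1) with the piece-wise curve (inverse OETF)."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v):
    """Decode with a pure 2.2 power law, like a gamma 2.2 reference display."""
    return v ** 2.2

# The curves match at 0 and 1 but differ most in the shadows:
for code in (16, 32, 64, 128, 255):
    v = code / 255
    print(f"code {code:3}: piece-wise {srgb_piecewise_eotf(v):.5f}, pure 2.2 {gamma22_eotf(v):.5f}")
```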
The most likely actual reason Windows uses the piece-wise transfer function for HDR is that it did the same in SDR mode too - where, however, the default ICC profile was also piece-wise sRGB, so it canceled out on 99% of PCs and had no negative effects.
I don’t actually believe this to be the case; if it were, people who use custom ICCs would get extremely wonky results, which doesn’t typically happen. On the other hand, it is very true that doing it the way they do gives you the “least offensive” results. Though IMO the best solution would simply be to default to pure 2.2 and allow users to override the transfer function. The Color protocol allows for explicit piece-wise sRGB anyway, so doing this should fit right into a fleshed-out color-managed setup.
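For what it’s worth, the “cancels out” argument is easy to check numerically. A rough sketch (assuming an sRGB-encoded buffer, nothing Windows-specific):

```python
# Toy check of the cancellation argument: a dark linear value encoded with the
# piece-wise sRGB OETF survives a piece-wise decode unchanged, but comes out
# darker if the decode side assumes a pure 2.2 power law instead.

def srgb_oetf(l):
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

def srgb_inverse_oetf(v):
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

linear = 0.01                      # a dark linear-light value
encoded = srgb_oetf(linear)        # what ends up in the buffer
print(srgb_inverse_oetf(encoded))  # ~0.010: piece-wise encode + piece-wise decode cancel out
print(encoded ** 2.2)              # ~0.006: piece-wise encode + pure 2.2 decode darkens the shadows
```

So as long as the encode and the assumed decode use the same piece-wise curve, nothing changes; it is only when one side assumes pure 2.2 and the other does not that the shadows shift.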
That’s a very different thing. Pushing viewing environment adjustments to the display side makes some amount of sense with SDR monitors - when you get an SDR display with increased luminance capabilities vs. the old one, you change the monitor to display the content comfortably in your environment.
I think I am a bit confused about the laptop analogy then; could you elaborate on it?
With HDR though, if the operating system considers PQ content to be absolute in luminance, you can’t properly adjust that on the monitor side anymore, because a lot of monitors completely lock you out of brightness controls in HDR mode, and the vast majority of the ones that do allow you to adjust it only allow you to reduce luminance, not increase it above “PQ absolute”.
How monitors typically handle this is beyond me, I will admit, but I have seen some really bonkers ways of handling it, so I couldn’t really comment on whether or not this holds true one way or another. Just so I am not misinterpreting you, are you saying that “if you feed 300 nits of PQ, the monitor will not allow it to go above its 300 nits”? If so, that is not what happens on my TV unless I am in “creator/PC” mode. In other modes it will allow it to go brighter or dimmer.
My current monitor is only a 380 nit display, so I can’t really verify that (nor do I have the hardware to do so at the moment).
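For reference, the “300 nits of PQ” part is easy to pin down from the curve itself; here is a small Python sketch of the SMPTE ST 2084 functions, using the constants from the spec. Whether a given display actually honours the absolute interpretation is the separate question above.

```python
# ST 2084 (PQ): a given signal level always corresponds to the same number of
# nits, no matter which display it is sent to - that is what "absolute" means.

m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Inverse EOTF: absolute luminance in cd/m^2 -> non-linear signal (0..1)."""
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def pq_decode(signal):
    """EOTF: non-linear signal (0..1) -> absolute luminance in cd/m^2."""
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

signal = pq_encode(300)            # the signal level that nominally means "300 nits"
print(signal, pq_decode(signal))   # ~0.62 and 300.0, regardless of the attached display
```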
I didn’t claim that PQ had only one specification that uses it; I split up SMPTE ST 2084, rec.2100 and BT.2408 for a reason. I didn’t dive into it further because a hundred pages diving into every detail that’s irrelevant in practice is counterproductive to people actually learning useful things.
Ah, I see, I was a bit confused about what you had meant then. My apologies.
Can you expand on what you mean with that?
Keep in mind this was based on the above misinterpretation of what I thought you meant.
With libjxl, it doesn’t really default to the “SDR white == 203” reference from the “reference white == SDR white” common… choice? Not sure how to word it… Anyway, libjxl defaults to “SDR white = 255” or something along those lines, I can’t quite remember. The reasoning for this was simple: that was what they were tuning butteraugli on.
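To illustrate what that choice changes, here is a rough Python sketch of the idea (not libjxl code, and the 255 figure above is from memory, not verified here): pinning SDR white to a different nit value moves every SDR pixel to a different absolute luminance.

```python
# The same sRGB pixel lands at a different absolute luminance depending on the
# nit value you pin SDR reference white to. 203 is the BT.2408 figure; the
# other values are just alternative choices for comparison, not claims about
# what any particular library does.

def sdr_to_nits(srgb_value, sdr_white_nits):
    linear = ((srgb_value + 0.055) / 1.055) ** 2.4 if srgb_value > 0.04045 else srgb_value / 12.92
    return linear * sdr_white_nits

for sdr_white in (100, 203, 255):
    # an sRGB value of 0.5 ends up at roughly 21, 43 or 55 nits depending on the choice
    print(sdr_white, sdr_to_nits(0.5, sdr_white))
```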
That “directly” is very important, as it does very much make both these signal levels the same. As I wrote in the blog post, the spec is all about broadcasts and video.
Other systems do sometimes split these two things up, but that nearly always just results in a bad user experience. I won’t rant any more about the crapshow that is HDR on Windows, but my LG TV cranks the brightness of its UI up to the absolute maximum while an HDR video is playing. If they adhered to the recommendations of BT.2408, it would work much better.
I think this is an issue of terminology and stuff; reference white is something the colourist often decides. Assuming that HDR graphics white == SDR white actually causes more problems than it solves. I would say that it is a “good default”, but not a safe value to assume; this is something the user may often need to override. I know that personally, even when just watching movies in MPV, this is something I very often need to play with to get a good experience, and that is not even counting professionally done work.
That’s just absolute nonsense. The very, very vast majority of users do not have any clue whatsoever what transfer function content is using, or even what a transfer function, buffer encoding or even buffers are; the only difference they can see is that HDR gets brighter than SDR.
And again, this too is about how applications should use the Wayland protocol. This is the only way to define it that makes any sense.
This actually isn’t really true. It is indeed the case that users won’t know what transfer function content is using, but they absolutely do see a difference other than “HDR gets brighter than SDR”, and that is “it’s smoother in the dark areas”, because that is equally true.
Users have a lot of different assumptions about HDR, but they all follow some sort of trend: “it makes the content look smoother across a greater range of luminance”. If I were to give a “technical definition that follows general user expectations”, it would be something along the lines of “a transfer function that provides perceptually smooth steps of luminance at a given bit depth up to at least 1000 nits in a given reference environment”, which is bad for sure, but at the very least it more closely aligns with general expectations of HDR given its use in marketing.
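That “smoother in the dark areas” point is easy to put numbers on. A rough sketch of my own comparison (it deliberately mixes bit depth and transfer function, since typical SDR delivery is 8-bit and typical HDR delivery is 10-bit):

```python
# How many code values land below 1 nit for 8-bit gamma 2.2 on a 100 nit SDR
# display vs 10-bit PQ? More codes in the darks means smaller, less visible steps.

m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_nits(signal):
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

def gamma22_nits(signal, peak=100):
    return peak * signal ** 2.2

print(sum(gamma22_nits(c / 255) < 1 for c in range(256)))   # ~32 codes below 1 nit
print(sum(pq_nits(c / 1023) < 1 for c in range(1024)))      # ~154 codes below 1 nit
```

Several times more code values land in that bottom nit with 10-bit PQ, which is a big part of why the dark end looks smoother even before any extra peak brightness comes into play.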
(I really hate the terms HDR and SDR btw; I wish they would die in a fire for any technical discussion, and I really wish we could dissuade people from using them.)