[css-color-4] sRGB doesn't really use 80 cd/m^2 white luminance #3435
Related: SDR and HDR compositing
The sRGB spec was designed for someone in a very dim office, using decades-old hardware. Modern devices get used in bright rooms, dim rooms, completely dark rooms, outside under moonlight, outside in direct sun, ... Screen brightness varies dramatically from one device to another, sometimes set to be about the same as a sheet of paper under room lighting; other times set to be much brighter than anything else nearby; other times quite dim relative to outdoor light. Ideally authors would figure out exactly what the viewing conditions were, and tailor their content to each class of viewers. Realistically though, it’s hard to pick any kind of sane default, given the wide range of common conditions. You should try asking some real color scientists for advice about this one. And maybe do some sociological research about device users.
You mean like @svgeesus?
Well, no. It started with the HDTV standard, ITU-R BT.709, which is designed for a very dim (dark) viewing condition; sRGB then used the same primaries but a slightly different transfer function and increased viewing flare, for a typical office environment. I know because I discussed the expected viewing environment with the authors of sRGB when they suggested it at the W3C Print workshop in 1996.
However, you miss the point of this issue, which is primarily about compositing SDR content (like sRGB web content) onto HDR video as an overlay. Assuming 80 cd/m^2 gives very bad results, as engineers from, for example, Netflix and the BBC have frequently pointed out.
Okay fair enough. Is there any HDR video getting used in practice on the web? Do you have a concrete example (say a link) of sRGB content getting composited onto HDR video, and what it looks like? Does black point compensation get used ever on the web? Is CSS compositing going to get more complicated/capable than expressed in https://www.w3.org/TR/compositing-1/? Is there a link to that somewhere? Anyhow, you probably want to assume the same brightness and context for both the HDR and SDR content. If your HDR content has some extreme specular highlights, you could figure out what brightness is used for a diffuse reflector (e.g. a piece of white paper in the HDR video) and use that as the brightness for your SDR content.
Do you have a link?
In-TV browsers and apps are using it, and getting that content onto the open web is an area of active current development. HDR video players which use TTML captions are also doing SDR onto HDR compositing.
It doesn't (except in WCAG contrast calculations, which assume a fixed 5% viewing flare), and probably should, especially once color-managed CMYK and other ink profiles get used in Web-to-PDF content.
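To make the flare term concrete, here is a minimal sketch of the WCAG 2.x contrast-ratio calculation, in which the two `+ 0.05` terms are exactly the fixed 5% viewing flare mentioned above. This is an illustration of the published WCAG formula, not of any CSS implementation.

```python
# WCAG 2.x contrast ratio; the two +0.05 terms model a fixed 5% viewing flare.

def srgb_channel_to_linear(c):
    """WCAG 2.x linearization of one sRGB channel value in [0, 1]."""
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(r, g, b):
    """Relative luminance of an sRGB color (components in [0, 1])."""
    return (0.2126 * srgb_channel_to_linear(r)
            + 0.7152 * srgb_channel_to_linear(g)
            + 0.0722 * srgb_channel_to_linear(b))

def contrast_ratio(rgb1, rgb2):
    """(L_lighter + 0.05) / (L_darker + 0.05); 0.05 is the flare term."""
    l1, l2 = relative_luminance(*rgb1), relative_luminance(*rgb2)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white: the flare term caps the maximum ratio at 21:1.
print(contrast_ratio((1, 1, 1), (0, 0, 0)))  # 21.0
```

Note how the flare term dominates near black: without it, the black-on-white ratio would be unbounded.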
That would a) be physically impossible, because the screen can't display that luminance level across the whole screen, and b) be highly undesirable, because it would burn out your eyes.
that is kind of the point of HDR
What you are describing is called the paper white or, more generally, the media white. And knowing that level is precisely why I opened this issue.
I primarily meant that they had pointed this out in in-person discussions. But see for example https://downloads.bbc.co.uk/rd/pubs/papers/HDR/BBC_HDRTV_FAQ.pdf and https://www.w3.org/2017/11/07-colorweb-minutes.html#meanings and https://www.w3.org/2019/09/17-colorweb-minutes.html Also (in progress) https://w3c.github.io/ColorWeb-CG/#goals
I’m not really clear which display you are talking about. This seems like an entirely display-/context-dependent question. My main familiarity is with using image processing to display high-dynamic-range scenes on a standard display, and with editing photographs such that a display of above-average brightness (e.g. mobile displays of the past decade, but still using an otherwise standard output pipeline) shows "media white" for the scene dimmer than usual, to give myself more headroom for higher brightness/colorfulness in particular areas of the image, without necessarily having any extreme specular highlights. (For intended display in well lit environments.)

I guess there are now starting to be non-TV displays with explicit “HDR” support? If so, those probably have a listed intended max brightness for “media white”. I would expect those specs to vary widely and to change quickly from year to year. My impression is that ideally the media white in a well lit setting (like a well lit office or outside in the shade) should be set roughly comparable to white paper in the same lighting. The film people probably have some guidance for what to do in a very dark theater type setting. You can probably get a decent default guess with something in the 200–500 nits range.

But I’d expect for practical use you’d want to always composite SDR with HDR with the expectation that the SDR content uses the media white for the specific display settings at viewing time. Can that just be declared in the specification?
I suggest you read up on standards for HDR screens before commenting further. The difference between peak full-screen luminance and peak small-area specular highlight luminance is fairly crucial to understanding how HDR works.
Here’s what Poynton’s thesis says:
This is now a few years out of date (such displays are only now starting to become available), but seems like a reasonable baseline to me. From what I can tell searching around, both the display hardware and the expected display processing pipeline are changing significantly from year to year, and there are several competing specifications. I don’t think you’ll be able to get a definitive source until the industry settles down a bit. But irrespective of how hardware evolves, the appropriate brightness for diffuse white is going to depend substantially on viewing context. What is appropriate for looking at a TV in a dark room is not going to be appropriate for a phone display outside. Most industry white papers I can find are pretty useless on this question, and it seems like vendors have been more concerned with peak luminance in highlights or short flashes than with specifying a target for diffuse white. This one mentions:
For viewing SDR content in the middle of an HDR program on a television, Report ITU-R BT.2390-7 recommends that the SDR content first be converted to linear RGB, then scaled so that its peak brightness is comparable to HDR diffuse white, and then have the HLG or PQ inverse EOTF applied.
So by this standard, 200 nits would seem to be the recommendation for viewing on a TV in dim lighting.
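The three-step mapping described above can be sketched in a few lines. This is my own illustrative reading of it, not a reference implementation: the PQ constants are the published SMPTE ST 2084 values, but the 200 cd/m² SDR-white level is just the figure from this discussion, passed in as a parameter.

```python
# Sketch of the BT.2390-style SDR-onto-PQ mapping: decode sRGB to linear
# light, scale SDR peak white to a chosen HDR diffuse-white level
# (200 cd/m^2 here, per this thread; not a universal constant), then
# apply the PQ inverse EOTF.

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def srgb_eotf(v):
    """sRGB electro-optical transfer function: signal [0, 1] -> linear [0, 1]."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def pq_inverse_eotf(luminance_nits):
    """Absolute luminance in cd/m^2 -> PQ signal value in [0, 1]."""
    y = luminance_nits / 10000.0
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

def sdr_to_pq(v, sdr_white_nits=200.0):
    """One sRGB component in [0, 1] -> PQ code, with SDR white at sdr_white_nits."""
    return pq_inverse_eotf(srgb_eotf(v) * sdr_white_nits)

print(sdr_to_pq(1.0))        # ~0.58: SDR white placed at 200 cd/m^2
print(sdr_to_pq(1.0, 80.0))  # ~0.49: the duller result at the sRGB 80 cd/m^2
```

Running the same mapping with 80 cd/m² shows why that assumption looks dull: SDR white lands noticeably lower on the PQ signal scale than HDR diffuse white.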
Related: w3c/ttml2#1118
For displaying SDR content, it is common practice for the user to adjust screen brightness as they see fit, in response to (widely varying) viewing conditions. Thus the official 80 cd/m² has no impact on SDR usage. It matters when SDR content is composited with HDR content that uses an absolute luminance scale (PQ); as far as I can see it does not matter with HDR content which uses a relative scale (HLG). And the important thing is to avoid the following obvious traps, in order of seriousness:
Looking at the Reference Level Guidelines for PQ (BT.2100), from Dolby Laboratories, Aug. 9, 2016:
That seems enough of a recommendation to put in a future CSS Color specification which includes HDR (Rec. BT.2100 PQ, Jzazbz which also uses PQ, etc.); and to close the issue for CSS Color 4.
Hi Chris @svgeesus and Hi Jacob @jrus. I realize this is closed, but just wanted to mention that part of SAPC is a standardized observer environment, intended to supplant the 80 cd/m² white with something relevant. It's a work in progress, but essentially the idea is to set peak white at five times ambient. I.e. if the ambient surround is 32 cd/m², then set white at 160 cd/m². This is obviously in keeping with the "ambient surround should be 20% of peak white" guideline.

In practice, per various surveys, people set their displays and devices somewhere between 140 cd/m² and 320+ cd/m²... not even considering high end phones that display over 1200 cd/m². The point being, it's not about mapping to an absolute level as much as mapping to a level appropriate for the display environment. And also, the IEC standard for sRGB is largely irrelevant in regards to white luminance, as users and automatic brightness adjustment fully dismiss that aspect of the standard.
Setting diffuse white = 5 times ambient surround is going to be roughly comparable (for a typical room or outdoor setting) to setting the diffuse white to the same brightness as a piece of white paper under ambient illumination, which was my recommendation upthread. Is there a clear spec / description somewhere of exactly what level devices set with automatic brightness adjustment turned on?
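A back-of-envelope check of that equivalence, under assumptions of my own: a ~20%-reflectance surround and ~90%-reflectance paper, both treated as ideal diffusers (L = ρ·E/π), in typical office lighting.

```python
import math

# Rough check that "diffuse white = 5 x ambient surround" lands near the
# luminance of white paper under the same lighting. The reflectances
# (20% surround, 90% paper) and the 500 lx figure are assumptions.

def diffuse_luminance(illuminance_lux, reflectance):
    """Luminance (cd/m^2) of an ideal diffuse reflector under given lux."""
    return reflectance * illuminance_lux / math.pi

office_lux = 500.0                              # typical office lighting
surround = diffuse_luminance(office_lux, 0.20)  # ~32 cd/m^2 surround
print(5 * surround)                             # ~159 cd/m^2 (5x-ambient rule)
print(diffuse_luminance(office_lux, 0.90))      # ~143 cd/m^2 (paper white)
```

The two rules agree to within about 10% here, because 5 × 0.2 = 1.0 is close to paper's ~0.9 reflectance.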
Hi Jacob @jrus, the standard for 20% ambient to white, aka 5x ambient, appears in a lot of places as a viewing condition: ITU, SMPTE, ICC, IEC.

The "Crazy Complete One": if you don't have a copy you might like the ICDM displays standard; it's a free download and it's over 500 pages. https://www.icdm-sid.org/downloads/index.html It covers everything, but I didn't see auto adjustment... I'll have to dig, but I know there have been some research papers out of Samsung and others... but I'm not aware of a specific standard, and considering that devices with automatic brightness also have a luminance level control that is very easy for the user to adjust, and screen technologies with massively different peak white capabilities, I'm not too sure there is much potential for a standard, other than what each manufacturer is doing to outdo the other... Then the question is, is there a useful API... or a not useful one...

A
The US Department of Energy (DOE) recommends an S-shaped curve to adjust TV luminance depending on ambient illumination. The luminance levels for dark room conditions are based on a recommendation by the Imaging Science Foundation (ISF), while the luminance levels for brighter conditions are based on a Japanese study from 2010 on preferred luminance by Kishimoto et al. (https://www.jstage.jst.go.jp/article/itej/64/6/64_6_881/_pdf/-char/en).

That study evaluated preferred luminance depending on ambient illuminance, angle of view, and average picture luminance. Tested ambient illuminances were 30 lx, 100 lx and 300 lx. Subjects were divided by age into 24 young subjects (mean age 22) and 24 old subjects (mean age 71). Separate regression analyses for the two age groups led to the following formulas:

Young: log PL = 2.40 + 0.27 log E - 0.22 log SA - 0.32 log AL

According to the model, at an illuminance of 100 lx, a viewing angle of 20° and an average picture luminance of 25%, the preferred luminance for young people is 161 cd/m², and for old people it is 248 cd/m². The DOE curve is probably based on a mean of values for young and old people.
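Plugging the stated conditions into the quoted young-group regression reproduces the 161 cd/m² figure. I am assuming base-10 logarithms, as is conventional for regressions of this form; the old-group formula is not quoted above, so only the young-group value is checked.

```python
import math

# Evaluate the quoted Kishimoto et al. regression for the young group:
#   log PL = 2.40 + 0.27 log E - 0.22 log SA - 0.32 log AL
# with E in lx, SA the viewing angle in degrees, AL the average picture
# luminance in percent. Base-10 logs assumed.

def preferred_luminance_young(e_lux, sa_deg, al_percent):
    log_pl = (2.40 + 0.27 * math.log10(e_lux)
                   - 0.22 * math.log10(sa_deg)
                   - 0.32 * math.log10(al_percent))
    return 10 ** log_pl

print(preferred_luminance_young(100, 20, 25))  # ~161 cd/m^2
```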
The definition of sRGB, which is also given in css-color-4, says that the white luminance level is 80 cd/m^2.
In practice the level is typically higher, often significantly higher, like the 160 cd/m^2 used by Adobe RGB (1998).
This impacts the black level, which is raised due to flare, when black point compensation is in use.
This also affects compositing SDR content (such as most web pages) onto HDR content (video, images), which is often done for information overlays and mixed HDR/SDR content in general. The result looks very dull if 80 cd/m^2 is used.
Is there a more modern, reference-able recommendation to use a higher white level for sRGB?