OK, this has been a really confusing topic that I have to keep reminding myself how it works. But basically, a nit, or more properly cd/m², is a measure of how bright a screen is. SDR, or standard dynamic range, covers a range of basically 0 to 100 nits, but high dynamic range (HDR) can theoretically go to 10,000 nits.
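If you want to see where that 10,000 figure comes from, HDR10's PQ curve (SMPTE ST 2084) maps a 0-to-1 signal value onto an absolute 0-to-10,000-nit scale. Here is a minimal sketch of that math in Python (my own naming; the constants come straight from the standard):

```python
# PQ (SMPTE ST 2084) EOTF: turns a normalized signal in [0, 1] into
# absolute luminance in nits (cd/m^2), topping out at 10,000.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Convert a PQ-encoded signal (0..1) to luminance in nits."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))     # 10000.0 nits: the theoretical ceiling
print(pq_eotf(0.508))   # ~100 nits: roughly where SDR white lands
```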
The simple route is the HDR Tools trick: take SDR content and apply the HDR Tools effect set to HLG to PQ (which makes some sense; HLG uses close to the same transfer curve as SDR, so this expands the colors pretty close to right).
But you can do something a little different, which is to go to a clip, click the settings in the upper right, and change the color space from Rec. 709 (the SDR default) to Rec. 2020 PQ. This expands the range to that full 10K nits and will look terrible for an SDR clip like one from a Zoom session.
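Why terrible? Because PQ is an absolute curve, so the same pixel values suddenly mean wildly brighter light. Continuing the pq_eotf sketch above, and assuming a simple 2.4 display gamma for SDR, here's a toy comparison:

```python
# What a pixel's signal means under SDR gamma vs. decoded as PQ.
# Mistagging SDR as PQ pushes everything brighter, and full white
# jumps from 100 nits all the way to the 10,000-nit PQ ceiling.
def sdr_nits(signal: float, peak: float = 100.0, gamma: float = 2.4) -> float:
    """Approximate SDR display light with a simple power-law gamma."""
    return peak * signal ** gamma

for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:4.2f}: SDR {sdr_nits(s):6.1f} nits -> as PQ {pq_eotf(s):7.1f} nits")
```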
But it gives you full control in the color palette section. You can add a color wheel, turn on the luminance view on the left, and do what is called color grading. This means fixing all the colors and brightness levels, pushing them up toward that 10K-nit range. You can really mess this up, but you can always delete the palette and start over.
The big problem, though, is that you have to know what your target displays are, and that's complicated for a few reasons:
- The brightness level depends on how many pixels are turned on. If, for instance, there is just one bright spot (reviews denote this by what percentage of the screen is lit), you get peak brightness, but if the entire screen is lit, it's much lower. As an example, the new Samsung OLED can sustain 197 nits full screen, but given a small highlight calling for say 1,900 nits, it will actually display about 700. That is actually very good performance, but you can see that if you set the luminance of your video too high, all the detail just gets crushed out into white, so you have to manage your video pretty carefully. The culprit is something called the Automatic Brightness Limiter (ABL) in all TVs that basically knocks down peak brightness and limits it (see the toy sketch after this list).
- When mastering, the norm is an absolute maximum brightness of 100 nits for SDR and 400 nits for HDR, so you stay well below what the target displays can do. Staying under 400 nits is good and safe, but you could go higher if you know the targets. And remember, most monitors can sustain 400 nits across a lot of the screen.
- Even new TV screens are not that bright, and screens perform really differently. For instance, the very latest Samsung S95B (the 2023 model) can hit 1,023 nits on a 2% window. But even the two-year-old and very good LG C1 does 751 nits on a 2% window, and for a peaky night hallway scene calling for 1,900 nits, it will actually push 639 because of ABL and the panel. A sustained 50% white window is just 263 nits, and 100% is 125 nits.
- Older screens from 2019 and before are even less able to produce those peaks. For instance, our LG B9 from 2019 does just 603 nits at a 2% window, and that was a state-of-the-art TV four years ago.
- iPhones are super bright. If you think about it, this makes some sense: the screen is smaller and closer to you. The peak brightness is an incredible 1,600 nits for a small part of the screen, so that peak is probably comparable to a 2% window at 1,600 nits. So you can see the problem: things that look great on an iPhone are going to look terrible on a TV.
- The Apple Pro Display XDR sort of explains why a 32-inch monitor can cost $5K. This monitor can sustain 1,000 nits with a 1,600-nit peak, while most monitors and TVs have a hard time sustaining 200-300 nits. I'm not saying it's worth it, but you can actually produce HDR for iPhones using it.
- The iPad Pro 12.9”: here's one reason to get the latest version. Its panel also has 1,000-nit sustained and 1,600-nit peak brightness.
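To make the ABL point concrete, here's a toy model that interpolates a panel's displayable peak from window size. The 2%, 50%, and 100% points are the C1-style figures quoted above; the 10% and 25% points are my own guesses to round out the curve, so treat the whole thing as illustrative:

```python
# Toy ABL model: the peak nits a panel can show as a function of how
# much of the screen is lit. Real panels publish nothing like an
# official curve, so this is purely for getting intuition.
window_pct = [2, 10, 25, 50, 100]       # % of screen lit
peak_nits = [751, 700, 500, 263, 125]   # displayable peak at that window

def abl_peak(pct: float) -> float:
    """Linearly interpolate the displayable peak for a given window size."""
    if pct <= window_pct[0]:
        return peak_nits[0]
    for i in range(len(window_pct) - 1):
        x0, x1 = window_pct[i], window_pct[i + 1]
        if x0 <= pct <= x1:
            y0, y1 = peak_nits[i], peak_nits[i + 1]
            return y0 + (y1 - y0) * (pct - x0) / (x1 - x0)
    return peak_nits[-1]

# A 1,900-nit highlight over ~5% of the frame gets clamped to ~730 nits;
# any detail graded above that ceiling is simply gone.
print(min(1900.0, abl_peak(5)))
```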
Net, net, the tutorial I linked to says to try to keep peak brightness under 1,000 nits. What he's really saying is to make it bright enough for iPhones. But then how do you know it's good? When you are grading, you should probably check an "ordinary HDR display" like an LG B9, and then also a panel in the XDR's class if you really want max quality on the highest-end Apple devices.
Net, net, I'm kind of drooling over an XDR. In the meantime, I'll either keep using the HDR Tools trick, which is basically a direct map (the SDR looks the same in HDR), or do what is called inverse tone mapping or up-mapping, which artificially gives SDR a broader range by lowering the shadows and bumping the brightness. That is what the color wheel grade above is doing. It seems to work best on landscapes and looks strange on people shots.
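For the curious, the simplest version of that up-map is just a power curve that pivots the image: shadows go down a touch while highlights stretch toward the new peak. A toy sketch, not what Final Cut's color wheels literally compute:

```python
# Toy inverse tone mapping: expand SDR luminance (0..100 nits) toward
# a 400-nit HDR target. An exponent above 1 pushes shadows down and
# highlights up, the same gesture as the color-wheel grade above.
def inverse_tone_map(nits: float, sdr_peak: float = 100.0,
                     hdr_peak: float = 400.0, expand: float = 2.0) -> float:
    """Map SDR light in nits onto an expanded HDR range."""
    norm = nits / sdr_peak            # normalize to 0..1
    return hdr_peak * norm ** expand

for nits in (5, 20, 50, 100):
    print(f"{nits:5.1f} SDR nits -> {inverse_tone_map(nits):6.1f} HDR nits")
```

You can also see from this why it flatters landscapes and punishes faces: skin that was sitting at comfortable midtones gets shoved around along with everything else.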
OBS HDR seems to cap at 100 nits
The other interesting thing is that if you take SDR input from, say, a BRIO and output it from OBS Studio as HDR, OBS respects the 100-nit maximum, so it crushes lots of whites. Even if you try to expand it, this "faux HDR" doesn't look super good. You can do a basic conversion with HDR Tools, but the best is to use the Color Wheels to get to say 400 nits from SDR. I find that with anything more than that, at least with Zoom, you don't get great output.
HDR doesn't seem to take effect on YouTube for at least 6 hours
Also, even though I uploaded this with the Apple Devices HEVC 10-bit preset, it didn't seem to convert to HDR, and there was no way to know why. It looks OK in IINA. Some say it just takes time, so I guess I'll have to be patient.
And after an upload that started at 11am and finished at 2pm, the HDR badge finally showed up at 9pm. So you do have to be patient, and it's frustrating that there was no indication it was even working; the HDR just sort of shows up, but it does work.