
Getting the most out of your Sony KD43X85J: HDR decoding and TV optimization


OK, given how hard it was to get all the trimmings working on an LG C9, what about a new 2021 43" Sony LED television, which makes a nice "smaller" monitor? To summarize, here are the things you want to change:

  1. Overscan. This is an annoying default. On old analog TVs, the image would be expanded so you wouldn't get a black box around the picture, and it is still turned on by default even today.
  2. Resolution. There are a lot of standards here, but the typical one is a 4K display (aka UHD or 2160p), which is really 3840x2160. That seems like a strange number until you realize it is 4x the resolution of 1920x1080, which is what 1080p television is. As an aside, this ratio was created because the 16:9 aspect ratio was a good compromise between typical computer monitors at 4:3 and widescreen movies, which are typically 1.85:1 or 2.39:1.
  3. Full range black and white levels. In the old days, monitors would overshoot or undershoot at the extreme highs and lows, so for 8-bit-per-color signals the default is limited range, where values run from 16 to 235; full range is 0 to 255. Most televisions pick limited range as the default, so you typically have to change that by some strange magic like changing the icons assigned to the HDMI inputs.
  4. Low input lag. When you are watching television, modern sets apply all kinds of processing to make the picture look better. You don't care about the lag from input to display because viewing is non-interactive. But if you are using the set for gaming or as a PC monitor, you want low input lag, which means turning all those modes off. Typically, you do this by setting the input type to Game console or PC. A typical TV might add 40-50 ms of lag versus about 15 ms in low-lag mode; a human watching a movie won't notice, but if you have 200 ms to hit an alien in a game, it matters.
  5. Refresh rates. Typical movies run at 24 frames per second and most televisions run at 60 fps. But if you want buttery smooth movement, as in a computer game, you can go to 120 fps on many monitors, and the fastest ones run at 144 fps. This is really important for gaming.
  6. Variable refresh rates. One trick with gaming is that instead of running at a fixed refresh rate like 60 Hz, the display can speed up or slow down to match the computer. This means you don't get screen tearing, where the game can't keep up and the top half of your screen shows a different frame than the bottom. As usual there was a standards war: Adaptive-Sync is part of DisplayPort 1.2a, VRR is part of the HDMI 2.1 standard, and before those there were two vendor technologies, NVIDIA G-Sync and AMD FreeSync. So it would be nice to get VRR in your television as well; LG TVs from 2019 on, for example, have VRR and G-Sync support built in.
  7. High Dynamic Range (HDR) and Wide Color Gamut (WCG). These are usually just lumped together as HDR, but what HDR technically means is much more exposure latitude: the whitest whites and the blackest blacks are greater than ever. WCG is a related concept which says all the colors are bolder, so blues are bluer, and so on. What is called standard dynamic range with a regular color gamut uses a standard called Rec.709. The next step up is DCI-P3, which is used in digital cinema, and the holy grail is Rec.2020. All of this has to get mapped into what a television can actually display, typically quoted as some percentage of DCI-P3 or Rec.2020.
  8. Static or dynamic HDR formats. OK, it gets worse: there are competing formats for encoding the HDR metadata, and as you can imagine there is a lot of money to be made. In the first generation the extra data was static: HDR10 is the baseline every HDR television decodes, but ideally you also want a TV that can decode HLG, which is backward compatible with SDR and will be useful if we ever get broadcast HDR. Finally, there are newer dynamic formats that adjust the encoding scene by scene so they can compress the HDR data (there is way more of it!): the proprietary Dolby Vision, with Samsung fighting back with HDR10+.
  9. Color depth, or bits per color. This means how many bits you get for each color channel. The standard has been what is called true color (aka 24-bit color, aka 8 bits per color or 8 bpc); in the macOS world this is called "millions of colors" and is the default. The problem is that with HDR/WCG you get banding, since 8 bits isn't enough for a really wide gamut like Rec.2020, so you can expand to deep color, which is 10 bpc and allows 2^30 or billions of colors. There are also 12 bpc and even 16 bpc; the last is used in digital editing software so that you don't lose precision during intermediate processing.
  10. Chroma subsampling. Jacking up the color depth from 8 to 10 to 12 bits means way more data, so one way around this is a lossy compression based on the idea that we don't see color detail as well as brightness detail. The notation is really confusing: the numbers describe a 4x2 block of pixels, where luminance (more black or more white, which we are most sensitive to when watching a movie) is always kept for every pixel. In 4:4:4, every pixel in both rows also gets its own color sample; this is what is called uncompressed color. You can reduce the bandwidth needed by going to 4:2:2, where only every other column gets a color sample (half the horizontal color resolution), or to 4:2:0, where on top of that the second row reuses the first row's color samples. Obviously for a computer monitor, getting 4:4:4 is going to be less blurry.
  11. Display Stream Compression. All this means a lot of data across a cable, so the DisplayPort and HDMI specifications include Display Stream Compression (DSC), which squeezes the signal down; it is "visually lossless," a far gentler trade-off than chroma subsampling. But the computer, the hub if you have one, and the monitor all have to be able to deal with it.
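The limited-versus-full-range issue in item 3 is just a linear rescale of the pixel values. Here is a minimal sketch of the 8-bit mapping (the function name is mine, not from any real API):

```python
def limited_to_full(y: int) -> int:
    """Map an 8-bit limited-range value (16-235) to full range (0-255)."""
    y = min(max(y, 16), 235)  # clamp to the legal limited-range window
    return round((y - 16) * 255 / (235 - 16))

# limited_to_full(16) -> 0, limited_to_full(235) -> 255.
# If one side sends full range and the other expects limited range,
# blacks get crushed or grays look washed out -- hence the icon magic.
```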
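Items 2, 5, 9, and 10 all multiply together into one bandwidth number, which is why chroma subsampling and DSC exist at all. A back-of-the-envelope sketch (ignoring blanking intervals and link-encoding overhead, so real link requirements are somewhat higher):

```python
def video_gbps(width, height, fps, bpc, chroma="4:4:4"):
    """Rough uncompressed video bandwidth in Gbit/s, ignoring blanking."""
    # Average samples per pixel: 4:4:4 keeps all 3, 4:2:2 keeps 2,
    # 4:2:0 keeps 1.5 (averaged over the 4x2 block described above).
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * bpc * samples / 1e9

print(round(video_gbps(3840, 2160, 120, 10), 1))           # 4K@120, 10 bpc, 4:4:4 -> 29.9
print(round(video_gbps(3840, 2160, 120, 10, "4:2:0"), 1))  # same with 4:2:0 -> 14.9
```

You can see why 4K@120 with 10-bit HDR needs HDMI 2.1-class bandwidth unless you subsample or compress.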

So what hardware connectors do you want from a monitor? The holy grail for a modern television or monitor is:

  1. HDMI 2.1. You really want a television that supports the full 48 Gbps of HDMI 2.1; the question then is how high a refresh rate you'd like, so the ideal is an input that can handle that full bandwidth.
  2. DisplayPort 2.0. One confusing thing is that DisplayPort 2.0 can run on a DisplayPort hardware connector, or it can run over USB 4 as an Alternate Mode on a USB-C connector. Over the DisplayPort hardware it has a maximum effective bit rate of about 77 Gbps (80 Gbps raw). As an aside, obviously, if it is running over USB 4 or through a Thunderbolt 4 dock, it is limited to that standard's 40 Gbps (a two-lane format where each lane pair carries 20 Gbps). You can also see from this that on Thunderbolt 3/4, with four high-speed lanes, shared mode can run two lanes of USB 4 data and two lanes of DisplayPort, which is how hubs supply both data and video through a single Thunderbolt 4 port.
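The lane arithmetic above is simple enough to write down. A sketch using the 20 Gbps-per-lane figure from the text:

```python
LANE_GBPS = 20  # each Thunderbolt 4 / USB 4 high-speed lane pair

# All four lanes driven one way: DisplayPort 2.0 over the DP connector.
dp_max = 4 * LANE_GBPS    # 80 Gbps raw toward the display

# Shared mode through a Thunderbolt 4 hub: two lanes of USB 4 data plus
# two lanes of DisplayPort video over the same cable.
usb_data = 2 * LANE_GBPS  # 40 Gbps for data
dp_video = 2 * LANE_GBPS  # 40 Gbps for video

print(dp_max, usb_data, dp_video)  # 80 40 40
```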

Most recent commercial televisions have HDMI 2.1, but with PC monitors you can also get DisplayPort 2.0, which is nice.

What is the holy grail of a connection these days for a MacBook Pro 2021?

While you can go crazy here, the ideal for a Mac monitor is to get as close as possible to what the native panel can actually support, which for, say, a Sony KD43X85J (translation: a television that is 43" diagonal, using an X or LED panel, in the 85 series, in model year J, or 2021) means:

  1. Resolution 4K UHD. That is showing the full panel resolution
  2. Frame rate 120 Hertz. That's what the panel can do
  3. Variable refresh rates. This TV is getting VRR via a software update, so it would be nice to have it running
  4. Low input lag
  5. Full range black levels at 0-255
  6. Rec. 2020 color gamut for HDR/WCG
  7. HDR10 or ideally Dolby Vision format
  8. 10 bits per color
  9. Chroma uncompressed at 4:4:4
  10. DSC 1.2 compression so all that can be delivered

So it's no wonder that it is hard to get a television to work well with a computer: you have to make no less than 10 parameters line up correctly, and they have to work on both the monitor and the PC. A good way to look at it is to list what the television can do and then compare that with what macOS or your output device can do. Below is where we are for the LG C9 we worked on yesterday plus the Sony; I suggest making the same table for whatever you use, such as Windows. The main wrinkle is that the MacBook Pro has both an HDMI 2.0 port and Thunderbolt 4 ports, so the Mac side is split into two columns depending on whether you use the HDMI port or DisplayPort Alternate Mode over Thunderbolt 4 into an HDMI 2.1 adapter.

| Feature | MacBook Pro 2021 HDMI | MBP 2021 TB4 to HDMI 2.1 Adapter | LG OLED55C9PUA | Sony KD43X85J |
| --- | --- | --- | --- | --- |
| Overscan | N/A | N/A | Settings > All Settings > Picture | Menu > Picture Settings > Graphics Mode (on other TVs it is called Full Pixel) |
| Resolution | 4K | 4K | 4K | 4K |
| Frame Rate | 60 | 60 (current software) | 120 | 120 on HDMI 3 or 4 |
| Variable Refresh | No | DisplayPort Adaptive-Sync | HDMI VRR | HDMI VRR (update available) |
| Low Input Lag | Yes | Yes | Game or PC icon | Enable Game or Graphics mode |
| Full range | Yes | Yes | PC icon for input | Enable Game mode |
| HDR/WCG | Rec. 2020 | Rec. 2020 | Enable Picture > Additional Settings > Ultra HD Deep Color | Enable Menu > HDMI Signal Format > Enhanced Format |
| HDR Encoding | Dolby Vision/HDR10/HLG for Apple displays; HDR10 for external displays | HDR10 (current software) | HDR10, Dolby Vision (enable Picture Settings > Video Signal > HDR10) | HDR10, HLG, Dolby Vision |
| Color depth | 8 or 10 bpc | 8 or 10 bpc or even 12 bpc | 8, 10, 12 bpc | 10 bpc |
| Chroma | 4:2:2 on HDMI | 4:4:4 | 4:4:4 | 4:4:4 |
| Compression | No | DSC 1.2 | HDMI 2.1 | |

Comparing macOS outputs to the two televisions

So it's a little complicated, but the basic story is that with the HDMI 2.0 port, for quick and dirty use, changing two settings on the Sony television (Game mode on the input and the HDMI Signal Format) will get you HDR10 and Rec.2020 with chroma 4:2:2 at 4K@60 Hz, which is not bad.
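Why does the HDMI 2.0 port top out at 4:2:2 for 10-bit HDR? A rough check, assuming the standard 594 MHz 4K@60 pixel clock and HDMI 2.0's roughly 14.4 Gbps effective data rate (18 Gbps raw minus 8b/10b TMDS coding overhead):

```python
PIXEL_CLOCK_HZ = 594e6        # standard HDMI 4K @ 60 Hz timing
HDMI20_EFFECTIVE_GBPS = 14.4  # 18 Gbps raw less 8b/10b TMDS overhead

def fits(bits_per_pixel: int) -> bool:
    """Does this per-pixel payload fit in HDMI 2.0's effective rate?"""
    return PIXEL_CLOCK_HZ * bits_per_pixel / 1e9 <= HDMI20_EFFECTIVE_GBPS

print(fits(30))  # 10 bpc 4:4:4 (30 bits/pixel): ~17.8 Gbps -> False
print(fits(24))  # 10 bpc 4:2:2 rides in 24 bits/pixel on HDMI -> True
```

So 10-bit 4:4:4 overshoots the link, while 10-bit 4:2:2 just fits, which matches what the Mac actually negotiates.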

If you get a Thunderbolt 3 to HDMI 2.1 adapter, you will get an active adapter that converts the DisplayPort output from macOS Monterey. The big question is what version of DisplayPort macOS supports. For instance, some Macs support DisplayPort 1.2 but without Multi-Stream Transport, so you can't daisy-chain multiple monitors.

Right now, Apple does not sell an adapter that goes from USB-C to HDMI 2.1; it only has an HDMI 2.0 adapter. And figuring this out is complicated: it depends on the Thunderbolt controller being used, then the graphics hardware itself, and finally the software driving it. As an example, the 2018 MacBook Pros used Radeon Pro graphics and theoretically support DisplayPort 1.4 at High Bit Rate 3 (HBR3).

Theoretically, the MacBook Pro 2021 supports USB 4, which in turn brings DisplayPort 2.0 support. As an aside, the trick DisplayPort 2.0 uses to get to 80 Gbps isn't super novel: the USB 4 spec is 40 Gbps bidirectional, so to get 80 Gbps to the monitor you just run all four lanes one way.

CalDigit TS3 Plus: DisplayPort 1.2 to HDMI 2.0 at 4K@60

So if you have an older CalDigit dock, you can still use its DisplayPort 1.2 output, because it will support 4K@60 Hz 4:4:4 or even 4K@120 4:2:2, so this is not a bad compromise until Apple fixes the 120 Hz issue.
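A quick sanity check on those CalDigit numbers, assuming DisplayPort 1.2's HBR2 effective rate of 17.28 Gbps and ignoring blanking intervals (so the real margins are thinner than this suggests):

```python
DP12_EFFECTIVE_GBPS = 17.28  # HBR2: 21.6 Gbps raw minus 8b/10b overhead

def payload_gbps(width, height, fps, bits_per_pixel):
    # Rough active-pixel payload, ignoring blanking intervals.
    return width * height * fps * bits_per_pixel / 1e9

# 4K @ 60 Hz, 8 bpc, 4:4:4 -> 24 bits/pixel, about 11.9 Gbps: fits
print(payload_gbps(3840, 2160, 60, 24) <= DP12_EFFECTIVE_GBPS)   # True
# 4K @ 120 Hz, 8 bpc, 4:2:2 -> 16 bits/pixel, about 15.9 Gbps: fits, barely
print(payload_gbps(3840, 2160, 120, 16) <= DP12_EFFECTIVE_GBPS)  # True
```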
