Plus what to look for when buying one
If you’re in the market for a new HD TV, you’ve probably noticed that some 4K models also have a feature called HDR (High Dynamic Range).
HDR is a technology that dramatically improves the brightness and contrast of a display. HDR is objectively superior to older TV technology, but not all sets are created equal, and HDR isn't a monolithic standard. We'll explain what HDR does, how the competing standards differ, and why it provides the best TV experience.
The Easy Part: 4K
The easiest part of the technology to understand is the “4K” bit. This simply refers to the resolution of the television. “Resolution” in this context means the number of pixels the TV has. Most “4K” TVs are actually UHD or “Ultra High Definition,” which is slightly lower in resolution than the true 4K standard (4096×2160) used in professional cinema production.
A UHD TV has a pixel grid of 3840×2160 pixels. This is an incredible four times the number of pixels in an FHD (Full HD) display. UHD resolution isn’t related to HDR at all. Displays can offer HDR regardless of what resolution they have. For example, 1440p computer monitors and mobile phone panels offer HDR, despite having a lower resolution than 4K UHD.
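The “four times” figure is easy to verify with a quick back-of-the-envelope calculation (shown here in Python purely for illustration):

```python
# Pixel counts for common display resolutions.
uhd = 3840 * 2160  # UHD "4K" television: 8,294,400 pixels
fhd = 1920 * 1080  # Full HD (1080p): 2,073,600 pixels

# UHD doubles the pixel count in each dimension, so the total
# pixel count is exactly four times that of Full HD.
print(uhd // fhd)  # 4
```

Doubling both the horizontal and vertical pixel counts is what produces the 4x total, which is why marketing sometimes rounds 3840 up and calls the result “4K.”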
HDR is something you’ll find almost exclusively paired with 4K or higher resolutions when it comes to television. So it’s no surprise that these two TV specs are spoken of in the same breath.
What Is Dynamic Range?
The dynamic range of a television is essentially the distance between how dark and how bright the screen can get. If that sounds a little like the contrast ratio, you aren’t entirely wrong.
However, dynamic range is more about how much detail can be retained in an image’s darkest and brightest parts before you get “crushed” blacks and “blown out” whites.
You might recall that infamous episode of Game of Thrones where scenes were so dark that many viewers could see nothing but a mushy black image. The show’s creators had pushed so far down the dynamic range that lower-end TVs (which most people have) simply weren’t capable of reproducing the details.
Standard Dynamic Range vs. High Dynamic Range
The dynamic range of content and displays is standardized so that the people who master video content know the limits within which they can work. SDR, or Standard Dynamic Range, content is the result of technological limitations in older camera and display technology.
Modern cameras and displays can capture and reproduce a far wider range of brights and darks. Not only that, they can capture and reproduce details within those dark and bright parts of the image that would have been lost before.
All HDR does is widen that range and increase the available information cameras capture and screens can show. If you make content with an SDR camera, you won’t see any improvement on an HDR screen. Likewise, if you put HDR content on an SDR screen, it will look like SDR content.
There are five HDR standards at the time of writing: HDR10, HDR10+, HLG, Dolby Vision, and Advanced HDR by Technicolor.
HDR10
The most widely-supported HDR standard is HDR10. Virtually all HDR displays support HDR10, and most HDR content is available in HDR10. Other standards improve on this original implementation of HDR, and it’s usually only cheaper sets that support nothing beyond HDR10.
HDR10 is a relatively simple open standard created by the UHD Alliance, the consortium responsible for defining the UHD resolution standard. For a TV to be HDR10 compliant, it must meet specific technical standards for peak brightness and contrast ratio.
The HDR metadata, which is additional information about light levels encoded in HDR content, is static in the case of HDR10. That means the specified brightness and contrast levels are the same regardless of the display or specific scene you’re watching. That’s as opposed to HDR standards that use dynamic metadata, which changes those brightness and contrast values on a per-scene basis.
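The static-versus-dynamic distinction can be pictured as a data-layout difference. The sketch below is a simplified illustration, not a real decoder API; the light-level fields are modeled on HDR10's actual MaxCLL (Maximum Content Light Level) and MaxFALL (Maximum Frame-Average Light Level) values, but the dictionary layout and function are hypothetical:

```python
# Static metadata (HDR10): one set of light levels for the whole title.
static_metadata = {
    "max_cll": 1000,   # Maximum Content Light Level, in nits
    "max_fall": 400,   # Maximum Frame-Average Light Level, in nits
}

# Dynamic metadata (HDR10+, Dolby Vision): values per scene or frame.
dynamic_metadata = [
    {"scene": 1, "max_cll": 200,  "max_fall": 120},   # dim interior
    {"scene": 2, "max_cll": 1000, "max_fall": 450},   # bright exterior
]

def target_brightness(scene_number):
    """Compare the tone-mapping target each approach gives a display.

    With static metadata the target never changes; with dynamic
    metadata the display can re-tone-map each scene individually.
    """
    static = static_metadata["max_cll"]
    dynamic = dynamic_metadata[scene_number - 1]["max_cll"]
    return static, dynamic

print(target_brightness(1))  # (1000, 200)
```

For the dim scene, the static approach still tells the TV to budget for 1,000 nits, while the dynamic approach lets it tone-map against the scene's actual 200-nit peak, preserving shadow detail.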
HDR10+
The UHD Alliance does not define HDR10+. Instead, it’s defined by Samsung, one of the largest TV manufacturers in the world.
As the name suggests, HDR10+ builds on the foundation of HDR10. It does this by adding dynamic metadata, which means that HDR targets are based on the current scene. Samsung has made HDR10+ an open standard just like the original HDR10, so anyone can put this certification on their TV if it complies with the on-paper requirements.
Dolby Vision HDR
Dolby Vision is one of the major HDR standards, and you’ll find a fair number of higher-end TVs and media devices that support it. The latest generation of Xbox consoles supports Dolby Vision, for example.
Dolby Vision certification is somewhat more challenging to obtain than HDR10 or HDR10+ certification since it’s a licensed standard. TVs and other HDR devices must pass Dolby’s certification process to display the Dolby Vision sticker.
This standard uses dynamic metadata. This means the image is adjusted to the capabilities of your particular Dolby-certified HDR TV, which has manufacturer settings built into it that help interpret how to display content mastered in Dolby Vision.
Hybrid Log-Gamma (HLG)
Hybrid Log-Gamma works differently from either HDR10 or Dolby Vision. This standard doesn’t have metadata. It instead uses a calculation to work out what the brightness level should be on an HDR display along an SDR gamma curve.
The standard was developed for broadcasters to allow a single signal to work on both SDR and HDR sets. However, very few 4K TVs currently support HLG, so unless adoption picks up, HLG has an uncertain future.
Advanced HDR by Technicolor
Technicolor is a name known to anyone who has an interest in cinema. This company pioneered much of the film industry’s display technology.
Advanced HDR by Technicolor is an attempt to bring some of that know-how into HDR, but its market footprint is by far the smallest compared to Dolby Vision and HDR10, so it’s bound to face an uphill battle.
To make things worse, there are three standards within the Technicolor HDR family: SL-HDR1, SL-HDR2, and SL-HDR3. SL-HDR1 is backward compatible with SDR, making it a viable choice for broadcasts like HLG. SL-HDR2 has dynamic metadata and is the competing standard to HDR10+ and Dolby Vision. SL-HDR3 is still in development.
LG, a major rival to Samsung, tends to support a broader range of HDR standards on its TVs, including Technicolor’s. You’ll also see sets with support for this standard sold under the Philips brand.
HDR Affects Color Reproduction
While HDR is mostly about peak brightness and darkness, color is also affected. With the extra luminosity data embedded in HDR video, it becomes possible to capture and reproduce more shades of color.
Therefore, good HDR displays are brighter and more colorful than typical SDR screens. An HDR display can have poor color reproduction for other reasons than its HDR range, but in practice, better color usually goes hand-in-hand with better HDR.
HDR Color Gamut
Those mastering HDR content to comply with a specific HDR standard have a defined color gamut to work within. Dolby Vision uses the Rec.2020 wide color gamut. HDR10 uses the narrower DCI-P3 gamut, which is still wider than Rec.709, the standard HD gamut.
Just because a given HDR standard offers a wide range of colors, that doesn’t mean every HDR TV can reproduce all of them or do so accurately. Screens are often rated as covering a percentage of a certain color gamut, with higher numbers being better.
You Need HDR Content
If it isn’t clear from the discussion so far, you need to feed your 4K HDR TV with HDR content to get any benefit from it. Not only that, but the TV shows or movies have to be mastered in the HDR standard that your TV supports.
For example, Netflix streams using two HDR formats: HDR10 and Dolby Vision. The Netflix app automatically detects what type of HDR your TV and streaming device supports and then streams the correct kind of content. Most streaming services support at least HDR10. Amazon Prime Video supports HDR10+, and some titles are also available in Dolby Vision.
When it comes to collecting physical media mastered in HDR, your only option is 4K Ultra HD Blu-ray. This is different from standard Blu-ray technology, which only supports a 1080p resolution and doesn’t have enough spare room for HDR information. You’ll also need a UHD Blu-ray player, which must also support HDR.
Turning SDR Into HDR
It’s possible to get more out of SDR content by “converting” it into HDR. Many televisions have an option to activate a sort of pseudo-HDR where SDR content is analyzed and the TV’s software “guesses” what it would look like if it were HDR.
The results can be pretty mixed, depending on the specific algorithm the TV uses. But in many cases, it does offer an enhanced picture.
On the latest Xbox consoles, you’ll also find a feature called “Auto-HDR,” which injects HDR information into games that were not created with HDR support. How well this works once again varies on a case-by-case basis.
What to Look For When Buying 4K HDR TVs
Just because a new TV is labeled as an HDR 4K TV doesn’t mean you’re getting the picture quality benefits you think. There are several aspects of any new TV with HDR you should pay careful attention to.
Advanced HDR Standard Support
Virtually all HDR TVs support HDR10, but you should avoid TVs that only support HDR10. Try to go with a set that supports at least HDR10+, Dolby Vision, or both. For now, these are the two most common standards, and they offer a significant step up over standard HDR10.
True HDR Compliance
What does the HDR label on a 4K TV actually mean? One of the essential considerations is peak brightness. Brightness is measured in nits, and good HDR TVs usually produce at least 600 nits of peak brightness, with high-end HDR TVs reaching 1,000 nits or more. In practice, many low-end TVs only produce 100-300 nits, so they can’t reproduce a proper HDR image.
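Those brightness tiers can be summarized as a simple lookup. This is only a sketch of the article's rule of thumb; the function name and the exact cutoffs are illustrative, not an official certification test:

```python
def hdr_capability(peak_nits):
    """Rough rule of thumb for judging a TV's HDR credentials
    from its measured peak brightness (illustrative thresholds)."""
    if peak_nits >= 1000:
        return "high-end HDR"
    if peak_nits >= 600:
        return "good HDR"
    return "HDR in name only"

print(hdr_capability(250))   # typical low-end set: "HDR in name only"
print(hdr_capability(1200))  # flagship territory: "high-end HDR"
```

The point of the sketch is that the “HDR” sticker alone tells you nothing; the measured peak-brightness number is what separates a real HDR experience from a marketing label.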
Backlight and Display Technology
There are several TV technologies on the market, and they have different approaches to producing images and creating brightness.
OLED (Organic Light-Emitting Diode) displays are generally the best HDR displays. OLED is an emissive technology, which means the pixels in the screen produce their own light. OLED TVs can have perfect blacks since the pixels can be very dimly lit or even turned off completely. Although most OLEDs don’t get all that bright, the contrast ratio helps them produce fantastic HDR imagery as long as you’re watching in a darkened environment.
LED LCD TVs are the most common type of TV you’ll find. LCD is a transmissive technology, which means that the light is provided by a backlight that shines through the LCD panel. This limits how dark the screen can get since the backlight still shines through when the pixels are off.
New LED technologies such as local dimming zones, QLED, Mini LED, and Micro LED bring LCDs closer to OLED displays without OLED’s drawbacks. An LED screen with many local dimming zones or Mini LED technology will likely produce much better HDR imagery than an edge-lit LED with no dimming.
Limited HDR Inputs
While your TV may support HDR and even offer a decent HDR image, it may not support HDR on all its inputs. Some mid-range or lower-end HDR TVs only support HDR on HDMI input 1.
So if you have multiple HDR-compatible devices, such as a PlayStation 5, Apple TV, Roku, or Google TV device, you’d have to use an HDMI splitter or switch to enjoy HDR content on all of them. If you have a smart TV, any apps running on the TV will display HDR as long as they support it.
Devices that don’t support HDR, such as the Nintendo Switch, should be plugged into non-HDR inputs. The good news is that you don’t need a special HDMI cable for HDR. Any certified HDMI cable will work.
Professional Reviews Matter
It’s vital to read professional reviews by publications that use specialized equipment to check whether the claimed performance measures up to the real-world performance. It only takes a few minutes to see if the 4K HDR TV you want to buy is as good as the on-paper numbers suggest.
Looking On The Bright Side
TV makers such as Sony, Samsung, and LG have been hard at work to push HDR on their products and support several competing standards. While it’s still anyone’s guess which HDR standards will become the most universal, there’s almost no HDR TV you can buy that won’t support HDR10 or Dolby Vision.
We don’t think the average consumer should be too concerned about the HDR format wars. It’s better to pay attention to the core specifications of the TV you’re considering while making sure that your other devices, such as consoles, set-top boxes, and UHD Blu-ray players, will work with the specific standards your TV supports.