
HDR formats explained: HDR10, Dolby Vision, HLG, and more

Published Jan 27th, 2022 6:36PM EST
HDR formats. Image: LG


Many new TVs, phones, tablets, and other display-based devices support HDR. Billed as a way to get brighter colors and a better image, HDR formats are becoming an increasingly important aspect of buying any new TV. But what exactly is HDR? And what’s the difference between HDR formats?

HDR essentially allows you to get brighter images and more vibrant colors — as long as the screen and the content support the tech. From HDR10 to Dolby Vision, here’s everything you need to know about HDR.

What does HDR mean?

HDR stands for “High Dynamic Range.” The term refers to technology that allows displays to show more dynamic range than “SDR,” or “Standard Dynamic Range,” displays. But what is dynamic range? It’s the difference between the darkest blacks and brightest whites that a display is capable of producing.
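Dynamic range is often measured in “stops,” where each stop is a doubling of brightness. As a quick illustration (the display numbers here are hypothetical, not from the article), here’s how you’d compute it from a display’s peak brightness and black level:

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in stops: each stop is a doubling of luminance."""
    return math.log2(peak_nits / black_nits)

# A hypothetical LCD: 1,000-nit peak brightness, 0.05-nit black level
print(round(dynamic_range_stops(1000, 0.05), 1))  # roughly 14.3 stops
```

Raising the peak brightness or deepening the black level both widen this range — which is exactly what HDR displays do.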

Apple TV 4K (2021) interface. Image source: Christian de Looper for BGR

There are good reasons you might want more dynamic range in an image. Higher dynamic range helps preserve detail in the brightest parts of an image, making it look more realistic. Non-HDR images can look muted and flat compared to real life — while HDR video gets closer to the brightness and colors of the real world.

Of course, just as HDR rose out of technological advancements, newer and better HDR formats are likely to come as TVs and displays get more capable. Over time, displays have gotten brighter and brighter, enabling more realistic images. The idea isn’t just that you want infinite brightness — after all, at a certain point, that would hurt your eyes. Instead, brighter displays can show more detail in bright sections of an image. The same is true for dark sections: the darker an image can get, the more detail you’ll see in shadows and other dark areas. We likely won’t get much darker than OLED and Micro-LED displays, though, which can display true black by turning off sections of the display.

Brightness aside, HDR formats can also display more colors, making for a more detailed image that looks closer to real life.

What is HDR10?

HDR10 Logo

There are a number of different types of HDR that you’ll find on displays these days. The most popular include HDR10, HDR10+, Dolby Vision, and HLG. The most common of these is HDR10.

There are a number of reasons HDR10 is so popular. For starters, it’s an open standard and is used by a huge range of streaming services, including Netflix, Disney+, Apple TV+, and more.

HDR10 supports 10-bit color depth, which means it can display far more colors than 8-bit SDR. While 8-bit images can show almost 17 million colors, 10-bit images can show over 1 billion. Some TVs go further and support 12-bit color depth, which works out to a massive 68 billion colors.
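Those color counts follow directly from the bit depth: each pixel has three channels (red, green, and blue), and each channel gets 2^bits possible levels. A quick check of the arithmetic:

```python
def color_count(bits_per_channel: int) -> int:
    """Total displayable colors: 2^bits levels per channel, three channels."""
    return (2 ** bits_per_channel) ** 3

print(f"{color_count(8):,}")   # 16,777,216 — almost 17 million (8-bit SDR)
print(f"{color_count(10):,}")  # 1,073,741,824 — over 1 billion (10-bit HDR10)
print(f"{color_count(12):,}")  # 68,719,476,736 — about 68 billion (12-bit)
```

Each extra bit per channel multiplies the total color count by eight, which is why the jump from 8-bit to 10-bit is so dramatic.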

HDR10 content is mastered for a peak brightness of 1,000 nits — which is partly why the standard is aging a little. Most higher-end TVs these days can produce much more than 1,000 nits of brightness.

HDR10 uses something called “static metadata.” That means a single set of metadata is sent to your display at the start of playback and applies to the entire video. The advantage is that it takes up less bandwidth than a format like Dolby Vision, which can send metadata frame-by-frame; the downside is that one set of brightness values has to work for every scene.

What is HDR10+?

HDR10+ Logo

Dolby Vision may be better than HDR10, but because it’s a proprietary standard, it costs manufacturers licensing fees. This led to the rise of HDR10+.

HDR10+ doesn’t quite reach the technical ambitions of Dolby Vision, but it does expand on HDR10. It keeps the 10-bit color support, raises the maximum brightness to 4,000 nits, and adds dynamic metadata that can change scene-by-scene.

Lots of TV manufacturers build HDR10+-compatible TVs, but supporting content and playback devices are still relatively uncommon. The hope is that the format will get more popular over time and compete more directly with Dolby Vision.

What is Dolby Vision?

Dolby Vision logo. Image source: Dolby Labs

Dolby Vision is a format developed by Dolby Labs. Because it’s a proprietary format, Dolby Labs licenses it. In other words, it’s not an open standard like HDR10, and because of that, not all TVs support Dolby Vision. More and more content supports it, though, including on services like Netflix, Apple TV+, Disney+, and more.

Dolby Vision offers a number of clear advantages over other HDR formats. Namely, the standard supports 12-bit color and a theoretical maximum brightness of a hefty 10,000 nits. No current content or display actually reaches those limits on color or brightness — which means Dolby Vision is much more future-proof than other HDR standards. Eventually, technology will probably outgrow Dolby Vision, but it may be some time before that happens.

Unlike HDR10, Dolby Vision uses “dynamic metadata.” That means the tone-mapping instructions can be adjusted for each scene, or even each frame, of a video. The downside is that it uses more bandwidth, but the result can look a lot better than HDR10 content.
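To illustrate the static-versus-dynamic difference, here’s a simplified sketch — not the actual HDR10 or Dolby Vision data formats. MaxCLL and MaxFALL are real HDR10 static-metadata fields; the per-frame values below are invented for illustration:

```python
# Static metadata (HDR10-style): one record describes the entire video.
# MaxCLL = maximum content light level; MaxFALL = maximum frame-average light level.
static_metadata = {"max_cll_nits": 1000, "max_fall_nits": 400}

# Dynamic metadata (Dolby Vision / HDR10+ style): values can change per frame,
# so a dim scene and a bright scene get different tone-mapping hints.
dynamic_metadata = [
    {"frame": 0, "peak_nits": 120},   # dim indoor scene
    {"frame": 1, "peak_nits": 950},   # bright outdoor scene
]

def tone_map_peak(frame_index, static, dynamic=None):
    """Pick the peak brightness the TV should tone-map this frame against."""
    if dynamic is not None:
        return dynamic[frame_index]["peak_nits"]  # per-frame value
    return static["max_cll_nits"]                 # one value for everything

# With static metadata alone, every frame is mapped against the same 1,000-nit peak;
# with dynamic metadata, the dim scene can be mapped against its actual 120-nit peak.
print(tone_map_peak(0, static_metadata))                    # 1000
print(tone_map_peak(0, static_metadata, dynamic_metadata))  # 120
```

This is why dynamic metadata tends to look better: a TV tone-mapping a dark scene against the whole movie’s peak brightness has to compress it more than necessary.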

What is HLG?

HLG, or Hybrid Log-Gamma, is a little different from other HDR standards. For starters, it was developed specifically for broadcast signals by the BBC and NHK, a Japanese broadcaster. Unlike other HDR formats, HLG doesn’t use metadata to tell a TV how to display an image. Instead, HLG takes an SDR signal and adds an HDR information layer on top of it. This means TVs that don’t support HLG will simply display the image in regular SDR, while TVs that do support it will use the extra information to show a brighter, more vibrant image.
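For the curious, HLG’s “hybrid” name comes from its transfer function, which behaves like a conventional square-root (gamma-like) curve at low signal levels — the SDR-compatible part — and switches to a logarithmic curve at high levels for the extra HDR headroom. A sketch of the HLG OETF, using the constants defined in ITU-R BT.2100:

```python
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map scene linear light e (0..1) to an HLG signal value (0..1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)              # square-root segment: SDR-like
    return A * math.log(12 * e - B) + C      # log segment: HDR headroom

print(round(hlg_oetf(1 / 12), 3))  # 0.5 — the crossover between the two segments
print(round(hlg_oetf(1.0), 3))     # 1.0 — full-scale signal
```

Because the lower segment closely tracks what an SDR display expects, an HLG signal degrades gracefully on non-HLG TVs.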

There are some downsides to this approach. HLG can make images brighter and more vibrant, but it can’t do much to the black levels of an image — so you won’t really get much better detail in shadows and night scenes.

HLG is still in its infancy, and as a result, there isn’t much HLG content out there. We’ll have to wait and see if that changes over time.

Is HDR only for 4K?

No. In fact, the two aren’t really related. 4K refers to the resolution of a TV, while HDR refers to the brightness and color range the display can produce. Most 4K TVs support some kind of HDR, but some 1080p TVs support HDR too.

How to watch HDR

If you want to take advantage of HDR content, you’ll need a few things. For starters, you’ll need a TV that supports an HDR format — and the formats aren’t cross-compatible. That means you won’t get the benefits of Dolby Vision on a TV that only supports HDR10+. You’ll also need content that is available in that format. There’s plenty of content in Dolby Vision and HDR10+ these days, but not all content supports them. Most streaming services label the formats you can watch in.

If you use a streaming device, like an Apple TV or Roku, that device will need to support the format too.

Christian de Looper Senior Reviews Editor

Christian de Looper is based in sunny Santa Cruz, California. He has been expertly reviewing tech products for more than 8 years, and brings experience in deep technical analysis of consumer electronics devices to BGR's reviews channel.