HDR stands for High Dynamic Range, and when you see it stamped on a gaming monitor, TV, or game, it means the product supports a wider range of colors and contrast than SDR, or Standard Dynamic Range. While PC support has been slow to arrive, HDR is gradually becoming common in PC gaming, and you can now enjoy a plethora of games with HDR enabled.
However, there is some bad news. HDR is not a one-size-fits-all standard. In fact, there are many levels of HDR, and how good it looks depends on the capabilities of your monitor or TV. The HDR confusion has eased somewhat with the introduction of VESA’s DisplayHDR standard, which attempts to provide customers with an easily recognizable rating scale, though there’s a lot more work going on behind the scenes than these specifications reveal.
What is HDR?
HDR, or high dynamic range, is an umbrella term for a family of standards designed to extend the range of color and contrast of video displays far beyond what SDR hardware can produce. Despite what you may have heard during the 4K push, resolution isn’t the most important factor in image quality.
Contrast, brightness, and vibrant colors all matter more to image quality once resolution requirements are met, and improving them is what HDR is all about. Nor is this an incremental upgrade: HDR’s aggressive requirements mean almost everyone needs new hardware, and you don’t need benchmarks or a trained eye to perceive the difference.
Brightness is measured in candelas per square meter (cd/m²), and you’ll see displays advertise peak brightness anywhere from 100 to 1,500 cd/m². Don’t worry if you see brightness quoted in nits: a nit is simply another name for one cd/m², and the two are used interchangeably. You may also see brightness and luminance used interchangeably, and while they’re technically different, it’s generally fine. Strictly speaking, luminance measures how much light a surface emits, while brightness describes our perception of that light.
Some HDR specs call for at least 1,000 cd/m² for LCD screens, but you’ll find the DisplayHDR 400 tier requires a peak brightness of only 400 cd/m². That’s not much brighter than most SDR panels; in fact, some SDR gaming monitors were in the 300–400 cd/m² range long before DisplayHDR came along. If you really want to experience HDR to the fullest, look beyond DisplayHDR 400 to the higher-brightness tiers.
Laptops don’t do much better with HDR content, as most of their displays can only push a few hundred nits. Phones with sunlight-viewable screens can reach just under 1,000 nits; the Samsung Galaxy S21 Ultra is one example.
HDR-capable OLED screens, however, typically put all of these displays in the shade. It comes down to the basic structure of OLED: each pixel emits its own light, with no need for the backlight an LCD panel requires. Because individual pixels can switch off completely, an OLED panel’s contrast ratio can be far higher than that of an LCD, even one with very localized backlighting, which makes OLED ideal for HDR content.
Color is also transformed by the HDR specifications, which call for a full 10 or 12 bits per color channel, accessible across the entire operating system and managed through a consistent set of standards. Most PC monitors offer only 6 or 8 bits per channel, covering a subset of the full color space called sRGB, which spans only about a third of the gamut HDR targets. And even when the hardware can do more, software support has traditionally made wide-gamut color cumbersome to use.
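To put those bit depths in perspective, each extra bit doubles the number of distinct levels per color channel, and the totals multiply across red, green, and blue. A quick back-of-the-envelope calculation:

```python
# Shades per channel and total colors for common panel bit depths.
# Going from 8-bit SDR to 10- or 12-bit HDR: each extra bit doubles
# the number of distinct levels per red/green/blue channel.
for bits in (6, 8, 10, 12):
    shades = 2 ** bits   # levels per channel
    total = shades ** 3  # combined R x G x B colors
    print(f"{bits}-bit: {shades} shades/channel, {total:,} total colors")
```

An 8-bit panel manages about 16.7 million colors; a 10-bit panel jumps to over a billion, which is what lets HDR grade smooth gradients without visible banding.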
Support for wide color gamuts, or WCG, has usually been reserved for professional uses such as image editing or medical imaging. Games and other software simply ignore the extra colors, and when the narrower color space they use is mapped onto a wide-gamut display, they often look oversaturated or incorrectly adjusted unless the hardware takes special steps to emulate the narrower color space.
The HDR standards avoid this confusion by including metadata in the video stream, which helps manage color spaces properly, ensuring content looks correct and takes advantage of the display’s improved capabilities. To carry all the extra data, some HDR variants require HDMI 2.0a or HDMI 2.1 as a minimum display connection: a long-overdue upgrade to the ubiquitous, low-bandwidth HDMI 1.4 standard.
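That metadata is quite compact. As a rough sketch, with illustrative field names loosely following the SMPTE ST 2086 mastering-display fields and the CTA-861.3 content light levels used by HDR10:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Rough sketch of HDR10's static metadata (SMPTE ST 2086 plus
    CTA-861.3 content light levels); field names are illustrative."""
    # Mastering display color volume: chromaticity of the RGB
    # primaries and white point as (x, y) CIE coordinates
    display_primaries: tuple      # ((rx, ry), (gx, gy), (bx, by))
    white_point: tuple            # (wx, wy)
    max_display_luminance: float  # cd/m² (nits)
    min_display_luminance: float  # cd/m²
    # Content light levels for the whole stream -- static, so one
    # set of values must cover every scene
    max_cll: int   # Maximum Content Light Level, nits
    max_fall: int  # Maximum Frame-Average Light Level, nits

# Example: content mastered on a 1,000-nit display with
# Rec. 2020 primaries and a D65 white point
meta = HDR10StaticMetadata(
    display_primaries=((0.708, 0.292), (0.170, 0.797), (0.131, 0.046)),
    white_point=(0.3127, 0.3290),
    max_display_luminance=1000.0,
    min_display_luminance=0.0001,
    max_cll=1000,
    max_fall=400,
)
```

The display reads these values once at the start of playback and uses them to tone-map the whole stream to its own capabilities.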
Which HDR standards should I be looking for?
Mostly, it boils down to two: the proprietary Dolby Vision, which offers 12-bit color and dynamic metadata, and the open HDR10 standard, which supports 10-bit color and only static metadata delivered at the start of the video stream.
Most gaming hardware makers and software studios turned to HDR10 first, as Dolby Vision, with its licensing fees and additional hardware, was a more expensive implementation, slowing its adoption. Microsoft’s Xbox Series X/S supports HDR10, as does Sony’s PlayStation 5. Some older consoles also support HDR10.
However, Microsoft recently brought Dolby Vision gaming support to the Xbox Series X and Series S, with over 100 HDR games available or coming soon on the platform.
When it comes to PC gaming, hardware support is usually there, but it comes down to each game’s support. Most HDR-capable PC games support HDR10, and only a few support Dolby Vision; Mass Effect: Andromeda is one of them, if you want to try it out.
Dolby Vision is often seen as the superior option, but you really need a good HDR gaming monitor to get the most out of it. Its proponents tout its greater color depth, stricter hardware requirements, ability to adjust content dynamically frame by frame, and backward compatibility with HDR10, but gaming is largely settling on the cheaper, good-enough HDR10 standard.
Because HDR10’s metadata is static, content mastered for very bright or very dark material can struggle when an occasional scene sits at the other end of the spectrum, leaving it looking crushed or blown out. HDR10+ adds Dolby Vision-style dynamic metadata and removes this problem, and it is now part of the HDMI 2.1 standard. HDR10+ also retains the open, royalty-free model that makes HDR10 easy for manufacturers to adopt.
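The practical difference is easy to sketch. With static metadata, one set of values has to describe the entire stream; dynamic metadata can describe each scene. A toy illustration with made-up scene peaks:

```python
# Peak luminance (nits) of each scene in a hypothetical film
scene_peaks = [350, 900, 120, 1000, 80]

# HDR10 (static): one value for the whole stream, so the display
# tone-maps even the dim 80-nit scene as if it might hit 1,000 nits
static_max_cll = max(scene_peaks)
print("static MaxCLL:", static_max_cll)  # 1000, for every scene

# HDR10+ / Dolby Vision (dynamic): metadata per scene, so the
# display can tone-map each one against its actual peak
for i, peak in enumerate(scene_peaks):
    print(f"scene {i}: tone-map against {peak} nits")
```

With the static approach, the dim scenes get squeezed through a tone curve built for the brightest one, which is exactly where crushing and blowout creep in.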
Recently, the HDR10+ GAMING standard was announced, because apparently we needed more standards in the HDR space. It was created to give developers and gamers even more reasons to use HDR10+: it uses source-side tone mapping for more accurate game output on compatible monitors, and also includes automatic HDR calibration.
If you’re a video or movie lover who buys based on image quality and graded content, Dolby Vision might be more interesting. Netflix, for example, is especially invested in Dolby Vision: almost everything it makes in-house supports it. Vudu and Amazon also offer Dolby Vision content.
In case you were wondering, there are other HDR standards: HLG, or Hybrid Log-Gamma, developed by the BBC and used on YouTube; and Advanced HDR, developed by Technicolor and used mainly in Europe, which allows HDR content to play back on SDR displays through a gamma-curve trick.
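HLG’s gamma-curve trick is a hybrid transfer function: the lower half of the signal follows a conventional square-root (gamma-like) curve that SDR displays interpret sensibly, while the upper half switches to a logarithmic curve that packs in HDR highlights. A sketch of the HLG OETF as published in ITU-R BT.2100:

```python
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized linear scene light e in [0, 1] to an HLG signal.
    The square-root segment looks like ordinary gamma to an SDR
    display; the log segment packs HDR highlights into the top of
    the signal range."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

# The two segments meet smoothly at e = 1/12 (signal 0.5), and
# full-scale scene light maps to a full-scale signal
print(round(hlg_oetf(1 / 12), 4))  # 0.5
print(round(hlg_oetf(1.0), 4))     # ~1.0
```

Because the bottom half of the curve is so close to a standard gamma, an SDR display that knows nothing about HLG still renders a watchable picture, which is why broadcasters like it.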
So what is DisplayHDR?
DisplayHDR is a VESA specification that sets the rules for what constitutes a good HDR experience. Each tier, from DisplayHDR 400 to DisplayHDR 1400, guarantees a certain peak brightness, dynamic contrast, color gamut, and local dimming capability for a good HDR user experience.
The problem it addresses: a display can be quoted as HDR10 compliant without actually offering the hardware needed to make HDR10 content pop. So VESA, a standards body that works to create interoperable standards for electronics, stepped in.
If you’re shopping for an HDR gaming monitor right now, the DisplayHDR specification is the best place to start. You can dig into the details from there, but knowing whether you want a DisplayHDR 1400 set or a DisplayHDR 600 set really helps narrow down your options.
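The tier names encode the headline number directly: roughly, the figure is the minimum peak brightness in nits the display must hit (the full certification also tests black levels, gamut, bit depth, and dimming). A simplified lookup along just that one dimension:

```python
# Simplified view of VESA DisplayHDR tiers: the tier number is the
# minimum peak brightness in cd/m² (nits) a certified display must
# hit. (Real certification also tests black level, gamut, etc.)
DISPLAYHDR_PEAK_NITS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
    # True Black tiers target OLED: lower peaks, far deeper blacks
    "DisplayHDR True Black 400": 400,
    "DisplayHDR True Black 500": 500,
}

def meets_tier(measured_peak_nits: float, tier: str) -> bool:
    """Check just the peak-brightness requirement of a tier."""
    return measured_peak_nits >= DISPLAYHDR_PEAK_NITS[tier]

print(meets_tier(450, "DisplayHDR 400"))  # True
print(meets_tier(450, "DisplayHDR 600"))  # False
```

It’s a one-dimensional sketch, but it captures why a DisplayHDR 400 badge tells you much less about the HDR experience than a DisplayHDR 1000 one.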
Is my graphics card HDR ready?
Almost certainly, yes. One place where PCs are ready for HDR is the graphics card market. While PC displays lag behind their TV counterparts in HDR implementation, mid-to-high-end GPUs have been ready for this revolution for years, thanks to healthy competition between Nvidia and AMD.
Lest you think this is one HDR area not plagued by competing formats, AMD and Nvidia also have their own HDR-compatible variable refresh rate (VRR) technologies. Nvidia’s G-Sync Ultimate carries a loose promise of “realistic HDR,” while AMD’s FreeSync Premium Pro specifies low-latency SDR and HDR support. These are the top-tier VRR technologies from each company, so keep that in mind when researching potential HDR gaming monitors.
Are all games HDR ready?
Many new games today offer support for HDR, but older software doesn’t take advantage of the wider color and contrast capabilities without tinkering. Those older games will still play fine on HDR-equipped systems; you just won’t see any benefit without some new code.
Fortunately, taking advantage of HDR doesn’t require rewriting the underlying software. A fairly simple mapping process that extends SDR colors into the HDR range through algorithmic transformations can convert SDR titles without a lot of work.
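For illustration, here is what such a mapping can look like in miniature: a toy inverse tone map, not how any shipping converter actually works. The 203-nit reference white and the 80% highlight knee are assumptions chosen for the sketch.

```python
def sdr_to_hdr_nits(sdr_value: float,
                    sdr_white_nits: float = 203.0,
                    hdr_peak_nits: float = 1000.0) -> float:
    """Toy inverse tone map: expand an SDR pixel value (0..1) to nits.

    Decodes an approximate 2.2 gamma to linear light, anchors SDR
    reference white at sdr_white_nits, then stretches only the top
    20% of the linear range toward the display's HDR peak.
    """
    linear = sdr_value ** 2.2                 # rough SDR gamma decode
    base = linear * sdr_white_nits            # midtones stay SDR-like
    highlight = max(0.0, linear - 0.8) / 0.2  # zero except near white
    return base + highlight * (hdr_peak_nits - sdr_white_nits)

print(round(sdr_to_hdr_nits(0.5), 1))  # midtones stay in SDR territory
print(round(sdr_to_hdr_nits(1.0), 1))  # 1000.0 -- white hits the HDR peak
```

The key idea is the same one real converters use: leave midtones roughly where the original artists put them and spend the extra headroom on highlights, so the image gets punchier without looking uniformly brighter.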
One such approach is Windows’ Auto HDR feature, which takes SDR games and retroactively renders them in HDR.