What is HDR? What's the difference between HDR formats?
HDR is one of the most discussed technologies on the web in the past few years. Whether it's about TVs and movies, or computer monitors and games, HDR is gaining a foothold on many of our devices. Do you want to know what HDR is? Do you know what the different types, standards, and certifications of HDR are? If you want to understand why some screens are better than others when it comes to HDR, and to see what terms such as HDR10, HDR10+, Dolby Vision, DisplayHDR 400 or DisplayHDR 1000 are all about, read this article:
What is HDR?
HDR is an acronym for High Dynamic Range, and it is a technology designed to make images resemble the real world as closely as possible. HDR is a term that you can hear of in photography, as well as in everything that's screens-related.
In order to make images as authentic as possible, devices with HDR use wider ranges of colors, brighter light areas, and darker blacks for shades. All these, together with much more balanced contrast ratios, make images look more realistic and accurate, closer to what the human eye would see in the real world.
When it comes to digital images displayed on a monitor, TV, or any other similar devices, HDR is especially noticeable in pictures or videos that have complex combinations of colors, and light and dark areas. Such examples can be sunsets and sunrises, bright skies, snowy landscapes, etc.
What are the different HDR formats: HDR10, HDR10+, Dolby Vision, and HLG?
In terms of screens, there are three major HDR formats, or profiles if you prefer: HDR10, HDR10+, and Dolby Vision. All these media profiles apply both to how video content is recorded or rendered, and to how that content is displayed by devices with HDR screens. Although all of them aim for the same thing - to display more realistic images - they have different requirements, specs, and properties.
The essential criteria that define the different HDR profiles are related to image quality. See the table below for a comparison:

Format         Bit depth   Metadata
HDR10          10-bit      Static
HDR10+         10-bit      Dynamic
Dolby Vision   12-bit      Dynamic
Let's cover these criteria one by one:
Bit depth. Usually, monitors, laptop screens, TVs, and most other screens, including those on smartphones, use 8-bit color. That allows them to show 16.7 million colors. HDR screens have a 10-bit or 12-bit depth, which allows them to display 1.07 billion or 68.7 billion colors, respectively. HDR10 and HDR10+ use 10-bit color, while Dolby Vision supports a bit depth of 12. All are impressive, huge numbers. However, you should know that, at least for the time being, there are only 10-bit screens on the market (HDR10 and HDR10+), so even if Dolby Vision sounds fantastic, you can't enjoy it on any consumer screen for the moment.
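Those color counts follow directly from the bit depth: each of the three RGB channels has 2^bits levels, and the total number of displayable colors is that value cubed. A quick sketch of the arithmetic:

```python
# Each RGB channel has 2**bits levels, so the total number of
# displayable colors is (2**bits) cubed.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {(2 ** bits) ** 3:,} colors")

# Output:
# 8-bit: 16,777,216 colors
# 10-bit: 1,073,741,824 colors
# 12-bit: 68,719,476,736 colors
```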
Peak brightness. This is the highest luminance a screen can reach. For screens to be able to display HDR images, they need higher brightness levels than regular SDR (Standard Dynamic Range) screens: peak brightness is measured in cd/m² and, for HDR, usually has to be at least 400 cd/m². Read the next section of this tutorial to see the different HDR standards based on peak brightness.
Maximum black brightness. As you know, HDR screens aim to display images that are as close to reality as possible. To do that, besides a high peak luminance for bright image areas, they must also be able to display dark areas using very dark blacks. That's where the maximum black level luminance comes into play. The typical values for this attribute are less than 0.4 cd/m², but there is no requirement as far as HDR protocols are concerned. However, VESA DisplayHDR standards do have specific values for the maximum black level luminance, as you can see in the next section of this article. Any screen that can show blacks at a brightness of less than 0.0005 cd/m² is considered to be True Black.
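Peak brightness and black level together determine a screen's contrast ratio, which is simply their quotient. Here is a minimal sketch; the two example panels are hypothetical:

```python
# Contrast ratio is peak luminance divided by black level luminance,
# with both values in cd/m².
def contrast_ratio(peak_cd_m2: float, black_cd_m2: float) -> float:
    return peak_cd_m2 / black_cd_m2

# A hypothetical 400 cd/m² panel with 0.4 cd/m² blacks:
print(f"{contrast_ratio(400, 0.4):,.0f}:1")     # 1,000:1
# A hypothetical 600 cd/m² panel with True Black (0.0005 cd/m²) blacks:
print(f"{contrast_ratio(600, 0.0005):,.0f}:1")  # 1,200,000:1
```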
Tone mapping. Content that was created with HDR, such as movies or games, can have much higher brightness values than what an HDR screen can actually display. For instance, some sequences in a movie might have brightness levels of over 1000 cd/m², but the HDR screen on which you're watching it has a peak brightness of 400 cd/m². What happens then? You might be tempted to think that any parts of the image that are brighter than 400 cd/m² are lost. They're not, at least not entirely. What HDR screens do is something called tone mapping: they use algorithms to compress the brightness of the source images so that it doesn't exceed the screen's peak brightness. Sure, some information is lost this way, and contrast can actually look worse than on an SDR (Standard Dynamic Range) screen. However, images still retain more detail than on SDR screens.
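To make the idea concrete, here is a minimal tone-mapping sketch: luminance below a "knee" point passes through unchanged, and everything above it is compressed into the remaining headroom. Real displays use far more sophisticated, usually proprietary, curves; the knee position here is an arbitrary assumption.

```python
def tone_map(l_in: float, display_peak: float = 400.0, knee: float = 0.75) -> float:
    """Compress scene luminance (cd/m²) into a display's range.

    Below the knee the curve is the identity; above it, a saturating
    curve squeezes arbitrarily bright input under display_peak.
    """
    k = knee * display_peak  # compression starts at 300 cd/m² for a 400 cd/m² panel
    if l_in <= k:
        return l_in
    excess = (l_in - k) / (display_peak - k)  # 0 at the knee, unbounded above
    return k + (display_peak - k) * excess / (1.0 + excess)

print(tone_map(200.0))   # 200.0: below the knee, passed through unchanged
print(tone_map(1000.0))  # 387.5: a 1000 cd/m² highlight, kept under 400
```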
Metadata. In order for an HDR screen to be able to display HDR content, regardless of whether it is a movie or a game, that content must be created with HDR. You can't just film a movie in SDR (Standard Dynamic Range) and expect it to be displayed in HDR on a TV, for example. Content that's created with HDR stores information called metadata about how it should be displayed. The devices on which you play the content then use that information to decode it correctly and apply just the right amount of brightness. The problem is that not all HDR formats use the same kind of metadata. HDR10 uses static metadata, which means that the settings applied to how the content is displayed are the same from beginning to end. HDR10+ and Dolby Vision, on the other hand, use dynamic metadata, which means that the displayed images can be adjusted on the fly. In other words, HDR content can use different ranges of brightness in different scenes, or even in each frame of a video.
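The contrast between the two approaches can be sketched as data structures. MaxCLL and MaxFALL are real HDR10 static-metadata values (maximum content light level and maximum frame-average light level); the rest of the layout is simplified for illustration:

```python
from dataclasses import dataclass

@dataclass
class StaticMetadata:
    """HDR10-style: one set of values for the entire stream."""
    max_cll: int   # MaxCLL: brightest pixel anywhere in the content, cd/m²
    max_fall: int  # MaxFALL: brightest frame-average luminance, cd/m²

@dataclass
class SceneMetadata:
    start_frame: int
    max_luminance: int  # cd/m², valid only for this scene

@dataclass
class DynamicMetadata:
    """HDR10+/Dolby Vision-style: values that change per scene or frame."""
    scenes: list[SceneMetadata]
```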
You might have noticed that we didn't mention anything about HLG yet. HLG comes from Hybrid Log Gamma and represents an HDR standard that allows content distributors, such as television companies, to broadcast TV content that's both SDR (Standard Dynamic Range) and HDR (High Dynamic Range) using a single stream. When that stream reaches your TV, the content is displayed either in SDR, or in HDR, depending on what your TV is capable of.
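HLG achieves this compatibility through the shape of its transfer curve, standardized in ITU-R BT.2100: the lower part of the signal range follows a conventional gamma-like square-root law that SDR TVs can display sensibly, while the upper part switches to a logarithmic curve that carries the HDR highlights. A sketch of the curve:

```python
import math

# HLG opto-electrical transfer function (ITU-R BT.2100).
# Maps normalized scene light e in [0, 1] to a signal value in [0, 1].
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e: float) -> float:
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)          # gamma-like part: SDR-compatible
    return A * math.log(12.0 * e - B) + C  # logarithmic part: HDR highlights

print(hlg_oetf(1.0 / 12.0))  # 0.5: the two branches meet here
print(hlg_oetf(1.0))         # ~1.0: peak scene light maps to full signal
```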
What is DisplayHDR?
Furthermore, to make things a bit more complicated, besides HDR formats, there's also an HDR performance specification called DisplayHDR. Devices that bear the DisplayHDR certification meet a series of standards that ensure they can display HDR images at a certain quality. If you've searched the internet or electronics shops for a new TV or monitor to buy, you might have stumbled across the terms DisplayHDR 400, DisplayHDR 600, DisplayHDR 1000, or others like that. What do they mean?
VESA (Video Electronics Standards Association), an international association of over 200 companies from all over the world, creates and maintains technical standards for all kinds of video displays, including TVs and computer monitors. One of the fields in which they have established such standards is HDR. Their standards for HDR displays are called DisplayHDR, and all of them apply to screens that support at least HDR10. To be DisplayHDR-certified, a TV, monitor, or any other device with an HDR display must meet certain brightness standards, among other more technical specifications; the tier names themselves encode the headline requirement, as sketched below.
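As a rough guide, the number in each DisplayHDR tier name is the minimum peak luminance, in cd/m², that certification requires (the separate True Black tiers additionally demand blacks at or below the 0.0005 cd/m² threshold mentioned earlier). The full specification also covers black level, color gamut, bit depth, and more, all of which this simplified lookup omits:

```python
# Minimum peak luminance (cd/m²) per DisplayHDR tier; the tier name
# encodes the value. Other certification requirements are omitted here.
DISPLAYHDR_MIN_PEAK = {
    "DisplayHDR 400": 400,
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
}

def tiers_met(measured_peak: float) -> list[str]:
    """List every tier whose peak-luminance floor the panel reaches."""
    return [tier for tier, floor in DISPLAYHDR_MIN_PEAK.items()
            if measured_peak >= floor]

print(tiers_met(650))  # ['DisplayHDR 400', 'DisplayHDR 500', 'DisplayHDR 600']
```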
How do I enable HDR on my Windows 10 PC?
As a final note on HDR, we'd like to tell you a bit about HDR and Windows 10. If you are using a Windows 10 computer or device, an interesting fact is that this operating system supports only HDR10.
Furthermore, you can enjoy HDR content in games or movies on your PC only if you're using an Nvidia GTX 900-series, GTX 10-series, GTX 16-series, or RTX 20-series graphics card. If you're an AMD user, you need an AMD Radeon R9 380 or 390 series graphics card, or a Radeon RX 460 and up. If your PC meets these criteria and you also have an HDR monitor, you might want to learn how to enable HDR on it: How do I turn on HDR on my Windows 10 computer?
Is there anything else you would like to know about HDR?
We hope that we've managed to shed a bit of light on what HDR is and why you would want it on your screens. Do you intend to buy a monitor or a TV that offers HDR support? Do you already have one or more? Share your opinions and questions, if you have some, in the comments section below, and we'll do our best to help.