AMD Bringing Better Pixels to PC, HDR and Larger Color Space to Consumers

Posted Dec 8, 2015

At the recent Radeon Technologies Group Summit, AMD announced their plans to increase the quality of the color in the pixels being pushed to the screen. The idea is to carry more color and luminance information in each pixel, and to encode and map it in such a way that the result looks vastly better. By doing so, they’ll be able to render scenes that more closely match what the human eye can actually perceive.

Better pixels, more color and a better experience: AMD is bringing HDR to everyone.

The basics are that AMD wants to bring 10-bit color rendering to every type of workflow, especially games, and that they also want to introduce the world of true HDR to consumers. We’ll go over what HDR actually is further down.

This philosophy starts with being able to render and send a full 10bpp color signal to the monitor, which all current-generation Radeon GPUs can do. In contrast, only NVIDIA’s professional line actually supports that. They don’t want to stop there, though.
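To put that jump in perspective, a quick back-of-the-envelope calculation (purely illustrative, not AMD's own figures) shows how many shades each encoding can represent per channel and in total:

```python
# Rough illustration of why 10 bits per channel matters.
for bits in (8, 10):
    levels_per_channel = 2 ** bits          # distinct shades per R/G/B channel
    total_colors = levels_per_channel ** 3  # combinations across three channels
    print(f"{bits}-bit: {levels_per_channel} levels/channel, "
          f"{total_colors:,} colors total")

# 8-bit:  256 levels/channel,  16,777,216 colors total
# 10-bit: 1024 levels/channel, 1,073,741,824 colors total
```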

The idea is to eventually have 10bpp-capable monitors available in the mainstream, with monitors and TVs capable of higher peak brightness (nits), though not just to have a “brighter” image, but to provide a better image as well. Currently there are very few monitors aside from professional models that actually support true 10-bit color. Most that even claim to display over a billion colors are actually 8-bit panels that use FRC to approximate the missing shades by dithering between the colors the panel can actually display, and the result is generally poor. AMD expects more true 10-bit monitors to be available in the second half of 2016.
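As a rough idea of how FRC works, the panel alternates between the two nearest 8-bit levels over successive frames so that, on average, your eye perceives something close to the intended 10-bit value. A minimal sketch of the concept (illustrative only; not how any particular panel implements it):

```python
def frc_approximate(value_10bit, num_frames=4):
    """Approximate a 10-bit level (0-1023) on an 8-bit panel (0-255)
    by alternating between the two nearest 8-bit levels over time."""
    low = value_10bit // 4                    # nearest 8-bit level below
    high = min(low + 1, 255)                  # nearest 8-bit level above
    fraction = (value_10bit % 4) / 4          # how far between the two levels
    frames = [high if (i / num_frames) < fraction else low
              for i in range(num_frames)]
    average = sum(frames) / num_frames        # what the eye integrates over time
    return frames, average

frames, avg = frc_approximate(513)            # a 10-bit level with no exact 8-bit match
print(frames, avg)                            # [129, 128, 128, 128] -> 128.25 (513 / 4 = 128.25)
```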

So how are they planning to go about bringing us better colors? Better encoding of colors, better tone-mapping and support for the creation of better monitors. In order to actually display colors on our monitors, information about each pixel is encoded into the signal, and that encoding determines what we end up seeing. Tone-mapping refers to the processing techniques used to map one set of colors, usually one carrying more information, onto a smaller set in order to approximate the appearance of a high dynamic range, and it doesn’t always work out for the better. And of course there are very few monitors that are truly 10-bit capable.


Currently most media is designed around a standard known as Rec. 1886, which uses 8 bits of color per channel. It’s limited and generally covers only the sRGB color space. The 10-bit ST 2084 encoding standard is the one used by professionals working in the imaging sector, and it covers far more color than you could imagine.
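For the curious, ST 2084 (the “PQ” curve) defines how absolute luminance, all the way up to 10,000 nits, is mapped to a 10-bit code value. A rough Python rendering of the published inverse-EOTF formula, using the constants from the standard (a sketch, not production code):

```python
# SMPTE ST 2084 "PQ" inverse EOTF: absolute luminance -> encoded 10-bit signal.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance (0..10000 cd/m^2) to a 10-bit code value."""
    y = max(0.0, min(nits / 10000.0, 1.0))   # normalize to the 10,000-nit ceiling
    signal = ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2
    return round(signal * 1023)              # quantize to 10 bits

for nits in (0.1, 100, 1000, 10000):
    print(nits, pq_encode(nits))
```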

Though we’re trichromats and can discern only around 7-10 million different colors, the more colors a monitor and the media it displays can reproduce, the better and more realistic the image can look, since different lighting conditions can be replicated correctly. The problem then lies in mapping those colors to what your display is capable of, whether it’s an HDR-capable monitor or not. That mapping is called tone-mapping, and it is here that new headway will also be made. The better we can map a particular shade of green that an artist is trying to convey to the closest color your hardware can reproduce, the better it’ll be for everyone and for all types of content.
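Tone-mapping itself can be as simple or as sophisticated as the content demands. As a toy example (not anything AMD described), the classic Reinhard operator compresses an unbounded HDR luminance into the 0-1 range a standard display can actually show:

```python
def reinhard_tonemap(hdr_luminance):
    """Classic Reinhard operator: compress HDR luminance (0..inf) into 0..1."""
    return hdr_luminance / (1.0 + hdr_luminance)

# Very bright scene values get squeezed toward white instead of clipping hard.
for lum in (0.1, 1.0, 4.0, 16.0, 100.0):
    print(f"{lum:7.1f} -> {reinhard_tonemap(lum):.3f}")
```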

AMD wants to make HDR a big part of all aspects of computing in the near future. But before we get into the specifics of what AMD is planning and what that means for the industry, it’s vitally important to discuss what HDR actually is.

For most consumers, High Dynamic Range is associated with photos and videos that are overexposed and have exaggerated, oversaturated colors that can look good in some instances but horrid in others. What’s being sold as “HDR” there is really an artistic choice made so that something “pops” and looks more eye-catching. It’s a blatant exaggeration, and it simply isn’t true HDR.


What really is HDR, then? Put simply, it is the pursuit of showing a higher dynamic range of luminance in order to better represent the colors and brightness that the human eye can actually perceive. It’s the absolute opposite of what you may have thought: not a flat, “neutral” representation of a given scene, but a representation of what you might actually be seeing at any given time.

So no, AMD isn’t looking to oversaturate games and images to make them look ridiculous. They want to match the number of colors and the amount of brightness that your GPU can render to our natural ability to perceive them, so that we get a much better looking image. A natural image.

There are several things needed in order for AMD to achieve their dream of bringing better pixels to consumers. This is something that needs a whole solution, from GPU to monitor and everything in-between, to actually be viable. Certainly it’s possible, and for AMD, it’s all about making the experience better, not necessarily outright faster.

We need better pixels

And we do, not necessarily drastically so, but this type of investment in the future of visual computing is a welcome one. If we can get past the typical assumption of what HDR is, then you can hopefully grasp what this means for gaming. This won’t be as dramatic a difference as moving from 16-bit internally rendered color to 32-bit, but it’ll still be quite a difference when combined with increased luminance ranges and contrast ratios. Of course, it all has to be used right to make a drastic difference.
