What Happened To Radeon FRTC And Why You Should Chill


Let's talk about Radeon Chill, but first, where is FRTC? AMD introduced FRTC, or Frame Rate Target Control, back when it launched the Fury line of graphics cards as a way of capping the maximum framerate in 3D applications. Doing so had a couple of major benefits. The first, and most obvious, is limiting the top end of the frames your graphics card renders, which can keep your game from exiting the top of the FreeSync window on FreeSync panels; this was the era when the push for variable refresh rate was getting really strong. The second major benefit is power savings: by no longer running the GPU at full throttle when it isn't necessary, you can greatly reduce power consumption.

In the recent major update to Radeon Software in December of 2019, many users noticed this feature was gone. Poof, finito, adios muchachos. But there's no need to panic or worry if that was something you used, as Radeon Software still has a built-in feature that covers the same needs. Introducing, for those who are unfamiliar with it, Radeon Chill. Radeon Chill has been with us for some time and delivers a similar result, at least on one end, to what FRTC did, but adds the benefit of setting a lower framerate for the GPU to target in low-intensity situations. Those wanting it to work like FRTC simply need to set the Min FPS and Max FPS to the same number.


Now on to Chill and why you should consider it. Radeon Chill uses a dynamic algorithm to maintain FPS within a set range. Don't mistake this for some kind of dynamic settings or resolution scaling, because it doesn't change anything but the target framerate, so you can't set something higher than your GPU can achieve and expect to magically get more performance. In the simplest terms, the Chill algorithm detects movement intensity from your mouse (or camera control stick on a gamepad) and ramps up the GPU workload to push framerates toward the target maximum. Stop moving the camera and the GPU relaxes, letting the framerate fall toward the lower limit.
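To make the idea concrete, here is a toy Python sketch of a Chill-style limiter. To be clear, this is not AMD's actual algorithm or any real driver API; the function names and the simple linear intensity-to-FPS mapping are assumptions made purely for illustration of the concept described above.

```python
# Toy sketch of a Chill-style dynamic frame limiter (illustration only,
# NOT AMD's actual algorithm). Input intensity slides the frame-rate
# target between the configured Min FPS and Max FPS.

def target_fps(input_intensity: float, min_fps: int = 40, max_fps: int = 75) -> float:
    """Map input intensity (0.0 = camera idle, 1.0 = rapid movement)
    to a frame-rate target inside the configured window."""
    intensity = max(0.0, min(1.0, input_intensity))
    return min_fps + (max_fps - min_fps) * intensity

def frame_budget(input_intensity: float, min_fps: int = 40, max_fps: int = 75) -> float:
    """Seconds per frame for the chosen target; a game loop would idle
    away any time left over after rendering each frame."""
    return 1.0 / target_fps(input_intensity, min_fps, max_fps)

# Camera idle -> limiter settles at the bottom of the window;
# fast mouse movement -> it ramps toward the top.
print(target_fps(0.0))  # 40.0
print(target_fps(1.0))  # 75.0
```

With the 40-75 window used in our test scenario, a half-intensity input would land at 57.5 FPS, which is the "ramping" behavior you can watch play out in the charts below.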

But who would want to do something that crazy? If you're one of those people who want to keep their performance within the variable refresh rate window of their panel, then you'll definitely benefit from this. The results we are about to look over are a basic example of a realistic scenario: someone has purchased a new Radeon RX 5500XT 4GB video card and a 1080p FreeSync monitor with a variable refresh rate window of 40-75Hz (very common), and we're going to look at the impact of setting Radeon Chill to target that framerate window versus letting the game run wide open and tear all over the place. We'll be using our test run of Resident Evil 2 set to the 'Balanced' graphical preset.

Radeon Chill settings used.

Thankfully for our results, we were able to pull the numbers from Radeon Software's Performance Logging. If you'd like a tutorial on how to use it, sound off in the comments. We set the logging to a 0.25-second interval, so each tick on the charts' timeline represents a quarter of a second. We did this to get measurements as accurate as possible across the test pass.
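If you want to crunch a log like this yourself, here's a minimal Python sketch that averages columns from a Performance Logging style CSV. The column names and the sample rows below are invented for illustration; check the header row of your own exported log before adapting this.

```python
import csv
import io

# Hypothetical sample of a Performance Logging CSV; the column names and
# values here are made up for illustration. At a 0.25 s logging interval,
# four rows cover one second of the run.
SAMPLE_LOG = """\
FPS,GPU CLK (MHz),GPU PWR (W)
112,1650,78
108,1630,75
75,1100,41
75,1080,40
"""

def column_average(rows, column):
    """Average a numeric column across all logged samples."""
    values = [float(row[column]) for row in rows]
    return sum(values) / len(values)

rows = list(csv.DictReader(io.StringIO(SAMPLE_LOG)))
print(f"avg FPS:   {column_average(rows, 'FPS'):.1f}")          # 92.5
print(f"avg power: {column_average(rows, 'GPU PWR (W)'):.1f}")  # 58.5
```

Swap `io.StringIO(SAMPLE_LOG)` for `open("your_log.csv")` to run it against a real capture.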

Frame Rate

You can see in the frame rate charts that the RX 5500XT was able to maintain over 100 frames per second across the entire test run, but on a 75Hz panel you could almost consider that wasted performance, as the monitor wouldn't be able to display all of those frames, resulting in tearing. If you used VSYNC on a 75Hz panel you could easily lock it to that frame rate. But since our scenario makes use of a 40-75Hz FreeSync window, we're able to let the game slow down all the way to the bottom of the window and still maintain a smooth, tear- and stutter-free experience. But how does this translate to the card itself?


Core Frequency

Monitoring the core frequency of the GPU over the run is very interesting. Under the default configuration the core clock maintains a very consistent line, but things are much different with Radeon Chill in effect. During the early, most intense part of the run the GPU core has to maintain a higher frequency to hit the targeted frame rates, but as the test continues into a less intensive scene you can see it drop significantly. Take particular note of the bottom sections: these are scenes where I stopped moving for 5 seconds to see if the game would detect it and drop the framerate, and whether I could even perceive it happening. In all fairness, I was convinced it didn't work because I didn't notice a change at all.

Edge Temp

Thermals were definitely impacted by the use of Radeon Chill, along with fan behavior, as the card was noticeably quieter with Radeon Chill enabled. Part of the reason the temperatures kept falling is that the card was initially warm from several passes of the unrestricted run, so the thermals would likely continue to drop over time. By the end of the pass, the delta between the two configurations of the exact same card was quite substantial.

Core Power

Power draw while using Radeon Chill was probably the part I was most interested in seeing, and this feature did not disappoint. You can see in the early portion of the test, where the game is effects-heavy, that power draw was much closer to the default run, but when we moved into the less intense sections the power required to maintain the frames fell significantly. It is very important to remember that mileage WILL vary depending on game, scene, targeted framerate, settings, and more.

As Mr. Freeze Would Say, Alright Everyone! Chill!

Was I sad to see FRTC go? Kinda, but once I got to work with Radeon Chill and looked over the results, I'm not so concerned about it anymore. Radeon Chill is a pretty good piece of tech nestled in the heart of Radeon Software that I think has been getting a bit overlooked as time goes on. While I checked out the more entry-level Navi-based cards, I can't help but wonder what kind of impact it might have on the much more powerful Radeon RX 5700XT; perhaps that needs further exploration. But at the end of the day, I can't help but recommend that those with a variable refresh rate monitor and a modern Radeon graphics card give Radeon Chill a go and see how it works for them.