Does Riva Tuner Statistics Server Overlay Impact AMD Radeon Performance?
Using MSI Afterburner with RivaTuner Statistics Server (RTSS) is an excellent way to see how your system is performing in real time; from CPU utilization to frame times, it's about as handy as it gets. However, I've seen claims of a performance impact on Radeon cards when using this program, to the point where there has been talk of dropping support for RTSS over it. I've been just as guilty as others of running RTSS while recording gameplay footage to show the performance differences between Radeon and GeForce cards. My understanding from the internet is that Radeon is affected whereas GeForce is not. If that's true, then in performance comparisons between, say, an RX 480 and a GTX 1060, what we see is what is happening, but the RX 480's results could be lower than what the card is actually capable of.
Thankfully, for all of my benchmark testing I've never run with any overlay on; even with OCAT I keep the overlay disabled, along with all other monitoring software, just so I don't introduce any possibility of interference. Today, I'm going to test a few games and Fire Strike to see whether this impact is measurable, or whether it's just an internet myth.
Testing was fairly simple on our GPU test bench. I decided to go with the RX 480 8GB and the GTX 1060 6GB as they're representative of the majority of users out there these days and offer a baseline that is easy to hit. Higher-end cards like the GTX 1070 and RX Vega 56 are a little more fickle about maintaining a constant clock speed through testing. This is easy to do with the RX 480 8GB and the GTX 1060 6GB: raising the power target slider in Radeon Settings lets the Radeon card stay locked at its boost frequency, and the GTX 1060 6GB has held 1886MHz over prolonged runs in all our testing. I also allowed a cool-down period between runs and let the cards return to idle temperatures. Consistency is key here, as we'll be looking for changes in performance.

We are running four different games across various APIs and engines, along with Fire Strike, to see if there is an impact with RivaTuner Statistics Server running with its overlay enabled versus without it or Afterburner running at all. And for those curious, the GTX 1060 6GB was included as a control, to verify whether the Radeon's competitor is impacted as well.
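Since we'll be judging whether RTSS causes a change larger than normal run-to-run variance, here's a minimal sketch of how that noise floor can be quantified from repeated runs (the function name and the sample numbers are my own illustration, not data from our testing):

```python
import statistics

def run_spread(fps_runs):
    """Percent spread of repeated benchmark runs around their mean.

    A difference between two configurations that is smaller than this
    spread can't be distinguished from run-to-run noise.
    """
    mean = statistics.mean(fps_runs)
    return (max(fps_runs) - min(fps_runs)) / mean * 100

# Hypothetical average FPS from three back-to-back runs of one game
baseline = [62.4, 62.9, 62.6]
print(f"margin of error ~{run_spread(baseline):.1f}%")
```

Only when the overlay-on result falls outside this band do we count it as a real impact rather than noise.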
|Component|Spec|
|---|---|
|CPU|i5-8600K @ 5GHz|
|Memory|16GB GeIL EVO X DDR4-3200|
|Motherboard|EVGA Z370 Classified K|
|Storage|ADATA SU800 128GB, 2TB Seagate SSHD|
|PSU|Cooler Master V1200 Platinum|
Graphics Cards Tested
|GPU|Architecture|Core Count|Clock Speed|Memory Capacity|Memory Speed|
|---|---|---|---|---|---|
|NVIDIA GTX 1060 FE 6GB|Pascal|1280|1506/1708MHz|6GB GDDR5|8Gbps|
|XFX RX 480 8GB|Polaris 10|2304|1266MHz|8GB GDDR5|8Gbps|
UPDATED: Vulkan Results
After running the first few tests, we were clued in that there is a pretty noticeable impact in DOOM and Wolfenstein II, so we had to try those out as well, and sure enough, there was.
So, is RTSS crippling Radeon cards? Not quite, but there is enough of an impact at times that it's not worth risking the quality of your results. In the games we tested it had minimal impact on the averages, but half of the time it did affect the 0.1% minimums beyond margin-of-error levels. This was something I really wanted to look into after seeing some pretty egregious claims of 5% or more floating around the web. While these games aren't representative of everything that can be played, they do show that a level of impact is there; it would likely be magnified on higher-end Radeon cards like the RX Vega lineup, and could skew results and side-by-side comparisons of competing cards. We saw no impact on the GTX 1060 6GB beyond margin of error, so no worries there.

If you're a Radeon owner who still wants an overlay but doesn't want to risk a performance hit, the built-in Radeon Overlay in Radeon Settings is a very valid option. Part of me wonders whether the large swings some people online have found come from multiple runs with steadily rising core temperatures lowering performance from one run to the next. That's why we normalized thermals before proceeding and ran in a way that kept clock speeds 100% constant, so the only variable was whether RTSS was engaged and the overlay present.
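For context on those 0.1% minimum figures, here's a minimal sketch of one common way such "low" framerates are derived from a frame-time capture: average the slowest slice of frames and convert back to FPS. The function name and the sample numbers are my own illustration (tools like OCAT export per-frame times in milliseconds):

```python
def low_percentile_fps(frametimes_ms, percent):
    """Average FPS over the worst `percent` of frames.

    frametimes_ms: list of per-frame render times in milliseconds.
    The slowest frames (largest frame times) are the 'lows'.
    """
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * percent / 100))   # size of the worst slice
    avg_ms = sum(worst[:n]) / n                   # mean frame time of that slice
    return 1000.0 / avg_ms                        # convert ms/frame to FPS

# Example: mostly 16.7ms frames (~60 FPS) with a few stutters
frames = [16.7] * 996 + [33.3] * 3 + [100.0]
print(low_percentile_fps(frames, 1))    # 1% low
print(low_percentile_fps(frames, 0.1))  # 0.1% low
```

Because the 0.1% figure rests on so few frames, it's exactly the metric most easily nudged outside the margin of error by an overlay hooking into the render path.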
Have you found anything like this in your own testing? Is this something you've heard about or noticed? Is there any other work like this you'd like to see in the future? Sound off in the comment section.