AMD Radeon RX Vega 64 Liquid Cooled Power Profiles Exploration
I want to start this one off by making clear that this is not a review of the Radeon RX Vega 64 Liquid Cooled Edition. What it is, rather, is a look across 15 games at the performance delta between the 'Balanced' and 'Power Save' settings. RX Vega 64 launched two months ago as of this publication, on August 14, 2017, and plenty of comprehensive comparative reviews have appeared since then; I urge you to seek those out if you're looking for in-depth comparisons.
One of the most criticized aspects of the Radeon RX Vega 64 has been its power consumption, which appears to be quite sensitive to clock rates, so we wanted to explore AMD's built-in options for reducing power draw. That is the main focus of these results.
The image above shows the four power options available in Wattman: Power Save, Balanced, Turbo, and Custom. The ever-popular undervolting solution is handled through the Custom setting. We, however, will be focusing on 'Balanced' and 'Power Save' to see how flexible RX Vega is when allowed to run on its own predefined settings.
The table above shows the RX Vega power limits based on the vBIOS switch and software settings; the highlighted segments are what we tested: the standard out-of-the-box vBIOS with the Balanced profile, and the secondary vBIOS with the Power Save profile. One important note here is that our card is the liquid cooled model; air cooled variants will see different results because of varying thermals.
Test Methodology and Setup
For DX11 and OpenGL we utilized FRAPS to capture performance during our runs. For our DX12 testing I want to take a moment to again thank a friend of the site for helping develop a tool that now allows us to deliver the same level of results for DX12 and Vulkan as we have for DX11. DX12 tests were conducted with OCAT. Each test is run three times, and the averages for average FPS, 1% low, and 0.1% low are taken and plotted. We use 1% and 0.1% lows rather than the absolute minimum because the absolute minimum typically represents an outlier frame and doesn't reflect actual gameplay.
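The reduction described above can be sketched roughly as follows. This is a minimal illustration, not the site's actual tooling: the function names and the percentile-based definition of the lows are my own assumptions about how frametime logs from tools like OCAT are commonly reduced.

```python
import numpy as np

def summarize_run(frametimes_ms):
    """Reduce one run's per-frame render times (in ms) to the three
    plotted metrics: average FPS, 1% low FPS, and 0.1% low FPS.
    The lows are taken as the FPS at the 99th / 99.9th percentile
    frame time, i.e. the boundary of the slowest 1% / 0.1% of frames."""
    ft = np.asarray(frametimes_ms, dtype=float)
    avg_fps = 1000.0 / ft.mean()
    low_1 = 1000.0 / np.percentile(ft, 99.0)
    low_01 = 1000.0 / np.percentile(ft, 99.9)
    return avg_fps, low_1, low_01

def average_runs(runs):
    """Average each metric across the three repeated runs."""
    metrics = [summarize_run(r) for r in runs]
    return tuple(sum(m) / len(m) for m in zip(*metrics))
```

A perfectly steady run at 20 ms per frame would report 50 FPS for all three metrics; real runs show the lows dipping well below the average.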
X370 Test Bench
|CPU||Ryzen 7 1700 3.9GHz|
|Memory||16GB G.Skill Flare X DDR4 3200|
|Motherboard||MSI X370 XPower Gaming Titanium|
|Storage||Adata SU800 128GB, 2TB Seagate SSHD|
|PSU||Cooler Master V1200 Platinum|
Graphics Cards Tested
|GPU||Architecture||Core Count||Clock Speed||Memory Capacity||Memory Speed|
|NVIDIA GTX 1080 FE||Pascal||2560||1607/1733||8GB GDDR5X||10Gbps|
|RX Vega 64 Liquid Cooled||Vega 10||4096||1406/1677||8GB HBM2||1.89Gbps|
It's very important to note that the GTX 1080 FE is included here only as a point of reference; this is not a direct comparison of the two graphics cards. A better comparison for purchasing decisions would have been the RX Vega 64 Air Cooled, or the RX Vega 64 LC against the GTX 1080 Ti, where the pricing is more in line. I wanted to get that cleared up in case anyone is wondering about the inclusion of the GTX 1080.
Ashes of the Singularity: Escalation
AotS: Escalation was tested using the built-in benchmark utility at the High preset under DX12. Interesting here is how close performance stays between the two RX Vega profiles at 1440p, while a disparity begins to show at 4K. This trend continues more often than not, and is more noticeable on modern APIs.
Battlefield 1
Battlefield 1 was tested using the same scene as our performance review of the game, under DX11 with the Ultra preset. Notice that in DX11 there is very little performance difference between the two profiles.
Dirt 4
Dirt 4 was tested with a 60-second run on the opening track, from inside the car, using the Ultra preset with 2xMSAA. Here again we see very little difference between the Balanced and Power Save settings.
DOOM
DOOM was tested using Vulkan and the Ultra preset while avoiding AA; async compute still functions with AA disabled. Testing in The Foundry, we finally see our biggest delta between the two settings, especially at 4K. DOOM appears to be a bit more sensitive to clock rates than anticipated, particularly at 4K, where the 0.1% metrics tank.
Deus Ex: Mankind Divided
DXMD was tested much like our performance review of the game, using the built-in benchmark tool with the High preset and no AA. At 1440p the performance holds strong, but again at 4K under the new API we see a large drop in the 0.1% minimums.
For Honor
For Honor still uses DX11. Paired with the Ultimate detail settings, we see Power Save holding tight with the Balanced profile, even at 4K.
Gears of War 4
GoW 4 was tested using the same method as our performance review of the game. Using the Ultra setting, and verifying the settings stayed consistent, we once again see Power Save fall short at 4K but hold fine at 1440p.
Grand Theft Auto V
GTA V remains an old favorite of readers, so it had to be included. Testing with the built-in benchmark, we ran all settings at 'Very High', but opted for FXAA instead of any MSAA. Just as expected with DX11, we see no real difference between the power profiles.
HITMAN
HITMAN was tested at the highest settings with SMAA using the built-in benchmark tool. This is the first DX12 title that isn't majorly impacted by the GPU's clock rate. 4K sees a slightly lower average but maintains the same minimums.
Prey
Prey was tested just like our performance review, at Very High with FXAA. Once again we see very little change, at least in the minimums; the average dropped slightly with Power Save.
Resident Evil 7
RE:7 saw a reprisal of our test run from our performance review of the title. Running at Very High settings with SSAO and FXAA shows little change in performance based on the power profile, as expected at this point from a DX11 title.
Rise of the Tomb Raider
RotTR was tested in DX12 using the High preset with no AA. Our run took us through the Geothermal Valley. Surprisingly, this title shows fairly close performance between the power profiles in DX12.
The Witcher 3
The Witcher 3 was run at High settings with HairWorks disabled for these tests. And just like the previous DX11 games, we see little to no difference between the profiles.
Total War: Warhammer
TW: Warhammer was tested using the built-in benchmark tool with the Ultra preset. Here again the 4K results take a hit with the Power Save profile while 1440p remains similar.
Ghost Recon: Wildlands
GR: Wildlands was tested the same way as for our performance review, using the High preset. And once again the DX11 results fall in line with each other.
Power and Thermals
Power and thermal testing could have come toward the beginning of this article, but I wanted to let the performance results sink in before getting to this point. All power and thermal testing was done on the X370 system inside a Cooler Master MasterCase Pro 5 with two 140mm intake fans at 1,000 RPM. Power levels were taken at the wall, so they represent total system draw. Ambient room temperature was 22°C. Measurements were taken after 30 minutes of cycling the Dirt Rally benchmark run at 4K.
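As a rough illustration of how a series of wall readings reduces to the charted numbers, here is a minimal sketch. The function name and sample values are hypothetical, not our actual logging setup; a real 30-minute log at one sample per second would hold around 1,800 entries.

```python
def summarize_power(samples_w):
    """Reduce wall-power samples (in watts) logged over the benchmark
    loop to the average and peak total system draw."""
    return sum(samples_w) / len(samples_w), max(samples_w)

# Hypothetical per-second samples taken at the wall (illustrative only).
avg_w, peak_w = summarize_power([310.0, 342.5, 355.0, 348.5])
```

Because the readings are taken at the wall, they include CPU, drives, and PSU conversion losses, so profile-to-profile deltas are more meaningful than absolute figures.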
I can understand a little confusion around the thermals in the chart: only 1°C lower with the Power Save profile? True, it was only 1°C lower, but the catch here was the fan speed. The Balanced profile saw a maximum fan speed of around 1,600 RPM, while Power Save never broke 1,000 RPM, hovering around 950 RPM and rendering it dead silent.
Power draw is the most impressive change in all of these charts. The performance delta was minimal at most and temperatures didn't change much, but boy howdy did the power levels. Sure, it's not down to GTX 1080 levels of power sipping, but over a 100W reduction at the wall is nothing to shake a stick at, or complain about.
So, what does this all show? Simply put: you can save a lot of power on Vega and lose minimal performance. I understand wanting the strongest numbers your product can push out, but part of me wonders how the narrative would have changed had RTG gone with the Power Save profile as the default. Performance would have suffered a bit, but wow at the power draw change. One thing it shows is that Vega really does have a wide band of consideration when discussing performance per watt. It's still behind the GTX 1080 by a healthy margin in that regard, but the Power Save profile helps close that gap considerably. And if you're running an RX Vega card on a FreeSync monitor with a generous VRR window, you'll likely never know the difference between the profiles. That's especially true on a 1440p panel; 4K seems to be a bit of a different story. After exploring this, I can really see where the FreeSync argument stands.