Red Dead Redemption 2 PC Performance Explored – Finally
There's more RAGE to Red Dead Redemption 2's PC launch than just the engine. The PC version of Red Dead Redemption 2 had leaked at least once a month since the game launched on Xbox and PlayStation just over a year ago, and this past week it finally released amid much anticipation on the new Rockstar Games Launcher and the Epic Games Store. Neither launch went smoothly, to say the least. RDR2 uses the latest iteration of the Rockstar Advanced Game Engine, or RAGE, and has ditched DX11 in favor of letting you choose between Vulkan and DX12 as your preferred API. Plenty of people are having issues that prevent the game from even launching, and those who can get in are finding not only a graphical feast but one that harkens back to Crysis in its ability to bring even the strongest systems to their knees. This has been one of the most frustrating and time-consuming performance analyses I have done in quite some time, thanks to the occasional crash and the constant restarts required after simply changing the resolution, with each and every benchmark run.
Thankfully, Red Dead Redemption 2 has a built-in benchmark utility that is fairly representative of typical gameplay, and we used it to measure performance. We used the last section of the benchmark, which covers a 128-second run of typical gameplay, and used FrameView to extract our performance metrics rather than relying on the numbers the game presents to us.
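As a rough illustration of what "extracting metrics with FrameView" amounts to, the sketch below reduces a per-frame log to an average frame rate. FrameView writes PresentMon-style CSV logs; the `MsBetweenPresents` column name is an assumption based on that lineage, so check it against your own log files.

```python
import csv

def load_frame_times(path):
    """Read per-frame times in milliseconds from a PresentMon-style CSV.

    The 'MsBetweenPresents' column name is assumed from PresentMon-derived
    logs; verify it against the actual FrameView output.
    """
    with open(path, newline="") as f:
        return [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

def average_fps(frame_times_ms):
    # Average FPS over the whole run: total frames / total elapsed seconds.
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
```

For example, a run where every frame took 10 ms works out to an even 100 FPS.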
As far as the settings used for testing, we had to create our own 'preset'. The game has a preset slider of sorts, but it has 20 stops, and the settings it applies vary based on which graphics card you are using at the time. Someone with an RX 570 would end up with a very different combination of settings than someone with an RTX 2080 Ti, for example. We went with the settings in the screenshot below; they represent a 'High' configuration for the game.
Once we had the results from 3 runs, after discarding an initial burner run for loading purposes, we took the average of the average frame rates as well as the 99th-percentile results. We report our performance metrics as average frames per second and have moved away from 1% and .1% reporting in favor of the 99th percentile. For those uncertain what the 99th percentile represents: only 1 frame out of 100 is slower than this frame rate. Put another way, 99% of the frames will achieve at least this frame rate.
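The 99th-percentile figure described above can be computed directly from frame times: take the frame time that only 1% of frames exceed, then invert it into a frame rate. A minimal sketch (the function name is ours, not part of any benchmark tool):

```python
import numpy as np

def percentile_fps(frame_times_ms, pct=99):
    """Frame rate that at least `pct`% of frames achieve.

    The slowest (100 - pct)% of frame times sit above the cutoff,
    so we take the pct-th percentile of the frame times and invert.
    """
    cutoff_ms = np.percentile(frame_times_ms, pct)
    return 1000.0 / cutoff_ms
```

With 100 frames at 10 ms and one outlier at 20 ms, the 99th-percentile result stays at 100 FPS: the single slow frame falls into the 1% that is allowed to miss the mark.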
| Component | Details |
|---|---|
| CPU | Intel Core i9-9900K @ 5GHz |
| Memory | 16GB G.Skill Trident Z DDR4-3200 |
| Motherboard | EVGA Z370 Classified K |
| Storage | Kingston KC2000 1TB NVMe SSD |
| PSU | Cooler Master V1200 Platinum |
| Windows Version | 1903 with latest security patches |
Graphics Cards Tested
| GPU | Architecture | Core Count | Clock Speed (MHz) | Memory Capacity | Memory Speed |
|---|---|---|---|---|---|
| NVIDIA RTX 2080 Ti FE | Turing | 4352 | 1350/1635 | 11GB GDDR6 | 14Gbps |
| NVIDIA RTX 2080 SUPER FE | Turing | 3072 | 1650/1815 | 8GB GDDR6 | 15.5Gbps |
| NVIDIA RTX 2070 SUPER FE | Turing | 2560 | 1605/1770 | 8GB GDDR6 | 14Gbps |
| NVIDIA RTX 2060 SUPER | Turing | 2176 | 1470/1650 | 8GB GDDR6 | 14Gbps |
| NVIDIA RTX 2060 FE | Turing | 1920 | 1365/1680 | 6GB GDDR6 | 14Gbps |
| ZOTAC Gaming GTX 1660 | Turing | 1408 | 1530/1785 | 6GB GDDR5 | 8Gbps |
| NVIDIA GTX 1080 FE | Pascal | 2560 | 1607/1733 | 8GB GDDR5X | 10Gbps |
| NVIDIA GTX 1070 FE | Pascal | 1920 | 1506/1683 | 8GB GDDR5 | 8Gbps |
| NVIDIA GTX 1060 FE 6GB | Pascal | 1280 | 1506/1708 | 6GB GDDR5 | 8Gbps |
| AMD Radeon RX 5700 XT | Navi | 2560 | 1605/1755/1905 | 8GB GDDR6 | 14Gbps |
| AMD Radeon RX 5700 | Navi | 2304 | 1465/1625/1725 | 8GB GDDR6 | 14Gbps |
| AMD RX Vega 64 | Vega 10 | 4096 | 1247/1546 | 8GB HBM2 | 945MHz |
| AMD RX Vega 56 | Vega 10 | 3584 | 1156/1471 | 8GB HBM2 | 800MHz |
| MSI RX 580 Armor 8GB | Polaris 20 | 2304 | 1366 | 8GB GDDR5 | 8Gbps |
| Sapphire Nitro+ RX 570 4GB | Polaris 20 | 2048 | 1340 | 4GB GDDR5 | 7Gbps |
Typically for this section, we would test using the game's presets, but without actual labeled presets we had to make our own. We originally tried going off certain steppings of the Quality preset slider, but we noticed that would not work well since the settings it applies differ between cards. We did like that the game stops you from applying settings that exceed your graphics card's VRAM capacity, which is why you'll see a 0 in one of the results for the RX 570 1080p settings scaling. Long story short, each settings level was achieved by changing every value to that label; where 'Low' wasn't an option, the setting was turned to Off, and any slider was set to the corresponding level. This was the best method I could come up with for this test in this game.
After playing it, the game looks and performs fairly well with a combination of High and Medium settings outside of textures, so pick the settings you favor and ease off the others to hold your graphics card's intended target resolution. This will certainly be a game you'll come back to after a future upgrade to see what you missed the first go around.
4K settings scaling was performed using the GeForce RTX 2080Ti.
1080p settings scaling follows the same idea as the 4K settings scaling, just using an RX 570 at 1080p.
Intel Core Scaling Performance
While this test won't tell us exactly how many cores and threads the game can and will use, it does show how the game performs as the available cores and threads increase. These were tested at the same 1080p settings as the rest of our results, pairing the CPU with the RTX 2080 Ti Founders Edition. While this does not account for the cache differences you would see moving through Intel's product stack, it does give us a better idea of how the game behaves and benefits as more cores and threads become available.
Red Dead Redemption 2 puts a bit of a twist on the '4 cores is enough' argument: once you get to 4 cores, if you've got Hyper-Threading you'll be okay, but if you don't, sorry Charlie, you're in for a stutterfest. Quad cores without multithreading, and anything lower, are simply unusable. While I was able to get results from both the 4c/4t and 2c/4t configurations, both exhibited outright pauses in the game, leading me to believe it had locked up. 2c/2t would technically launch the game and let you load in, but I don't have a metric for measuring seconds per frame yet, so that result is absolutely useless, and the game shouldn't even let those with dual cores start this one up.
Vulkan vs. DX12
We ran a few tests across various graphics cards to evaluate each graphics API, and we found Vulkan to be the consistently better API for both vendors. We ran the same test as all the others with only the API changing and the resolution set to 1080p.
Graphics Card Results
So how do you sum up Red Dead Redemption 2 on the PC? I guess the only way you really can is 'demanding'. The game looks great when you can crank the settings a bit, but you'll want to reserve as much VRAM as you can for the textures and push them as high as possible, since that will be one of the most obvious improvements. Some would call the performance lacking and scream about a lack of optimization, and while for now that might be a possibility, this isn't exactly a lackluster visual title. There is so much going on in each scene that you really have to tip the hat to the developers for everything they've achieved here, and some scenes are simply breathtaking. Radeon's architecture handles the game very nicely, especially the new RDNA-based Navi cards. NVIDIA's Turing-based cards show a clear advantage over the older Pascal-based cards, likely thanks to a newer graphics pipeline that benefits more from the Vulkan API than before. It'll be interesting to see how this game evolves over time: GTA V ran reasonably well when it launched and only got better from there, and chances are we can expect a similar story for RDR2. With that out of the way, I'm going to saddle up and ride out until the next one.