Witcher 3 Benchmarked, GTX Titan Does 30 FPS At 1080p And Ultra Settings
The benchmarks have just arrived, showing that The Witcher 3 could quite possibly be the most demanding game we've ever seen. I can hear your audible gasps and shouts already: how on earth would a GTX Titan only muster 30 FPS at 1920x1080? Before some of you seize up in shock, I should quickly point out that I'm referring to the original GK110-based Kepler Titan, not the Titan X. But even so, a 30 FPS average is unprecedentedly low for what was, not long ago, the fastest GPU in the world.
The game is so demanding, in fact, that not even a GTX Titan X can manage a 60 FPS average at 1920x1080 with GameWorks effects enabled. It's not all bad news, however, because the game has the visual quality to show for it. In fact, The Witcher 3 might just be the most beautiful open-world RPG there has ever been. Take a look at some of the published 4K screenshots and judge for yourself. Without further delay, let's get straight into the numbers and what they mean.
Initial Witcher 3 Benchmarks Arrive, Beautiful Visuals Come At A Steep Price
Let's begin with 1920x1080 benchmarks running at the Ultra preset with GameWorks effects (HBAO+ and HairWorks) disabled. These results are courtesy of PCGamesHardware.de, so many thanks to the good folks over there. The tests were conducted with the latest drivers from both AMD and Nvidia, including Nvidia's recently launched Game Ready driver for The Witcher 3.
At 1920x1080 with Ultra settings you will want at least an AMD Radeon R9 285 or an Nvidia GeForce GTX 960 just for a 30 FPS average, which many PC gamers will find too slow and choppy.
If you want a more comfortable mid-40s frame rate, you will want to step up to an AMD Radeon R9 290 series card or an Nvidia GTX 970.
We're fairly certain that with a couple of settings turned down to High or Medium, a 60 FPS average will be attainable on $300-class GPUs like the 290 and the 970, while 45 FPS should be within reach of $200-class cards like the 285 and the 960. So that's the good news.
The not-so-good news is that owners of GeForce GTX 700 series cards will need to turn settings down considerably to attain playable frame rates. As mentioned at the beginning of the article, a GeForce GTX Titan only manages a 30 FPS average, just one FPS ahead of the R9 285. In fact, all Nvidia Kepler GPUs (600 series and 700 series) show almost catastrophically poor performance. Look at the GTX 660, for example, a card that competes with AMD's R9 270: it runs at little more than half the rate of the slightly faster R9 270X, averaging 13.6 FPS against the 270X's 24.2 FPS.
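For those who like the arithmetic spelled out, the GTX 660's deficit works out like this, using only the two averages quoted above:

```python
# Sanity-checking the GTX 660 vs. R9 270X averages quoted above.
gtx_660_fps = 13.6
r9_270x_fps = 24.2

ratio = gtx_660_fps / r9_270x_fps  # fraction of the 270X's frame rate
deficit = 1 - ratio                # how far behind the GTX 660 falls
print(f"GTX 660 runs at {ratio:.0%} of the R9 270X ({deficit:.0%} behind)")
```

Strictly speaking, that's about 56% of the 270X's frame rate, or a roughly 44% deficit.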
The GTX Titan and GTX 780 also perform quite poorly, falling behind their Radeon counterparts, the 290X and 290, by nearly 50%. We see the same trend with older graphics cards from Nvidia's GTX 500 series and AMD's HD 6000 series as well, which indicates that Nvidia's Game Ready drivers are optimized only for Maxwell-based GTX 900 series products, and that serious optimization work will be needed to bring the Kepler-based 600 and 700 series up to par.
On the Radeon side we actually see surprisingly strong performance for an Nvidia-partnered title. The results are more in line with how the cards normally stack up against the competition, with the R9 290 and 290X performing similarly to MSI's factory-overclocked GTX 970, and the R9 285 performing similarly to the GTX 960.
So now let's take a look at how the game performs with HairWorks enabled.
With HairWorks on versus off, we see an average performance difference of about 22% on Nvidia's 900 series Maxwell GPUs and 700/600 series Kepler GPUs. On AMD's GCN 1.2 GPUs like Tonga, the penalty is double that, and on GCN 1.1 GPUs such as Hawaii it is nearly triple the penalty seen on the Nvidia GPUs. This is thanks in part to the architectural improvements AMD introduced with GCN 1.2, particularly in relation to tessellation.
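To make those ratios concrete, here is a minimal sketch converting the reported penalties into resulting frame rates. The 60 FPS baseline is a hypothetical placeholder; only the 22% figure and the double/triple relationships come from the benchmarks above.

```python
def fps_with_hairworks(fps_off: float, penalty: float) -> float:
    """Frame rate after enabling the effect, given a fractional penalty."""
    return fps_off * (1 - penalty)

base = 60.0  # hypothetical HairWorks-off average, for illustration only
penalties = {
    "Maxwell/Kepler": 0.22,        # ~22% penalty reported on Nvidia GPUs
    "GCN 1.2 (Tonga)": 2 * 0.22,   # roughly double the Nvidia penalty
    "GCN 1.1 (Hawaii)": 3 * 0.22,  # roughly triple
}
for arch, p in penalties.items():
    print(f"{arch}: {fps_with_hairworks(base, p):.1f} FPS")
```

In other words, the same card that holds 60 FPS on Nvidia-like penalties would drop into the low 30s on Tonga and barely above 20 on Hawaii, which is why the effect is effectively unusable on those cards.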
Unfortunately, CD Projekt Red has stated that this Nvidia-optimized visual effect cannot be optimized for AMD GPUs due to particulars of the GameWorks program. This naturally caused a lot of controversy, which we distilled in an article detailing the notoriety surrounding the GameWorks program from its inception up to today; you should really check it out if you haven't already.
Before we move on, there's one more interesting point worth mentioning. In Brian Burke's response to the GameWorks backlash, he essentially blamed the tessellation performance of AMD GPUs for how poorly HairWorks and other tessellation-heavy GameWorks effects performed on Radeon cards. However, a quick look at the R9 285's tessellation performance reveals that the situation isn't exactly as Burke made it out to be. Based on tessellation performance alone, an R9 285 should see a performance hit comparable to the GTX 960's when enabling HairWorks.
However, that's not the case, not even remotely: the R9 285 actually sees a performance penalty twice as bad as the GTX 960's. And because CD Projekt Red and AMD are not allowed to work together to identify and address the real culprit in the game code, we're back to square one.
So there you have it, folks: The Witcher 3 is a GPU-destroying monster. Hopefully we'll see performance improvements with additional patches, driver updates and, most importantly, DX12 support. Share your thoughts on the current situation in the comments below and stay tuned for our Witcher 3 review.