[DSR Update] [Review] SLI Benchmarks! Titan X SLI Performance Review
[Update] I’ve added DSR benchmarks at the end of the review; you can now see a performance chart for a scene rendered internally at 4K and output on a 2K (1440p) monitor.
Perhaps the only thing more anticipated than the Titan X itself is putting two Titan Xs together. It’s impossible to truly predict the effect SLI will have and just how well it’ll scale in any particular game, but thankfully we have two Titan Xs on hand to test.
We’ve already presented plenty of information on the specifics and technical details of the GM200 Maxwell architecture, so we’ll skip that and get right into the juicy bits: the wide and wonderful world of Titan X SLI performance numbers. Because that’s why you’re reading this, right? To see how well two of these monsters do together?
Titan X SLI Performance Review
As a single graphics card it absolutely dominates the competition at a similar power threshold, and within an acceptable temperature range (acceptable is, of course, open to interpretation, but I didn’t think it ran hot enough to be concerning). The lack of FP64 hardware doesn’t affect its gaming credentials at all. In fact, it likely helps keep the cost and the power envelope down so that we mere mortals can actually enjoy it.
But now we’re combining two of these beastly things to see what kind of performance delta can be achieved. Does SLI scale as well with GM200 as it does with the other Maxwell-based cards? Will temperatures get worse with two of these things stuffed close together? Interesting and very applicable questions.
Test Setup, the times they are a-changin’
Unfortunately my mini-ITX test box, lacking a second PCIe slot, obviously couldn’t accommodate another Titan X. So for this portion of the review I’ve had to make a few changes: I’ve moved to a Haswell-E platform, an i7-5960X. Because it’s still Haswell, the change in chip itself shouldn’t have a huge impact on benchmarks compared to the Xeon previously used. The boost clock speed is about the same, so actual performance in games will be similar, though titles coded to use more than 8 threads will obviously benefit from the extra threads available. For those, please suspend your disbelief for a moment, if you could.
| Component | Details |
|---|---|
| CPU | Intel i7-5960X @ 3.0GHz |
| Motherboard | Gigabyte GA-X99M Gaming 5 mATX |
| Power Supply | Corsair AX1200i 1200W Platinum |
| Hard Disk | Samsung PM851 512GB |
| Storage Disk | Samsung 840 Pro 512GB |
| Memory | Corsair Vengeance LPX 16GB 2666MHz |
| Monitor | BenQ BL2710PT 27″ WQHD |
| Video Cards | GeForce Titan X SLI, AMD R9 295X2 |
| Drivers | NVIDIA 347.88, AMD Catalyst 14.12 Omega (corrected from an earlier version) |
| Operating System | Windows 8.1 Pro |
A note: why WQHD? I believe WQHD is the most accessible resolution for gamers. It’s far less expensive to enter the WQHD market, with a plethora of models, from poor to great, available at very decent prices. 4K is fantastic but a bit unrealistic and unattainable due to price and availability; WQHD is a more realistic representation of what’s available and what you’re likely to be running. I also don’t feel there is a true benefit in most games to running at 4K; properly implemented AA can perform far better than simply increasing the resolution. Besides, I don’t have a 4K monitor, so I cannot test at that resolution.
The settings are exactly the same as before, no changes whatsoever. I’ve added three new, rather complex games that should show off the potential of running two Titan Xs in SLI: Star Citizen, MechWarrior Online and Sniper Elite III. Sadly, the R9 295X2 had to be returned to its rightful owner before I could benchmark those particular titles.
Battlefield 4 scales quite well with SLI, but we’ve long known about the superior SLI, and even CrossFire, performance it can deliver. It yields a 45.5% increase in performance. Astounding, really.
Not surprising, given that Battlefield Hardline uses a slightly modified Frostbite 3 engine. The SLI scaling is, again, rather amazing, with a 53.5% increase in performance. It just makes the R9 295X2 look kind of sad.
I really didn’t expect this result from Crysis 3 at all, but it appears it’s no longer the champion of destroying your system in a benchmark. A 73.7% increase in performance is no laughing matter.
Dragon Age: Inquisition
Dragon Age: Inquisition also shows good SLI scaling. The test run has a lot of enemies and other things happening on screen, but we still see a 76.3% increase in performance here.
Middle Earth: Shadow of Mordor
This one was the wild card, and I didn’t really know what to expect from it. The modified LithTech engine is an almost unknown quantity, but it still managed a healthy gain in performance. 25.7% is still an increase!
Civilization Beyond Earth
A 58% increase over a single card here. That’s incredible for an already smooth, quick-running game.
Sniper Elite III
Sniper Elite III, despite being a console port, is a great example of a custom engine. It even scales particularly well, with an 80.4% increase in performance.
Star Citizen is still an alpha build, and the following performance is not representative of the final product, nor necessarily of what you’ll achieve yourself given its alpha status.
It would appear that the Titan X scales fairly well in SLI in these particular games. We all know that SLI depends on the driver as much as on the game: if an SLI profile doesn’t exist and the game itself hasn’t been programmed to take advantage of SLI, then you’re likely not going to see much of a performance boost, if any. Thankfully there are more SLI-enabled games than CrossFire-enabled games at the moment, and certain engines seem to support it far more readily than others.
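For reference, the scaling percentages quoted throughout are simply the dual-card frame rate relative to the single-card frame rate. A quick sketch of that arithmetic (the FPS values here are hypothetical, not the review’s actual numbers):

```python
def sli_scaling(single_fps: float, sli_fps: float) -> float:
    """Percentage performance increase of the SLI pair over a single card."""
    return (sli_fps / single_fps - 1.0) * 100.0

# Hypothetical example: a single card at 60 FPS, the SLI pair at 87.3 FPS
print(round(sli_scaling(60.0, 87.3), 1))  # 45.5 (percent increase)
```

Perfect scaling would be 100%, so figures in the 70–80% range, as seen in Crysis 3 and Sniper Elite III, are very strong results.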
The most surprising result is that of Shadow of Mordor. I would love to know whether SLI is properly enabled and available in the game. It’s a bit of a wildcard due to its modified LithTech engine, with unknown changes under the hood. Nonetheless, there is a benefit, just not much of one.
The other surprise? Crysis 3. With two Titan Xs in SLI under the hood, we no longer have to worry about Crysis 3 bringing a system to its knees. As amazing as that game’s visuals are, it no longer threatens to burn down our rigs. So rejoice!
What I was most worried about going into this was temperatures. Traditionally a blower design is more efficient at keeping a closely packed dual-card setup cool than other air coolers (barring liquid cooling, of course), but this card runs a bit hot. Moving to the new system reduced temperatures for one card while keeping the other just as toasty as before. HWiNFO64 was used to capture the data provided below.
Temps in a Battlefield Hardline run
Check that out! Not too shabby, really, and those are the highest temperatures recorded in the run. Another question that comes to mind is whether clock speed has to be sacrificed to hold those temperatures; if the chip gets a bit too hot, it will throttle to prevent damage. Oddly, and this is very odd indeed, the charts below mirror one another exactly.
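One way to check for throttling yourself is to export a sensor log from HWiNFO64 as CSV and compare each GPU’s peak temperature against its lowest core clock over the run. A minimal sketch, assuming a CSV log; the column names below are hypothetical and would need to match your actual log:

```python
import csv

def max_temp_min_clock(path: str, temp_col: str, clock_col: str):
    """Return (max temperature, min core clock) seen in a sensor CSV log.

    A throttling card shows its clock dipping as temperature peaks.
    Column names vary between HWiNFO64 versions, so adjust them to
    whatever headers appear in your exported log.
    """
    temps, clocks = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                temps.append(float(row[temp_col]))
                clocks.append(float(row[clock_col]))
            except (KeyError, ValueError):
                continue  # skip malformed or non-numeric rows
    return max(temps), min(clocks)
```

If the minimum clock stays at the card’s rated boost while the maximum temperature sits at its target, the card held its clocks for the whole run, which matches what the mirrored charts suggest here.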
The fans, however, do indeed seem to be working near their maximum speed to deliver those nice temperatures. And yes, the cards are hot, definitely toasty compared to the cool-running GM204 and GM206, but nowhere near this chip’s TJ Max, so we’re absolutely safe at those temperatures. The fans do get a bit loud, but they aren’t much of a problem for anyone wearing headphones; I never noticed them at all, in fact.
GPU1 is the inner GPU, oddly. Usually it’s GPU0, but I digress.
And unfortunately for now we must conclude this test
Does this mean that $2000 of hardware is truly worth it for these framerates? Probably not, unless you’re of the mind that you need the absolute best in technology and the highest of framerates. For everyone else? It’s a tad expensive. No, it’s very expensive. But the upside is that it’s currently both the single-card and the dual-card setup to beat.
My graphics cards have more RAM together than my system does…
It’s fun to have such wonderful cards at your disposal, but for the price it isn’t exactly worth it, even with the relative performance per watt these two suckers deliver. This is more for bragging rights than anything else. The performance is certainly there, in spades, and it won’t disappoint on that front, but I don’t think you’ll feel you’ve gotten your money’s worth either.
But then again, the Titan was never intended to be for those looking for a good value either.
[Update] DSR Benchmarks
Now we have some DSR benchmarks to look at. These can roughly represent what 4K performance looks like: the game is rendered internally at 4K and downsampled to the monitor’s 2K (1440p) resolution. There is obviously some additional overhead from the downsampling itself, but performance should be rather similar to native 4K.
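For context, NVIDIA expresses DSR settings as multiples of the native pixel count, which is why rendering 4K on a WQHD panel corresponds to the 2.25x factor. A quick check of that arithmetic (the helper function itself is just illustrative):

```python
def dsr_factor(native: tuple[int, int], target: tuple[int, int]) -> float:
    """DSR factor = internal render pixel count / native pixel count."""
    nw, nh = native
    tw, th = target
    return (tw * th) / (nw * nh)

# Rendering 3840x2160 internally on a 2560x1440 panel:
print(dsr_factor((2560, 1440), (3840, 2160)))  # 2.25
```

That 2.25x jump in pixels rendered is the main reason for the performance gap between the 1440p and DSR-2160p results below.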
In addition, let’s take a look at the performance difference between 2160p and 1440p with 8X MSAA in the mix, along with some comparison screenshots to see whether there is a real visual benefit.
That’s quite the performance difference between the resolutions, at least within DSR. So does it actually translate into a better, sharper-looking game? Let’s explore.
These are some screenshots taken in game to illustrate any differences DSR, and even 8X MSAA, might bring. Are the differences discernible in game? Probably not, especially while actually playing, but if you enjoy stopping to scrutinize each and every blade of grass and texture, you’re likely to find some. There is, however, a difference in how the game feels: less responsive and a bit slower.