With Pascal and Polaris on the cusp of launch, it may be hard to understand why we would be looking at how two mid-range 1080p cards handle things today. It's easy: a lot of people are building or upgrading their PCs and simply don't feel like waiting, or simply need something today. So to that end we'll be taking a look at how the R9 380 from XFX and the GTX 960 from EVGA stack up at 1080p, from frame rates to software offerings.
AMD Radeon R9 380 - XFX Triple X Overclocked Edition
Before we jump into the comparisons, let's take a quick look at the video cards in question. First up is the XFX R9 380 XXX OC ($219.99 USD at Newegg). The R9 380 features the Antigua/Tonga core derived from the R9 285, with a bit of work put into it such as doubling the VRAM to 4GB from the 285's 2GB. The retooled GPU core still features 1792 Stream Processors, and this one comes clocked at 990MHz. The 4GB of VRAM rides on Antigua's 256-bit bus and is clocked at 5700MHz effective. Display I/O consists of DVI-I, DVI-D, DP 1.2, and HDMI 1.4a. Cooling is handled by XFX's Double Dissipation system with 4 nickel-plated copper heatpipes and two 92mm fans. The 380 carries a TDP rating of 190W and requires two 6-pin connectors.
The EVGA GTX 960 SSC ($219.99 USD at Newegg) has quite different specs from its counterpart, featuring the Maxwell architecture that was all new for the 900 series. It was originally released only in 2GB VRAM variants, but for the sake of these tests we made sure to get a 4GB model for parity. The GTX 960 features 1024 CUDA cores and 4GB of VRAM on a 128-bit bus operating at 7010MHz. Being an overclocked card, just like our R9 380, its core clock has been raised from the stock 1127/1178MHz to a 1279MHz base with a 1342MHz boost (although under load we saw it hold 1455MHz at all times). With these frequency bumps, the TDP has increased as well, from 120W to 'up to' 160W.
Nvidia GeForce GTX 960 - EVGA Super Superclocked Edition
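On paper the biggest gap between the two is memory bandwidth: peak bandwidth is simply bus width (in bytes) multiplied by the effective memory clock. A quick sketch of that arithmetic, using the bus widths and memory clocks quoted above (the helper function itself is just illustrative):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective memory clock).
# Card specs are from the review above; the helper is illustrative only.

def peak_bandwidth_gbps(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

r9_380 = peak_bandwidth_gbps(256, 5700)   # 256-bit bus, 5700MHz effective
gtx_960 = peak_bandwidth_gbps(128, 7010)  # 128-bit bus, 7010MHz effective
print(f"R9 380:  {r9_380:.1f} GB/s")   # 182.4 GB/s
print(f"GTX 960: {gtx_960:.2f} GB/s")  # 112.16 GB/s
```

Worth noting that Maxwell's delta color compression partly offsets the 960's raw deficit here, so the 128-bit bus isn't as limiting in practice as the numbers alone suggest.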
It's clear that both of these cards are very different in their specifications, but how does all that pan out in the real world? Time to run the cards through their paces on our system set up for 1080p gaming.
|CPU||Intel Core i5 6400 (4.6GHz)|
|Motherboard||ASRock Z170 Extreme 6 (Thanks ASRock)|
|Power Supply||BitFenix Fury 550G (Thanks BitFenix)|
|SSD||Crucial MX100 512GB|
|Storage Disk||Seagate 4TB SSHD (Thanks Seagate)|
|Memory||16GB DDR4 G.Skill Trident Z 2800MHz|
|Monitor||Nixeus Vue24A (Thanks Nixeus)|
|Video Cards||XFX R9 380 (Thanks XFX) Crimson 16.3.2, EVGA GTX 960 SSC GeForce 364.51|
|Operating System||Windows 10 Pro 64-Bit|
DiRT Rally

We ran DiRT Rally using the in-game benchmark at 1080p with the Ultra preset and 2xMSAA. It's fair to say there is little performance difference between the two GPUs in this game.
Middle Earth: Shadow of Mordor
We ran ME:SoM using the in-game benchmark at 1080p with the Ultra preset. In this title the 380 pulled ahead by a fair margin and was able to keep the average at 60FPS.
Batman: Arkham Origins
As much as I would have loved to have been using Arkham Knight, I feel it's a moot point. We did run B:AO using the in-game benchmark at 1080p with the highest settings (PhysX disabled) and 2xMSAA. The 380 may have enjoyed a higher average, but it also saw much lower minimums, making the 960 the better choice for this title.
Hitman: Absolution

We would have run the new Hitman game, but we are waiting for the full game to be released before making the jump. That aside, Hitman: Absolution was run using the in-game benchmark at 1080p with the Ultra preset and 2xMSAA. While these two are close, it's clear the 380 is a bit better in this game in terms of both AVG and MIN.
Metro Last Light
Metro Last Light was run using the game's benchmark utility set to 1080p and Very High details, with SSAA and Advanced PhysX left unchecked and Tessellation set to Normal.
Rise of the Tomb Raider
Rise of the Tomb Raider is the first really fun one on this list because we get to run it in DX11 and then again in DX12 mode. We ran it using the in-game benchmark at 1080p with the High preset, as Very High issued a VRAM warning. The benchmark reports a single average FPS across all three test scenes but a separate minimum for each, so our reported minimum is the average of those three, just as the average is. It's clear that in this game either there's no benefit from asynchronous compute engines or the drivers haven't matured enough to take advantage of them. Either way, the GTX 960 walks away with the win here, especially in DX12.
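For clarity, this is how the benchmark's three per-scene minimums fold into the single figure we report; the FPS values below are hypothetical placeholders, not our measured results:

```python
# RotTR's benchmark reports one overall average but a separate minimum per
# scene, so we average the three minimums ourselves. These FPS values are
# hypothetical placeholders, not measured results from the review.

from statistics import mean

scene_minimums = [42.0, 38.0, 46.0]  # one minimum per benchmark scene
reported_min = mean(scene_minimums)
print(f"Reported minimum: {reported_min:.1f} FPS")  # 42.0 FPS
```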
Fallout 4

Fallout 4 was run at 1080p with the Ultra preset (GameWorks features disabled) on a set course from the exit of the initial vault, through Sanctuary, down to the Red Rocket, and through Concord, finishing in the church next to the museum. Even though both cards came in fairly close, I was surprised to see the 380 in the lead on this one.
Battlefield 4

Love it or hate it, there are still a lot of people playing BF4, so we ran the game through the opening jeep ride of the campaign mission Tashgar at 1080p with the Ultra preset. These results are close enough to say you can't go wrong with either if BF4 is your main squeeze.
Star Wars Battlefront
SW:BF grew out of the engine that powers BF4 but is still worth a look. We took our results from playing through the first 5 waves of the Hoth Survival mission with the game set to Ultra. Something to consider with the results here: even though the 380 has the much more attractive AVG, its MIN is a bit lower than the 960's. That kind of swing can lead to inconsistent, noticeable judder if you're not using an adaptive sync monitor.
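To see why that lower minimum matters more than the average suggests, convert FPS to frame time; judder lives in frame-time spikes, not averages. The AVG/MIN values below are hypothetical, purely to illustrate the math:

```python
# Frame time grows as 1/FPS, so a dip in minimum FPS means a disproportionate
# spike in frame time. The AVG/MIN numbers here are hypothetical examples.

def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

avg_fps, min_fps = 70.0, 35.0  # hypothetical AVG and MIN for a card
print(f"At AVG: {frame_time_ms(avg_fps):.1f} ms per frame")  # 14.3 ms
print(f"At MIN: {frame_time_ms(min_fps):.1f} ms per frame")  # 28.6 ms
# A frame taking twice as long as its neighbors reads as visible judder
# unless an adaptive sync monitor absorbs the swing.
```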
Grand Theft Auto V
I'm not sure there is anyone left in PC gaming who doesn't have this game at this point. It was run using the in-game benchmark at 1080p with no FXAA but 2xMSAA, all settings set to Very High, and Advanced settings left alone. For shadows, each GPU got its respective shadow tech. This one is interesting because neither the AVG FPS nor the MIN FPS really shows it, but on the 380 we continue to see hitching where the screen pauses intermittently. This is not present on the 960, so for that alone the 960 walks away with a win here.
Ashes of the Singularity
Now that AotS is out of beta and has been officially launched, it has been added to the benchmark suite. Much like RotTR, this game excites us because we now have two released DX12 games to keep in our lineup. Running AotS at 1080p with the Standard preset and 2xMSAA, we used the in-game benchmark to get our results. The 380 takes a considerable lead here in both DX11 and DX12, as this game does take advantage of the ACEs that Radeon has integrated.
Unigine Heaven

We couldn't resist having at least one synthetic benchmark in the lineup. Running Heaven at the Extreme preset set to 1080p resulted in the 380 having marginally higher MIN and AVG figures.
Thermals, Noise, and Power Draw
Thermals and Noise
Thermal and noise results were gathered at the same time: the idle reading was taken on a vacant desktop, and the load reading was taken after letting the GPU's temperature equalize while running Unigine Heaven for some time.
Power Draw

Using a Kill-A-Watt power meter, we took the idle power draw after the OS loaded with nothing extra running in the background. The load measurement was taken at peak power draw while the Fire Strike combined test was running, with all CPU cores and the GPU at 100% load.
After finding the idle power draw of the R9 380 to be much higher than it should have been, we discovered that with the monitor used (Nixeus VUE24A) set to 144Hz, the memory clock would run at full 3D speed and not idle down, resulting in the higher idle draw. Setting the monitor to 120Hz fixed this, so we took new measurements and adjusted the charts. However, we wanted to leave the original results in to show the difference between a standard monitor's power draw and a 144Hz monitor's.
Gameplay Recording

In addition to frame rates, thermals, noise, and power draw, something becoming increasingly important to gamers is recording gameplay. Whether it's for streaming, making YouTube videos, or just sharing clips with friends, being able to easily capture the desktop and gameplay makes the total experience much better for the end user. I will give Radeon credit for coming as far as they have with the Raptr and Plays.TV applications, but they simply aren't as well rounded and user friendly as the GeForce Experience and ShadowPlay package.
Conclusion

I started this with the assumption that the R9 380 was going to be the clear winner, and for the most part it is the better bang for the buck. The GTX 960 has its own strong points in power draw and feature set, and it wasn't far enough behind the R9 380 to be written off as a bad buy. It's really hard to call an overall winner here; my suggestion is to buy the one that best plays the games you actually play and comes in at the better price. Both of these cards handle 1080p well enough, so pick your team, pick your style, and you won't be disappointed either way.