Assassin’s Creed Odyssey PC Performance Explored


Today is all about Assassin's Creed Odyssey and its PC port. Assassin's Creed hasn't exactly had the best history on the PC platform.  In its early days, paired with a shaky-at-best Uplay, it wasn't uncommon for your game saves to go missing unexpectedly.  Later down the road, Ubisoft partnered with NVIDIA to bring some new features through GameWorks, and that wasn't too well received either, as around that time it was faces that went missing along with the save files.  Origins saw a new style of gameplay and an insane demand on the CPU, and GeForce cards took a commanding lead at launch.  This time around Ubisoft has partnered with AMD, even putting this game in the Radeon Rewards bundle.  It should be interesting to see how things have progressed since Origins.

We set out a bit more ambitiously on this title than we have in the past, hoping to really round out the testing, but Denuvo had different plans.  After getting locked out of our retail-purchased copies many times, we had to scale back some of the testing.  The idea was to bring different platforms into the fold and see how performance looked across them, giving more people an idea of what to expect from a platform similar to theirs.  We wanted to look at graphics cards as usual, but also preset scaling as well as core scaling across both AMD and Intel platforms.  This look at Assassin's Creed Odyssey is what you can expect to see more of as we move forward, and no, our GTX 1080 Ti has not made it to this office yet.


Testing Methodology

Running through the tests for Assassin's Creed Odyssey was pretty straightforward, as the in-game benchmark was representative of the performance I saw after about an hour and a half of gameplay.  It did, however, have one glaring catch that slowed things down considerably: who in the world puts dynamic weather patterns in a canned benchmark? We made sure to only take results from runs with clear weather, as rain introduced a wild variable; the clouds were sporadic and impacted performance a bit.  We used FRAPS to capture the time of every frame rather than relying on the in-game benchmark to report averages, since we needed the 1% and 0.1% low metrics that the game won't give.  Something to note on the in-game results: they were higher than what FRAPS reported, and the benchmark also appeared to be culling results, giving artificially higher numbers than what you are actually getting.

As far as settings go, we explored the different presets and found Odyssey to be very reminiscent of Deus Ex: Mankind Divided in terms of how demanding each setting is.  Rather than going for the 'compare everything at Ultra' approach, and since others have already tested the highest settings, we went with the 'High' preset, as it's more likely what people are actually going to choose.  'Low' and 'Medium' have clear visual disadvantages, while 'Very High' and 'Ultra High' offer very little over 'High' but come with a massive drop-off in performance.
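For anyone curious how those low metrics work, here's a minimal sketch (our own illustration, not FRAPS code or the exact method used here): the 1% and 0.1% lows are the average frame rate over the slowest 1% (or 0.1%) of all captured frames, which is why a benchmark that culls bad frames can make them look better than they are.

```python
def percentile_low(frame_times_ms, fraction):
    """Average FPS over the slowest `fraction` of frames.

    frame_times_ms: per-frame render times in milliseconds
                    (e.g. a FRAPS frametimes capture).
    fraction: 0.01 for the 1% low, 0.001 for the 0.1% low.
    """
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * fraction))        # how many frames that is
    avg_ms = sum(worst[:n]) / n                   # mean time of the slowest frames
    return 1000.0 / avg_ms                        # convert mean frame time to FPS
```

For example, a capture of 99 frames at 10 ms plus a single 50 ms hitch averages roughly 96 FPS, but the 1% low is only 20 FPS, which is exactly the kind of stutter an average hides.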

Test Systems

Component | AMD System | Intel System
CPU | Ryzen 7 1700 @ 4GHz | Core i5 8600K @ 5GHz
Memory | 16GB G.Skill Flare X DDR4 3200 | 16GB GeIL EVO X DDR4 3200
Motherboard | MSI X370 XPower Gaming Titanium | EVGA Z370 Classified K
Storage | ADATA SU800 128GB + 2TB Seagate SSHD | ADATA SU800 128GB + 2TB Seagate SSHD
PSU | Cooler Master V1200 Platinum | Cooler Master V1200 Platinum

Graphics Cards Tested

GPU | Architecture | Core Count | Clock Speed (MHz) | Memory Capacity | Memory Speed
NVIDIA RTX 2080 Ti | Turing | 4352 | 1350/1635 | 11GB GDDR6 | 14Gbps
NVIDIA RTX 2080 FE | Turing | 2944 | 1515/1800 | 8GB GDDR6 | 14Gbps
NVIDIA GTX 1080 FE | Pascal | 2560 | 1607/1733 | 8GB GDDR5X | 10Gbps
NVIDIA GTX 1070 FE | Pascal | 1920 | 1506/1683 | 8GB GDDR5 | 8Gbps
NVIDIA GTX 1060 FE 6GB | Pascal | 1280 | 1506/1708 | 6GB GDDR5 | 8Gbps
XLR8 GTX 1060 3GB | Pascal | 1152 | 1506/1708 | 3GB GDDR5 | 8Gbps
AMD RX Vega 64 | Vega 10 | 4096 | 1247/1546 | 8GB HBM2 | 945Mbps
XFX RX Vega 56 | Vega 10 | 3584 | 1156/1471 | 8GB HBM2 | 800Mbps
XFX RX 480 8GB | Polaris 10 | 2304 | 1266 | 8GB GDDR5 | 8Gbps
Sapphire RX 570 Nitro+ 4GB | Polaris 20 | 2048 | 1340 | 4GB GDDR5 | 7Gbps

Drivers Used

Radeon Settings 18.9.3

Preset Scaling At 4K

We wanted to start things off by seeing how the game scales across the built-in presets at 4K with the i5 8600K and RTX 2080 Ti, so that we could get a real look at the impact and decide which preset we were going to use.  Again, 'Low' and 'Medium' offer up substantial performance increases but at a noticeable hit to image quality, especially evident in the more open areas.

Ryzen 7 1700 Core Scaling


Core scaling on Ryzen appears to be a similar song and dance to what we've seen before, showing gains all the way up through 6 cores and 12 threads, but this time around we still see gains when moving to the 8-core, 16-thread configuration.  Because Denuvo treats a change in core and thread count as a 'new' system, this was one of the tests limited by that wonderful DRM lockout we kept experiencing, so we kept the configurations limited to typical Ryzen core counts.

Core i5 8600K Core Scaling

Only having the Core i5 8400 and Core i5 8600K on hand limited our testing of the Coffee Lake CPUs to straight core counts.  Seeing how this game scaled with available cores, I was a little concerned at this point that the hex-core would struggle, but the high clocks on the 8600K make up for what it lacks in Hyper-Threading: the 6-core configuration outpaces the 6-core, 12-thread Ryzen, and a 4-core configuration matches the 4-core, 8-thread Ryzen configuration.  Also, stay home, dual cores without Hyper-Threading; nothing to see here.

8600K vs 1700

But what about the Ryzen 7 1700 (4GHz) put up against the Core i5 8600K (5GHz) when both are paired with the RTX 2080 Ti? It looks like a pretty solid competition, with both trading blows and neither coming out the clear winner.  The 8600K edges out the 0.1% lows at 1080p but loses that edge at 4K.

Well, it looks a lot like Assassin's Creed Origins all over again when it comes to performance.  While Radeon is behind again, it's at least not a stutter fest, so I guess that's a positive.  The CPU is still getting pounded to the point that there's a clear and present bottleneck at 1080p, and even then the frame rates aren't anything to write home about.  Having to drop to 1080p High and pair it with (I know we don't have one, but I think we can all surmise) at least a GTX 1080 Ti to keep a constant 60 FPS or better is far from anything anyone should be happy about.  This would be understandable if this game could claim to be 'The New Crysis', but it simply isn't.

Now, with that rant out of the way: since this game isn't a twitch shooter or a game that necessarily benefits from higher frame rates beyond feeling smoother, if you're staying over 45 FPS on a variable refresh rate monitor it's going to be okay.  The game plays well enough around the 45 FPS level.  I know how that sounds, and it's not the coveted 60 FPS barrier, but it is okay.  The caveat is that you're going to want a variable refresh monitor, or to grab a controller and turn on VSYNC, because this engine exhibits insane levels of tearing otherwise.  This appears to be another game that is actually good on its own but will be remembered more for its uninspiring performance than its actual gameplay on the PC.  It might be time for Ubisoft to sit back, take the Shadow of the Tomb Raider approach, and see what DX12 could do for them; it did wonders for the Tomb Raider franchise on PC.