No, AMD’s FSR (FidelityFX Super Resolution) Is Not A DLSS Alternative, And Here Is Why You Should Care

Usman Pirzada

AMD recently rolled out an amazing new framework called FidelityFX Super Resolution (FSR) and in a stroke of genius and commitment to open standards, made it available for both Radeon and GeForce GPUs (editor's note: see the update at the end of the article). While the feature will certainly help breathe new life into older GPUs, it is not a DLSS alternative and cannot be - any attempt to hype it up as a DLSS competitor would hurt the brand more than it helps. I expect a lot of pushback on this article but it is important that AMD fans understand the facts of the comparison so they can call on AMD to deliver a true DLSS competitor eventually (bonus points if that is open source as well).

Understanding why AMD's FSR and NVIDIA's DLSS are completely different classes of upscaling and not a FreeSync versus GSync situation

Headlines like "AMD's answer to DLSS works on all GPUs" are currently everywhere - but there is one critical flaw: FSR and DLSS are not alike. NVIDIA DLSS is an upscaling system based on deep learning/inference and uses dedicated hardware to boost performance without any serious quality degradation - FSR is just a simple spatial upscale. The two classes of upscaling are not even remotely comparable; the difference is like resizing something in Photoshop versus resizing it in Gigapixel AI. The latter produces a far superior result.
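To make the distinction concrete, here is what a "simple spatial upscale" actually computes. This is an illustrative bilinear upscaler in plain Python - it is not AMD's actual FSR algorithm (FSR adds an edge-adaptive pass and sharpening), but it shows the defining property of the whole class: every output pixel is derived purely from nearby pixels of the same frame, so no detail that was lost at the lower resolution can ever be recovered.

```python
# Illustrative only: a naive bilinear spatial upscaler, NOT AMD's actual
# FSR implementation. Each output pixel is a weighted blend of the four
# nearest source pixels of the SAME frame - no history, no learned model.

def bilinear_upscale(img, out_w, out_h):
    """Upscale a 2D grid of grayscale values (list of lists) to out_w x out_h."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for y in range(out_h):
        # Map the output coordinate back into the source image.
        sy = y * (in_h - 1) / (out_h - 1) if out_h > 1 else 0
        y0 = int(sy)
        y1 = min(y0 + 1, in_h - 1)
        fy = sy - y0
        row = []
        for x in range(out_w):
            sx = x * (in_w - 1) / (out_w - 1) if out_w > 1 else 0
            x0 = int(sx)
            x1 = min(x0 + 1, in_w - 1)
            fx = sx - x0
            # Blend the four neighbouring samples.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out
```

Because the blend is just an average of neighbours, sharp edges get smeared - which is exactly the blurring visible in the FSR screenshots below.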

AMD's FSR implementation in Godfall is accompanied by extreme blurring that is visible even through YouTube's compression.

One of the most common comments I have seen online compares FSR to FreeSync and how FreeSync eventually crippled the market for NVIDIA's GSync technology. While the comparison is understandable, it is worth pointing out that this is not an apples-to-apples situation. FreeSync was a direct alternative to GSync and did more or less the same job without using a proprietary standard. It was a true open competitor to GSync that even NVIDIA eventually had to adopt. FSR and DLSS, on the other hand, couldn't be more different.

The blurring on the right is the telltale sign of a generic spatial upscaler.

While FSR is an algorithmic spatial upscaler, DLSS is a machine-learning system that infers a higher-quality image from lower-resolution input frames. It uses a temporal feedback loop and is vastly more complex than standard spatial upscalers, which have been around for decades in third-party plugins. To the discerning eye, or to someone with a computer science background, this makes all the difference in the world. Machine learning lets us achieve results that are computationally infeasible to reproduce with traditional software architectures. This is also why most of the successful self-driving and video/image-processing suites in the world are now AI/ML-based.
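The "temporal feedback loop" can be sketched in a few lines. The following is a crude TAA-style accumulation in Python - it is emphatically not NVIDIA's actual DLSS network, just an illustration of the principle: each new frame is blended into a running history, so detail accumulates across many frames instead of being guessed from a single frame the way a spatial upscaler must.

```python
# A crude sketch of temporal accumulation (the "temporal feedback loop"),
# NOT NVIDIA's actual DLSS pipeline. Real implementations also reproject
# the history using motion vectors; that step is omitted here for brevity.

def temporal_accumulate(history, current, alpha=0.1):
    """Blend the current frame into the running history (per-pixel EMA)."""
    if history is None:
        return [row[:] for row in current]  # first frame: no history yet
    return [
        [h * (1 - alpha) + c * alpha for h, c in zip(hrow, crow)]
        for hrow, crow in zip(history, current)
    ]
```

Fed a stream of jittered low-resolution samples, this running average converges toward the true signal over successive frames - information a single-frame spatial filter simply does not have access to.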

The hint is in the name "DL"SS, Deep Learning Super Sampling

FSR does not use any machine learning or inference, and while it is a great tool to have in the absence of a DL system, it is not comparable in any way to an AI-powered image upscaling system. The former will always carry a quality cost, while the latter can eventually reach a point where it is impossible to tell native and AI-upscaled images apart. With the non-DL implementation AMD has rolled out as FSR, you are looking at quality worse than DLSS 1.0 even on the highest preset; the performance presets degrade quality even further.

A comparison of DLSS variants vs native, taken from YouTube. Notice that DLSS 1.0 is far more palatable than AMD FSR and shows very little blurring. (source)

To sum it up:

  • DLSS uses AI and a temporal feedback loop to deliver near-native image quality in both still frames and motion.
  • FSR is a simple spatial upscaler with no AI and no temporal feedback, so it cannot maintain high quality in motion.
  • AMD's own comparison reveals a material reduction in image quality versus native resolution - and if you can see it through YouTube's compression algorithm, it will be even more visible in an uncompressed format.

Gamers need to push AMD to bring out their own DL/AI-based upscaling system, bonus points if that is open source as well

An old, never settle marketing slide from AMD.

If AMD fans accept FSR as it is, they are doing themselves a grave disservice. Radeon owners deserve a true competitor to DLSS: one based on DL/ML that uses AI inference to upscale the input images. Without such an implementation, FSR will never be able to compete with the likes of NVIDIA DLSS. A common counterargument is that AMD hardware lacks dedicated tensor cores for AI workloads, but considering its GPUs have excellent compute, general-purpose computing frameworks can run inference for a deep-learning-based image-enhancement system. It is definitely possible, should AMD want it.
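The point about tensor cores deserves a line of illustration. At its core, neural-network inference is just multiply-accumulate arithmetic; the sketch below shows a 3x3 convolution - the basic building block of image-enhancement networks - in plain loops. Nothing here requires dedicated AI hardware; tensor cores merely accelerate the same arithmetic, which is why inference can also run on general-purpose GPU compute (or even a CPU).

```python
# A minimal sketch: NN inference reduces to multiply-accumulate math.
# This 3x3 convolution (valid padding, pure Python) needs no special
# hardware; tensor cores speed up the same operation but are not required.

def conv3x3(img, kernel):
    """Apply a 3x3 kernel to a 2D image, returning the valid region."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h - 2):
        row = []
        for x in range(w - 2):
            acc = 0.0
            for ky in range(3):
                for kx in range(3):
                    acc += img[y + ky][x + kx] * kernel[ky][kx]
            row.append(acc)
        out.append(row)
    return out
```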

Another example showing the massive blurring due to the use of a spatial upscaler.

We love AMD. I was one of the first journalists in the world to cover Zen and one of the first to say AMD would become a massive threat to Intel, and I defended the company when the WSJ did a shoddy hack job about it selling IP to China - but now I will also step in and point out that this is not good enough. Radeon fans deserve an AMD deep-learning upscaling system, and they should not have to "settle" (pun intended) for the current implementation of FSR.

Update 6/2/2021

AMD's Scott Herkelman has stated that the company has no intention of optimizing FSR for NVIDIA GPUs and that NVIDIA should do that work itself. While that would be a completely reasonable expectation under normal circumstances, the fact that AMD touted NVIDIA support, absorbed a ton of good press for it, and is now effectively backtracking makes this look like a bait-and-switch. It also implies that FSR for NVIDIA users will be optimized only for Godfall unless NVIDIA chooses to adopt the technology (which, in my opinion, it absolutely should for non-RTX cards).
