NVIDIA DLSS 2.0 Revealed – 2x Faster AI Model, Quicker Game Integration, Unreal Engine 4 Support
NVIDIA DLSS 2.0, a major update in the AI-powered Deep Learning Super Sampling technology, was discussed in a recent NVIDIA press briefing attended (remotely, of course) by Wccftech. Let's check what it's all about in detail.
DLSS 2.0 Features
● Superior Image Quality - DLSS 2.0 offers native resolution image quality using half the pixels. It employs new temporal accumulation techniques for sharper image details and improved stability from frame to frame.
● Customizable Options - DLSS 2.0 offers users three image quality modes (Quality, Balanced, Performance) that control render resolution, with Performance mode now enabling up to a 4X super resolution.
● Great Scaling Across All RTX GPUs and Resolutions - a new, faster AI model more efficiently uses Tensor Cores to execute 2X faster than the original, improving frame rates and removing restrictions on supported GPUs, settings, and resolutions.
● One Network for All Games - While the original DLSS required per-game training, DLSS 2.0 offers a generalized AI network that removes the need to train for each specific game. This means faster game integrations and more DLSS titles.
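To make the quality modes above concrete, here's a minimal sketch of how an output resolution maps to an internal render resolution. Note that only the Performance figure (2x per axis, i.e. 4x the pixels, 1080p to 4K) comes from this article; the Quality and Balanced per-axis factors are illustrative assumptions, not NVIDIA's published numbers.

```python
# Sketch: mapping a DLSS-style quality mode to an internal render
# resolution. Performance's 2.0x-per-axis factor follows from the 4X
# super resolution described in the article; the other factors are
# assumptions for illustration only.
MODE_AXIS_SCALE = {
    "Quality": 1.5,       # assumed per-axis upscale factor
    "Balanced": 1.72,     # assumed per-axis upscale factor
    "Performance": 2.0,   # 2x per axis -> 4x the pixels (1080p -> 4K)
}

def render_resolution(output_w, output_h, mode):
    """Return the resolution the GPU would actually render internally."""
    s = MODE_AXIS_SCALE[mode]
    return round(output_w / s), round(output_h / s)

print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

Fewer internally rendered pixels is where the performance headroom comes from; the reconstruction network is then responsible for recovering the detail of the full output resolution.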
Russ Bullock, President at Piranha Games:
NVIDIA DLSS 2.0 basically gives our players a free performance boost, without sacrificing image quality. It was also super easy to implement with NVIDIA’s new SDK, so it was a no brainer for us to add it to MechWarrior 5.
Mika Vehkala, Director of Technology at Remedy Entertainment:
With Control, we set out to create a visually stunning and immersive world. Real-time ray tracing and NVIDIA DLSS made Control look amazing at launch, and upgrading to DLSS 2.0 made the game’s image quality better than ever.
Koen Deetman, CEO and Game Director at KeokeN Interactive:
We never expected the image quality in Deliver Us The Moon to increase when DLSS was enabled, but that is exactly what we experienced. This coupled with real-time ray tracing, and the huge performance boost from DLSS, gives our players the ultimate Deliver Us The Moon experience without any compromises.
Jim Kjellin, CTO at MachineGames:
It was critically important to us that our game stand out both visually and from a performance perspective. Adding NVIDIA DLSS enabled us to get the best of both worlds: maximum performance with incredible image quality.
First of all, the improved AI network is now using Tensor Cores in a more efficient way, which makes it much faster - about twice as fast, in fact. The performance tradeoff of DLSS 1.0 at low resolutions wasn't good, but the DLSS 2.0 network makes it a lot more viable across a wide range of GPUs and resolutions.
Secondly, while first-generation DLSS targeted a 2x boost in pixels, DLSS 2.0 can go up to 4x, effectively delivering a reconstructed 4K image from a base 1080p image. It also benefits from temporal feedback, accumulating data over time and using multiple frames and motion vectors to generate the output frame. This results in higher quality, as the network itself has more data that is temporally processed, and it also allows higher scaling.
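The temporal feedback idea can be sketched in a few lines: reproject the accumulated history with per-pixel motion vectors, then blend it with the current frame. This toy one-dimensional example is only an illustration of the accumulation concept; actual DLSS 2.0 feeds this kind of multi-frame data into a neural network rather than using a fixed blend.

```python
import numpy as np

def accumulate(history, current, motion, alpha=0.1):
    """Toy temporal accumulation on 1-D 'images'.

    history, current: float arrays; motion: integer per-pixel offsets.
    The blend weight alpha is an arbitrary illustrative choice.
    """
    n = len(current)
    # Reproject: fetch each pixel's history from where it came from,
    # using the motion vectors (clamped at the borders).
    src = np.clip(np.arange(n) - motion, 0, n - 1)
    reprojected = history[src]
    # Blend: mostly history (stability) plus a little of the new frame.
    return (1 - alpha) * reprojected + alpha * current

history = np.zeros(8)
frame = np.ones(8)                # the "true" signal each frame
motion = np.zeros(8, dtype=int)   # static scene, no motion
for _ in range(20):
    history = accumulate(history, frame, motion)
print(history.round(2))           # converges toward the true signal (1.0)
```

Because each output pixel effectively draws on samples from many past frames, the accumulated result contains more information than any single rendered frame, which is what enables the higher scaling factors.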
Arguably the most interesting change, though, is the generalized network: the new 'fully synthetic training set' means the AI no longer has to be trained specifically for each game implementing Deep Learning Super Sampling. Not only will this translate into much faster adoption among developers, it also matters because many games aren't 'deterministic', meaning two runs of an identical scene aren't exactly the same. Some of those generated frames therefore weren't valid for training purposes, making it harder to gather usable training data for DLSS.
Alongside the DLSS 2.0 architectural improvements, NVIDIA also confirmed two more games getting the feature (in addition to Wolfenstein: Youngblood and Deliver Us The Moon, which already use the latest DLSS version): Control, where Remedy is updating the entire game from DLSS 1.0 to 2.0, and MechWarrior 5, which is getting DLSS for the first time. MechWarrior 5 is an Unreal Engine 4 game, and NVIDIA revealed there's now a DLSS 2.0 UE4 branch available for game developers, which should make integration even easier.
Let's check one example of Remedy's game first. While Control's original DLSS implementation was widely praised as possibly the best when the game launched last August, the newest version of Deep Learning Super Sampling can greatly improve quality in certain cases, such as objects in motion as you can see with the fan below.
Now, you might be wondering why Control is running slightly slower in this DLSS 2.0 example (69 FPS against 71). The reason is that the previous DLSS version featured in Remedy's game wasn't actually using a neural network; it was running on shaders in a way that's more akin to a clever use of temporal anti-aliasing upscaling (TAAU), as revealed by NVIDIA's Tony Tamasi during the presentation.
Keep in mind, though, that DLSS 2.0 comes with three different presets (Quality, Balanced, and Performance). The Performance mode, according to Tamasi, will now provide higher performance than the previous DLSS implementation featured in Control while yielding equal or slightly better image quality. As we reported some time ago, Control is getting an expansion called The Foundation on Thursday, March 26th, and that's when DLSS 2.0 is coming to the game.
MechWarrior 5: Mercenaries, on the other hand, didn't have Deep Learning Super Sampling at all at launch, which means RTX owners can enjoy a substantial performance boost once the update goes live later today. In the image below, activating DLSS yields a 33% performance increase at 1440p resolution on an RTX 2060 graphics card.
Even more impressive, though, is the zoomed version of the same image. DLSS 2.0, at least in its Quality mode, provides a sharper image than native resolution, particularly if you look at the railings. According to Tamasi, that makes sense: the image reconstructed via DLSS 2.0 actually has more data than the native one, since it accumulates data over time before reconstruction and fixes the temporal stability issues of the first version of DLSS.
That is not to say Deep Learning Super Sampling will always offer better image quality than native, but it does happen in certain cases.
Once the presentation was over, we asked Tamasi whether NVIDIA considered using Microsoft's DirectML API for Deep Learning Super Sampling or if it would be simply redundant. Here's what he had to say about that:
We did consider that. It is redundant since we essentially have a direct path to the Tensor Cores in the GPU, so there'd be no advantage to us in making use of DirectML. That's not to say we might never do that, but at least right now the plan is to handle this natively through the Tensor Cores and our own API.
NVIDIA DLSS 2.0 wasn't all that the briefing was about, anyway. We also learned that beyond the official Vulkan ray tracing extension, based on the extension originally developed by NVIDIA, a new scalable RTX Global Illumination SDK is now available for developers. This was actually a highly requested feature among studios, and it will allow them to keep their existing light baking/light probe approach to Global Illumination and then scale it up to ray tracing with the same workflow and integration.
Last but not least, Epic's Unreal Engine 4.25 update is also coming soon with fully integrated support for Microsoft's DirectX Raytracing API. Developers could use pre-release branches with DXR before, of course, but this will be the full mainline integration of DXR in UE4.
Of course, this is all in addition to the earlier DirectX 12 Ultimate news, where NVIDIA confirmed that all RTX graphics cards (of which 15 million units have been sold to date) fully support the features of the updated API, such as DXR 1.1, VRS, Mesh Shaders and Sampler Feedback. Exciting times indeed, as developers will have more and better tools to deliver high-end, optimized graphics in their games.