AMD FreeSync Vs Nvidia G-Sync – Dissected And Compared
FreeSync has just launched, bringing much-needed competition to the variable refresh rate gaming monitor scene, but how does it compare to G-Sync? That's the million-dollar question we hope to answer by the end of this piece. But first, let's briefly break down how the technologies work and the benefits they offer.
How Variable Refresh Rate Technology Improves The Experience
A variable refresh rate solves three distinct issues in games. The first is screen tearing, which occurs whenever the frame rate is mismatched with the fixed refresh rate of the monitor.
The second and third issues are stutter and input lag, and they're somewhat related to the first. To solve tearing you would have to enable what is called V-Sync, which attempts to match the highly variable frame rate of your game to the fixed refresh rate of your monitor. It does this by forcing each completed frame to wait for the next refresh cycle instead of being displayed immediately, mid-scan, which is what causes tearing. That waiting can create both stutter and input lag.
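To make the latency cost of that wait concrete, here's a minimal sketch. The helper name and the 16.7ms interval (a 60Hz panel) are illustrative assumptions of mine, not anything taken from a real driver:

```python
import math

def vsync_display_time(frame_ready_ms, refresh_interval_ms=16.7):
    """With V-Sync on, a frame finished mid-scan is held until the
    next refresh boundary instead of being shown immediately."""
    next_boundary = math.ceil(frame_ready_ms / refresh_interval_ms)
    return next_boundary * refresh_interval_ms

# A frame ready 20ms into a 60Hz cycle is held until the 33.4ms
# boundary, adding roughly 13ms of lag on top of the render time.
```

If every frame rendered in exactly one refresh interval there would be no waiting at all; it's the variability of real frame times that turns this hold into visible stutter and input lag.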
AMD FreeSync Vs Nvidia G-Sync, Slightly Different Approaches
Both technologies work similarly by enabling the monitor to refresh dynamically instead of at a fixed rate of, for example, 60 times per second (60Hz) or 120 times per second (120Hz). The idea is that the monitor only refreshes when there's a frame sent from the GPU, so the monitor's refresh rate is always matched to the frames the GPU delivers. No tearing ever occurs, because frames never overlap on screen, and there's no input lag or stutter caused by V-Sync.
So far so good, but how do the technologies differ? FreeSync and G-Sync diverge in several ways. FreeSync is based on the Adaptive-Sync open standard in DisplayPort 1.2a and later versions, in addition to the integrated display controller inside AMD's graphics processors. G-Sync, on the other hand, relies on what is called the G-Sync module, a chip embedded into the control board of the display panel. This module handles the coordination between the graphics card and the monitor, a job that's handled directly by the GPU's integrated display controller in AMD's FreeSync implementation.
So G-Sync requires additional hardware beyond your graphics card and the monitor to function. The G-Sync module isn't cheap either: it's based on a reprogrammable FPGA and has an integrated frame buffer, both of which add to the cost. G-Sync also requires monitor makers to pay Nvidia a licensing fee to integrate the technology into their monitors. As a result, of two otherwise identical FreeSync and G-Sync monitors, the FreeSync model typically ends up around $100 less expensive. One example is Acer's XG270HU, which is $100 cheaper than its G-Sync enabled sibling, the XB270HU.
Also, because G-Sync relies on the G-Sync module rather than the integrated display controller inside the GPU, the technology ends up being compatible with older graphics cards as well. G-Sync, Nvidia tells us, is compatible with most GTX 600 series graphics cards and up. Due to FreeSync's reliance on the integrated display controller, the technology is only compatible with AMD's more recent GPUs. However, AMD also tells us that all of the company's future GPUs will support the technology.
Another difference between FreeSync and G-Sync is the flexibility of the effective refresh rate range. G-Sync supports refresh rates from 30Hz to 144Hz, while the FreeSync spec allows refresh rates from 9Hz to 240Hz. That doesn't mean you will find monitors with a 9Hz-240Hz range; it means monitor makers can build monitors around any range that falls within 9Hz-240Hz. FreeSync monitors available right now feature a variety of ranges.
Apart from cost, there are some other drawbacks to Nvidia's G-Sync module. The module limits the outputs a G-Sync monitor can support: G-Sync monitors only offer DisplayPort and do not allow for anything beyond the most basic monitor features. Because monitor makers cannot use their own feature-rich scaler chips to drive these monitors, they cannot include any of their differentiating features either. The G-Sync module also allows for only very rudimentary color processing and does not support any form of audio output.
Now that we've discussed the hardware, let's discuss the software involved. Both G-Sync and FreeSync function identically when your game's FPS falls within the variable refresh rate range of the monitor, so between 40 and 144 FPS for the Acer and BenQ monitors and between 48 and 75 FPS for the LG monitors. Apart from a very minor (~2%) performance penalty with G-Sync, both technologies work pretty much exactly the same inside the variable refresh window.
However, the story is a little different when your game's FPS dips below the minimum refresh rate, say 39 FPS on the Acer or BenQ monitors. Once that occurs, G-Sync and FreeSync deal with the situation quite differently. With FreeSync, the panel simply reverts to a fixed refresh rate matching the lowest refresh rate the panel is capable of: 40Hz in the case of the Acer and BenQ monitors and 48Hz in the case of the LG monitors. At this point you have to choose whether to enable V-Sync and trade tearing for input latency, or play without V-Sync and accept tearing. And because the panel is operating at its minimum refresh rate rather than its maximum, both the input latency and the tearing become more pronounced.
Earlier G-Sync monitors dealt with this situation not very differently from FreeSync: the panel would simply stick to its maximum refresh rate, resulting in increased input latency. It wasn't as severe as FreeSync's behavior, though, since the monitor was fixed at the maximum rather than the minimum refresh rate. Unfortunately, G-Sync doesn't give you the choice of V-Sync on or off; it always stays on, so you can't trade that latency away for tearing. I would still consider fixing the monitor at the maximum refresh rate rather than the minimum to be the superior solution. Fortunately, changing FreeSync's current behavior below the minimum refresh rate would only require a driver update, which I do hope AMD seriously considers.
UPDATE April 13th 2015
As of this date, AMD has officially stated that it has chosen to refresh the panel at the maximum refresh rate rather than the minimum when the FPS dips below the FreeSync operating range, precisely the behavior we had called for. And just as we originally stated, this behavior is enforced directly through the drivers, which is why AMD was able to introduce it without much trouble.
— Robert Hallock (@Thracks) April 13, 2015
G-Sync employs a very clever trick in its latest iteration. Unlike the previous behavior below the minimum refresh rate, G-Sync no longer simply fixes the panel at its maximum refresh rate. Instead, the panel displays each frame at a refresh rate that's twice the FPS.
Let's say your game's FPS falls to 25 (40ms per frame). The display panel can't simply drop to 25Hz, for reasons tied to brightness and pixel longevity. So what G-Sync does is display each frame twice at a rate of 50Hz (20ms per refresh, twice = 40ms), so the frame stays on screen for exactly as long as it needs to. This provides an experience identical to a panel capable of 25Hz. The only downside is that the technique can cause a perceptible change in brightness on some panels, i.e. flickering. So it's certainly not a perfect solution, but it's still a viable and more satisfying alternative to the current FreeSync implementation.
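The arithmetic above can be sketched in a few lines. This is only an illustration of the idea, not Nvidia's actual logic; the function name and the assumption that the driver picks the smallest integer multiple landing at or above the panel's minimum refresh rate are mine:

```python
def duplicated_refresh(fps, min_hz):
    """Pick the smallest integer multiplier that lifts the frame rate
    back into the panel's supported range, then show each frame that
    many times. Total on-screen time per frame is unchanged."""
    multiplier = 1
    while fps * multiplier < min_hz:
        multiplier += 1
    return fps * multiplier, multiplier

# 25 FPS on a 30Hz-minimum panel: refresh at 50Hz, each frame shown
# twice, so a frame still occupies 2 x 20ms = 40ms, its natural duration.
```

Because the multiplier is an integer, each frame's on-screen duration comes out exactly right, which is why the result is indistinguishable from a panel that could genuinely refresh at 25Hz.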
The technologies also differ in how they handle the framerate exceeding the maximum refresh rate of the monitor. With FreeSync, you have the option of enabling or disabling V-Sync; with G-Sync, you only get V-Sync enabled behavior. Nvidia is, however, currently considering adding the option to disable V-Sync once you hit the maximum refresh rate.
So in summary, there are advantages and disadvantages to both approaches.
FreeSync Pros :
- Easier to integrate into a wider range of monitors due to lack of any additional hardware.
- Significantly less expensive than G-Sync.
- Enables all the usual monitor features and display outputs.
- Gives users the option of V-Sync on or off.
FreeSync Cons :
- Currently limited to six graphics cards and six APUs. [UPDATED Nov 4 2015 : Majority of graphics cards launched in 2014 and after support FreeSync]
- Reverts to the monitor's maximum fixed refresh rate when the framerate dips below the minimum threshold. *No longer the case since AMD introduced FreeSync Low Framerate Compensation (LFC) on November 4 2015 via the Crimson driver package, although not all FreeSync monitors support this feature.
G-Sync Pros :
- Compatible with a wider range of graphics cards.
- Frame duplication extends G-Sync's functionality below the minimum threshold, though it may cause flickering. This feature works like FreeSync's Low Framerate Compensation (LFC) but is present across all G-Sync monitors.
- Availability of 4K G-Sync monitors with 30-60Hz ranges. (FreeSync 4K options are limited to 40-60Hz, i.e. no LFC.)
- Though overall monitor selection is more limited than FreeSync's, G-Sync monitors generally perform better when it comes to ghosting.
G-Sync Cons :
- Requires dedicated, costly Nvidia-made scaler hardware in desktop monitors, making it measurably more expensive than FreeSync.
- Limits monitor features, omits audio support and restricts display outputs to DisplayPort only.
- Currently doesn't give users the option to disable V-Sync above the maximum refresh rate of the monitor.