How To Set VSYNC The Right Way In Your Games
Welcome to the first of my occasional series on different aspects of technology. I hope you'll find it interesting and useful.
Lots of gamers set vsync (vertical synchronization) wrongly on their PCs, then wonder why their games look stuttery instead of animating smoothly, ruining their gaming experience. I see this on tech and gaming forums all the time, with plenty of bad advice handed out to fix it, so this guide aims to help you get it right and enjoy your games more.
This guide applies to all* games and graphics cards, since they all work the same way. There might be individual situations where certain games and/or graphics cards have weird results, but that's just that particular hardware and/or software not working quite right, or system performance being too low.
*Things might be a bit different with NVIDIA's latest RTX raytracing technology, but that's too new to comment on right now, so this guide explicitly excludes it and sticks to raster graphics, ie every game out there right now.
Single GPUs Rule
This guide is aimed more at single GPU users, since having 2 or (rarely now) 3 GPUs in your system tends to put limitations on your vsync options. There can also be the issue of microstutter, where the game looks stuttery even though vsync is set properly and there are no dropped frames. This is caused by delays in the GPUs communicating with each other over the SLI or CrossFire link. It can be impossible to eradicate in some instances, which must be annoying as hell for the gamer who spent all that money on those two cards and an SLI/CrossFire motherboard. One of several good reasons for sticking to one powerful GPU whenever possible.
How VSYNC Works
To set vsync properly, one must understand what it does: synchronize the graphics card with the monitor.
The monitor refreshes at a fixed frequency, typically 60Hz, 100Hz, 120Hz or 144Hz, with higher being better. So, to produce perfectly smooth, stutter and tear-free motion, the GPU has to draw new frames at the same rate as the monitor's refresh, and those frames have to be synchronized to it. The higher the refresh rate of the monitor, the faster and more expensive the PC must be to ensure this, especially the graphics card, so come on down, NVIDIA's new $1200 RTX 2080 Ti !! Nah, I'm kidding, spend something reasonable.
Since I've got an NVIDIA GTX 1080 in my system, the system-specific advice here will refer to NVIDIA. AMD users, your cards often have equivalent functions: have a look around the driver settings, or Google for it.
Most gamers still use fixed refresh rate monitors, but a graphics card's GPU doesn't create frames at a fixed frequency. Instead, the framerate bounces around all over the place, since scenes take different amounts of time to draw from moment to moment. This creates problems with the smoothness of the animation, and the problem is worse the lower the refresh rate of the monitor.
Using vsync fixes this problem, but unfortunately it's not a silver bullet and does have trade-offs between tearing, smoothness and lag. It's basically a compromise.
Of course, a PC won't always be able to render frames faster than the monitor's refresh, with the problem becoming worse the higher the refresh rate of the monitor and the weaker the PC, especially the graphics card. That said, even the weakest of PCs can show a desktop at 144Hz or 240Hz perfectly well, as long as the graphics card and monitor support it, and the extra fluidity of high refresh rates is quite apparent with just mouse and window movement on the desktop, so it makes for a nice enhancement there too. It's not just for games.
Tearing, Smoothness and Lag
What are tearing, smoothness and lag? Tearing is where the GPU flips to a new frame partway through a monitor refresh cycle, so the top and bottom of the screen briefly show parts of different frames. The result is a visible and distracting discontinuity in the moving picture, which looks a lot worse on lower refresh rate monitors, see screenshot below.
This is because there's a bigger time delay between refresh cycles, so a torn frame stays on screen longer. At 60Hz the interval is 16.7ms, at 120Hz it's 8.3ms, and at 240Hz (the highest widely available refresh rate) it's just 4.2ms, making for higher temporal resolution, so you can see how tearing looks worse at lower refresh rates. You need a really fast (and expensive) system to drive a 240Hz monitor without dropping frames though, and may still need to drop quality settings to achieve that framerate.
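If you want to play with the numbers yourself, the frame budget is just 1000 divided by the refresh rate. A quick Python sketch:

```python
# Frame budget: to avoid dropped frames, the GPU must finish each
# frame within 1000 / refresh_rate milliseconds.
def frame_budget_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240):
    print(f"{hz}Hz -> {frame_budget_ms(hz):.1f}ms per frame")
# 60Hz -> 16.7ms, 120Hz -> 8.3ms, 144Hz -> 6.9ms, 240Hz -> 4.2ms
```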
With slower paced games and simulators, such as flight and train sims, or animated chess games say, any lag introduced by vsync doesn't matter, so just switch it on and enjoy beautifully rendered video.
Another good option is just to switch vsync on anyway, even with a twitchy shooter in single player or online modes, especially if the monitor refresh is 120Hz or more and the PC can render the game with no dropped frames. Motion will be amazingly superfluid smooth with less motion blur and at these refresh rates, the extra lag introduced by vsync isn't too bad, so try it and see how you like it. You may also not mind the extra lag at 60Hz either, so more power to you.
One of the more popular misconceptions I see on forums is where a gamer, often with a 60Hz monitor, uses some third party utility like RivaTuner to manually cap the maximum framerate of the graphics card, independently of (unsynchronised with) the monitor's refresh rate, in a bid to reduce lag. For example, they might set it to something like 58fps or 62fps with vsync off. This, to put it delicately, is idiotic, and it can be quite hard to convince them otherwise (I've really, really tried at times).
They then go on to complain how the game is all stuttery and hitchy (small, highly annoying pauses or jumps every second or so). Well, duh! In short, the lag benefit is questionable and the visual artifacts severe, so never do this and don't listen to anyone that suggests you do. I'm likely to get flamed in the comments for saying this. Whatever.
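If you're curious why this goes wrong, here's a rough Python model (my own simplification, not how any driver actually works): frames finish at a steady capped rate while the 60Hz monitor shows whatever frame finished most recently, so a couple of times a second a refresh arrives with no new frame ready and repeats the old one, which you see as a hitch.

```python
# Rough model: the GPU finishes frames at a fixed fps, and each monitor
# refresh shows the newest frame finished so far. A refresh that repeats
# the previous frame is a visible hitch. Integer arithmetic keeps it exact.
def count_repeats(fps: int, refresh_hz: int, seconds: int = 1) -> int:
    repeats = 0
    last_shown = -1
    for r in range(refresh_hz * seconds):
        latest = (r * fps) // refresh_hz   # newest frame done by refresh r
        if latest == last_shown:
            repeats += 1                   # same frame shown twice: hitch
        last_shown = latest
    return repeats

print(count_repeats(58, 60))  # capped at 58fps: 2 hitches every second
print(count_repeats(60, 60))  # matched to the refresh: 0
```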
Now that we have this cleared up, remember I said it's a compromise? This means that you have options in how you set it and some experimentation may be in order to find the sweet spot.
Well, it means that you may wish to not use vsync at all and accept tearing and stutters in return for improved lag response. This is only worth it where the PC can render frames far above the monitor's refresh rate, at least double. This is good for twitchy first person shooters, especially for online or LAN play, where milliseconds count and where the monitor refresh is 60Hz. What would you rather have, beautifully rendered laggy video, or win the game, huh?
NVIDIA FastSync - Killer Feature
To get around this compromise, NVIDIA has a handy alternative called FastSync (AMD's version, which came later, is called Enhanced Sync), which they introduced a while back and which only works with the Maxwell generation and later, ie 900 series cards onward. For me it's a bit of a killer feature, as it tries to give you the best of all worlds and works very well.
It works best with cards that have a lot of video memory, such as 8GB on a GTX 1080 or AMD RX Vega 56/64 and that can render consistently faster than the monitor refresh rate (at least 3x is recommended by NVIDIA). The CPU of course has to keep up with that fast rendering performance, otherwise that might need upgrading, but that's for another article.
What this does, in a nutshell, is let the GPU freewheel and render as fast as possible into three buffers, asynchronously to the display, while the monitor only displays the most recently completed frame at each refresh, throwing the rest away. The downside is that this makes the graphics card work harder, consuming more power and kicking out more heat. There might be noticeable coil whine too, depending on the graphics card make and model.
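Here's a little Python sketch of the idea (my own simplified model, not NVIDIA's actual implementation): the GPU renders flat out, and at each refresh the display picks up the newest completed frame while everything rendered in between gets binned.

```python
# Simplified FastSync model: the GPU renders at render_fps into spare
# buffers; each monitor refresh displays the newest completed frame and
# the frames rendered in between are discarded.
def fastsync_frames(render_fps: int, refresh_hz: int, seconds: int = 1):
    shown = 0
    discarded = 0
    prev = -1
    for r in range(refresh_hz * seconds):
        # newest frame index finished by the end of refresh r
        # (frame i finishes at time (i + 1) / render_fps; exact int maths)
        latest = ((r + 1) * render_fps) // refresh_hz - 1
        if latest >= 0:
            shown += 1
            discarded += max(0, latest - prev - 1)  # rendered, never shown
            prev = latest
    return shown, discarded

# GPU freewheeling at 180fps on a 60Hz monitor over one second:
print(fastsync_frames(180, 60))  # 60 frames shown, 120 thrown away
```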
In practice, I found this usually works quite well, leading to fast and fluid gaming with low lag, but it can actually introduce its own stutter and hitching if it doesn't play perfectly with the game, or parts of it, so try it and see.
If this sounds similar to triple buffering, that's because it is. It's actually an enhanced version of it, designed to keep lag to an absolute minimum. Unlike the driver's triple buffering option, which is OpenGL only, this works with DirectX too.
Note that since FastSync is optimised for lowest lag, rendering isn't always smooth (NVIDIA themselves admit this can happen), so if this is a problem, then you might want to use regular vsync, or Adaptive VSync.
See the PC Perspective video at the end of this article for a full technical explanation of FastSync and triple buffering, featuring NVIDIA's Director of Technical Marketing, Tom Petersen talking to Ryan Shrout.
There's another vsync variant from NVIDIA called Adaptive VSync (not to be confused with VESA's Adaptive-Sync, covered later). What this does is turn vsync on when the GPU is capable of rendering frames faster than the monitor refresh rate, but turn it off when rendering slower. This helps maintain a higher overall framerate when the system can't keep up with the monitor refresh, but will introduce judder, tearing and maybe hitching, as the GPU is no longer synced with the monitor.
It might be a reasonable compromise, but a better solution might be to drop the monitor refresh down to 60Hz with vsync on (regular or adaptive), assuming of course that the system can manage at least this refresh rate most of the time.
Tearing Happens Above And Below Monitor Refresh With VSYNC Off
Some of you may now be shouting at the screen, or frantically posting in the comments, trying to tell me in no uncertain terms that tearing doesn't happen below the monitor's refresh rate. Well, this is just another misconception, and the tearing is all the more apparent at a 60Hz refresh. It tends to look awful too, especially with some games.
The fact is that tearing happens whether the framerate is above or below the monitor's refresh rate; it simply means that the GPU has flipped to a new frame partway through the monitor's scan. In fact, it can look even more of a mess when the GPU framerate is low.
By the same token, an ugly, uneven stutter/judder can also be very apparent when the GPU is rendering above the monitor's refresh rate with vsync off. In my experience, it tends to look worst on a 60Hz monitor with the GPU rendering at around 61-85fps.
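You can see why with a bit of arithmetic. With vsync off, the tear line sits wherever the scanout happens to be when the GPU flips buffers, and if the framerate doesn't match the refresh rate, that position drifts every flip. A quick Python sketch (a simplified model that ignores real-world frame time variation):

```python
# Where the tear line lands (0 = top of screen, 1 = bottom) at each
# buffer flip, for a GPU flipping at render_fps on a refresh_hz monitor.
# Integer arithmetic keeps the positions exact.
def tear_positions(render_fps: int, refresh_hz: int, n_flips: int = 6):
    return [((f * refresh_hz) % render_fps) / render_fps
            for f in range(1, n_flips + 1)]

# 75fps on a 60Hz monitor: the tear line crawls across the screen,
# never settling in one place.
print(tear_positions(75, 60))  # [0.8, 0.6, 0.4, 0.2, 0.0, 0.8]
```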
NVIDIA Adaptive VSync (half refresh rate)
This is an odd one, because at first glance, all Adaptive VSync (half refresh rate) does is mess up smoothness of motion by introducing constant judder.
As the name implies, the GPU will render frames at half the monitor's refresh rate, assuming it has enough performance to do so. This ensures an unpleasant juddery, double-image experience no matter what your monitor's refresh rate is. Lovely.
I can't see what the point of this mode is on a high performance system, other than fiddling about with it and marvelling at how awful it looks. Don't do it, kids! Let me know in the comments if you've found some use for it other than experimentation.
However, there are uses for it on constrained systems. It could be used to save power on a mobile GPU in a laptop that's running on battery power, or on a budget system with a low end graphics card that can't maintain 60fps at decent quality settings, but might still achieve a solid 30fps at those settings. This way, consistent framepacing can be achieved even on a lower end system, resulting in a nicer experience.
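Here's the framepacing argument in Python (my own sketch): with vsync on, a frame can only go up on a refresh boundary, so count how many refresh cycles each frame stays on screen.

```python
# With vsync on, frame f (ready at time f / render_fps) first appears on
# the next refresh boundary; exact integer ceiling division below.
def first_refresh(frame: int, render_fps: int, refresh_hz: int) -> int:
    return (frame * refresh_hz + render_fps - 1) // render_fps

def hold_times(render_fps: int, refresh_hz: int, n_frames: int = 8):
    # how many refresh cycles each rendered frame stays on screen
    return [first_refresh(f + 1, render_fps, refresh_hz)
            - first_refresh(f, render_fps, refresh_hz)
            for f in range(n_frames)]

print(hold_times(30, 60))  # locked 30fps: [2, 2, 2, ...] - even pacing
print(hold_times(45, 60))  # 45fps: [2, 1, 1, 2, 1, 1, ...] - uneven, judders
```

A locked 30fps holds every frame for exactly two refreshes, while an unlocked 45fps alternates between one and two, which is the judder you see.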
In fact, this technique is often used in current gen consoles such as the PlayStation 4 and Xbox One (all models), since these are much weaker than most PCs. Expect to see it somewhat less on the considerably more powerful Xbox One X, though.
The VSYNC Holy Grail: Adaptive Sync
Now we come to the newest way of synchronizing GPU framerate and monitor refresh: variable refresh rate, often just called adaptive sync. Currently, this comes in two flavours, NVIDIA's proprietary G-SYNC and AMD's FreeSync, which is built on VESA's open Adaptive-Sync standard. If you've got one of these systems, just turn it on and forget about it: you've got the best of all worlds. Note that the monitor has to support G-SYNC or FreeSync for this to work, and there are no monitors at the moment which support both systems, due to corporate rivalry between the graphics card manufacturers. It's a classic format war, and the sooner it's settled the better.
With these systems, the refresh works the other way round: the monitor only refreshes/scans out when the GPU has completed a new frame, meaning that the monitor's refresh rate bounces around in sync with the GPU. This removes all tearing and stuttering (within the adaptive sync range), helps with lag too, and looks great. Short of maintaining a solid 144fps or better with no dropped frames, which can be a tall order, this is the best there is, so be sure to use it.
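The behaviour boils down to clamping: within the monitor's supported range, the refresh interval simply follows the GPU's frame time. A Python sketch (the range numbers are illustrative, not from any particular monitor):

```python
# Variable refresh: the monitor waits for the GPU, so the refresh
# interval tracks the frame time, clamped to the panel's supported range.
def vrr_interval_ms(frame_time_ms: float, min_hz: float = 30.0,
                    max_hz: float = 144.0) -> float:
    fastest = 1000.0 / max_hz   # panel can't refresh faster than this
    slowest = 1000.0 / min_hz   # ...or hold a frame longer than this
    return min(max(frame_time_ms, fastest), slowest)

print(vrr_interval_ms(12.5))  # in range: refresh follows the GPU exactly
print(vrr_interval_ms(4.0))   # GPU faster than 144Hz: clamped to ~6.9ms
```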
Windows Patching and Driver Updates
Finally, whether you have an NVIDIA or AMD graphics card in your system, it's best to keep Windows fully patched and install the latest graphics driver from their respective sites.
People sometimes report patches or new drivers messing up their systems, but it's much less common than you might think, so go ahead and update. You'll reap the rewards of increased stability, security and compatibility with the latest games. Heck, as a PC enthusiast, you've got the skills to fix it if it screws up, right? If you really must hold off updating, then try not to wait more than a few days.
So there you have it, follow this guide and enjoy a better gaming experience!
Please let us have your feedback in the comments and suggestions for future articles.