On games this can be quite expensive in terms of frames per second and smoothness because of how it interacts with the monitor's refresh rate. If the next buffer isn't quite ready within 1/60th of a second for the start of the monitor's refresh, you basically have to sit there doing nothing until the monitor is ready for you again. Now the user sees the same image for two refreshes, then a jump, and your frame rate has fallen off a cliff. You can try to mitigate that with triple buffering, but that starts to eat into memory you want to use for other things.
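To make the timing concrete, here's a tiny standalone sketch (not real graphics code, just the arithmetic) of how a vsync'd, double-buffered loop quantises the frame rate: miss the 60 Hz deadline by even a millisecond and you snap down to 30 fps.

```cpp
// Minimal sketch: with vsync on and double buffering, a frame is always
// shown for a whole number of refresh intervals, so the effective rate
// quantises to 60, 30, 20... fps. Render times below are illustrative.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_hz = 60.0;
    const double vblank_interval_ms = 1000.0 / refresh_hz;  // ~16.67 ms

    const double render_times_ms[] = {10.0, 16.0, 17.0, 25.0, 34.0};

    for (double t : render_times_ms) {
        // The buffer flip waits for the next vblank.
        double intervals = std::ceil(t / vblank_interval_ms);
        double effective_fps = refresh_hz / intervals;
        std::printf("render %5.1f ms -> shown for %.0f refresh(es) -> %5.1f fps\n",
                    t, intervals, effective_fps);
    }
    return 0;
}
```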
AMD now has a standard for interacting with the monitor's refresh called FreeSync. The game can communicate with the monitor to tell it when the buffer is ready, which gets around the problems caused by a fixed 60 Hz refresh. A lot of 'gaming' branded screens support it now.
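And a rough sketch of what changes with a variable refresh rate: the monitor starts its refresh when the frame is ready, as long as the frame time stays inside the panel's supported range. The 40-144 Hz range below is just an example, not any particular panel's spec.

```cpp
// Same illustrative frame times as above, but under variable refresh rate
// (FreeSync/G-Sync): the refresh interval tracks the frame time, clamped
// to the panel's supported range, instead of quantising to 60/30/20 fps.
#include <algorithm>
#include <cstdio>

int main() {
    const double min_hz = 40.0, max_hz = 144.0;       // example panel range
    const double min_interval_ms = 1000.0 / max_hz;   // ~6.9 ms
    const double max_interval_ms = 1000.0 / min_hz;   // 25 ms

    const double render_times_ms[] = {10.0, 16.0, 17.0, 25.0, 34.0};

    for (double t : render_times_ms) {
        double interval = std::min(std::max(t, min_interval_ms), max_interval_ms);
        bool in_range = (t >= min_interval_ms && t <= max_interval_ms);
        std::printf("render %5.1f ms -> refresh every %5.1f ms (%5.1f Hz)%s\n",
                    t, interval, 1000.0 / interval,
                    in_range ? "" : "  <-- outside range, needs vsync or LFC");
    }
    return 0;
}
```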
(b) you only get judder with vsync=on; vsync=off just flips to the new buffer as soon as it's available. The downside is that you tend to get a visible seam or "tear" in the image when the buffer flips mid-refresh.
(c) the memory for triple-buffering is trivial, that's not the problem. The real problem is that you need to be pushing a framerate at least as high as your monitor's refresh rate, and ideally 2x your refresh rate. That's undesirable in a world where you don't have infinite money to spend on hardware.
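For anyone wondering how trivial: back-of-the-envelope, three 32-bit colour buffers come to a few tens of MiB even at 4K. Illustrative numbers only; real swap chains add alignment and extra surfaces.

```cpp
// Rough memory cost of triple buffering: three RGBA8 colour buffers at
// common resolutions, ignoring swap-chain overhead.
#include <cstdio>

int main() {
    struct Res { const char* name; long long w, h; };
    const Res resolutions[] = {{"1080p", 1920, 1080},
                               {"1440p", 2560, 1440},
                               {"4K",    3840, 2160}};
    const long long bytes_per_pixel = 4;  // RGBA8
    const int buffers = 3;                // triple buffering

    for (const Res& r : resolutions) {
        double mib = double(r.w * r.h * bytes_per_pixel * buffers) / (1024.0 * 1024.0);
        std::printf("%-5s: %lld x %lld x %d buffers ~= %.0f MiB\n",
                    r.name, r.w, r.h, buffers, mib);
    }
    return 0;
}
```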
(d) NVIDIA actually pioneered this technology; AMD came in with a copycat implementation after the fact. AMD has the advantage of not needing special hardware in the monitor, but it usually comes with the disadvantage of only being able to sync over a narrow range, such as 40-60 Hz. Some monitors have a sync range as narrow as 10 Hz, and many of them tend to flicker once they get down to the lower end of their range. There are currently a total of three FreeSync monitors on the market that don't totally suck.
> with the disadvantage of only being able to sync over a narrow range, such as 40-60 Hz.
1) This is not a problem with FreeSync itself but with the panels and their support for VRR.
2) I have 2 FreeSync monitors at home which both do 40-144 Hz (Acer XF270HUA and BenQ XL2730Z). In addition, more than 2 years ago AMD added LFC (Low Framerate Compensation) to handle the classic "FPS drops below the FreeSync range" issue. I don't notice any judder when my framerate dips below the range for 1 or 2 frames.
Your info is out of date. Nowadays FreeSync is just as good as GSync if you get a monitor of good quality (and you still pay $150-300 less just because it isn't GSync).
LFC only works on models that already have a wide sync range, which is exactly why panels like those 40-60 Hz ones are a problem in the first place. The top end of the sync range needs to be at least 2.5x the bottom end (LFC works by doubling frames up when you drop off the bottom of the range).
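Here's a small sketch of that frame-multiplication arithmetic (the actual driver heuristics aren't public); it shows why a narrow 40-60 Hz panel can't always be rescued by LFC while a 40-144 Hz one can.

```cpp
// Illustrative LFC arithmetic: when the game's frame rate drops below the
// panel's minimum refresh rate, repeat each frame so the presented rate
// lands back inside the sync range. A narrow range leaves a gap where no
// integer multiple fits.
#include <cstdio>

// Returns the frame-repeat multiplier, or 0 if no multiple fits the range.
static int lfc_multiplier(double fps, double min_hz, double max_hz) {
    for (int m = 1; fps * m <= max_hz; ++m) {
        if (fps * m >= min_hz) return m;
    }
    return 0;
}

int main() {
    struct Range { const char* name; double min_hz, max_hz; };
    const Range panels[] = {{"narrow 40-60 Hz", 40.0, 60.0},
                            {"wide 40-144 Hz",  40.0, 144.0}};
    const double game_fps[] = {45.0, 35.0, 25.0};

    for (const Range& p : panels) {
        std::printf("%s:\n", p.name);
        for (double fps : game_fps) {
            int m = lfc_multiplier(fps, p.min_hz, p.max_hz);
            if (m)
                std::printf("  %4.1f fps -> show each frame %dx -> %5.1f Hz\n",
                            fps, m, fps * m);
            else
                std::printf("  %4.1f fps -> no multiple fits the range (fall back to vsync)\n",
                            fps);
        }
    }
    return 0;
}
```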
XF270HU/HUA (-A is a later revision of the same model) is one of the handful of good models (wide sync range). The Nixeus EDG has a wide sync range plus is the only FreeSync model known to have Adaptive Overdrive, which is important for preventing ghosting.
There are a very small handful of others, but only about 1/3 of FreeSync monitors even support LFC, and the EDG is the only (!) one that does Adaptive Overdrive, which is a basic feature on all GSync monitors. So no, this is not "out of date" at all; there really are only a handful of FreeSync monitors that are "just as good as GSync".
No. When NVidia created GSync, they enforced quality standards far beyond what the specification could manage, using proprietary technology and consultant engineers to accomplish it. AMD copied their efforts with Freesync by getting some of it added to the DisplayPort protocol. NVidia sees it as an inferior copy and won't support it; AMD and many others see GSync as an exclusionary market grab through tie-ins.
Despite many claims to the contrary from Freesync supporters, it's still the case that only a handful of Freesync displays will give an experience close to what nearly all GSync displays deliver.
> If the next buffer isn’t quite ready in 1/60th of a second
As a UK developer this caused me a real problem when writing for the US market. We'd wring out all the performance we could developing against the 50 Hz refresh rate we had here in the UK, and then suddenly lose 20% of our CPU time for the US market. Not always a problem, but it was certainly a consideration in how screen refreshes were timed.
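The budget arithmetic, for anyone who hasn't done the 50 vs 60 Hz sums in a while (a quick standalone sketch, nothing from the original post):

```cpp
// Frame-budget arithmetic for UK 50 Hz vs US 60 Hz development:
// how many milliseconds you get per refresh, and how much more time the
// 50 Hz frame gives you relative to the 60 Hz one.
#include <cstdio>

int main() {
    const double uk_hz = 50.0, us_hz = 60.0;
    const double uk_ms = 1000.0 / uk_hz;   // 20.0 ms per frame
    const double us_ms = 1000.0 / us_hz;   // ~16.7 ms per frame

    std::printf("50 Hz budget: %.2f ms per frame\n", uk_ms);
    std::printf("60 Hz budget: %.2f ms per frame\n", us_ms);
    std::printf("a 50 Hz frame gives ~%.0f%% more time per frame than a 60 Hz frame\n",
                100.0 * (uk_ms - us_ms) / us_ms);
    return 0;
}
```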