
FreeSync vs G-Sync – AMD and Nvidia face off for adaptive sync dominance

Screen tearing is one of the biggest irritations facing PC gamers today. It’s especially galling for players who want quick response times in fast-paced genres such as FPS and RTS titles, but the problem affects games and gamers across the board, from budget PCs to high-end monsters. It’s a problem that graphics card and monitor makers have finally come together to fix.

Nvidia and AMD have two differing solutions to this problem, both falling under the umbrella of adaptive sync (also known as dynamic refresh rates). The two firms market their technologies differently, but they solve the same problems in a very similar way; it’s the hardware implementations that vary slightly. In this article, we’ll explain how the technology works and what to weigh up if you’re in the market for a monitor or graphics card.

However, it’s not all rosy – while consumers will soon have a choice of technologies (competition always being a good thing), the camps are very much divided. Nvidia cards won’t work with FreeSync monitors, and AMD cards won’t work with G-Sync monitors. This leaves consumers with a difficult choice, as your choice of monitor will potentially lock you to one manufacturer or the other for the life of your display.


Frame tearing

Gamers with high-performance systems often run into the problem of frame tearing. This is caused by the refresh rate of the monitor being out of sync with the frames being produced by the graphics card.

A 60Hz monitor refreshes 60 times per second, but your graphics card’s output will vary, because the load placed on it by on-screen events varies. As a result, when your screen refreshes, the graphics card may have only drawn part of a frame, so you end up with two or more frames on screen at once, producing distracting jagged edges whenever there’s fast-paced action on screen.

Image: frame tearing caused by an out-of-sync graphics card and monitor panel (Nvidia diagram)
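
The mismatch is easy to see with some back-of-envelope arithmetic. The sketch below is ours, not Nvidia’s, and assumes a hypothetical GPU frame time of 21ms (roughly 48fps); it counts how many refreshes of a 60Hz panel catch the graphics card mid-way through swapping in a new frame:

```python
REFRESH_MS = 1000 / 60   # a 60Hz panel starts a new scanout every ~16.7ms
FRAME_MS = 21.0          # hypothetical GPU frame time (~48fps)

# times at which the GPU finishes, and immediately swaps in, a new frame
swaps = [k * FRAME_MS for k in range(1, int(1000 // FRAME_MS) + 1)]

torn = 0
refreshes = 0
for i in range(60):                       # one second of refreshes
    start, end = i * REFRESH_MS, (i + 1) * REFRESH_MS
    # a tear appears when the front buffer changes mid-scanout
    if any(start < s < end for s in swaps):
        torn += 1
    refreshes += 1

print(f"{torn} of {refreshes} refreshes showed parts of two frames")
```

In this toy model, nearly four out of five refreshes display pieces of two different frames, which is why tearing is so visible once frame rates drop below the refresh rate.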

This can be solved by turning on vertical sync (Vsync) in-game, which forces the graphics card to match the refresh rate of the monitor, typically delivering 60 complete frames per second. However, many cards can’t consistently render frames that quickly, and because the monitor still has to display 60 frames each second, some frames are repeated until the next one has been fully drawn. This leads to input lag and stuttering that many find even more unpleasant than screen tearing.

Image: stuttering caused by Vsync (Nvidia diagram)
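
The stutter has a simple cause: under Vsync a finished frame must wait for the next refresh tick, so every frame stays on screen for a whole multiple of the refresh interval. A hedged sketch (our numbers, assuming a hypothetical 22ms frame time) shows the resulting judder:

```python
import math

REFRESH_MS = 1000 / 60   # the panel still refreshes every ~16.7ms
FRAME_MS = 22.0          # hypothetical GPU frame time (~45fps)

# With Vsync on, a finished frame is held back until the next refresh tick.
shown_for = []
for k in range(1, 21):                             # first 20 frames
    done = k * FRAME_MS                            # GPU finished frame k here
    first = math.ceil(done / REFRESH_MS)           # first refresh to show it
    nxt = math.ceil((k + 1) * FRAME_MS / REFRESH_MS)
    shown_for.append(round((nxt - first) * REFRESH_MS, 1))

# frames alternate between one and two refreshes on screen: visible judder
print(sorted(set(shown_for)))
```

Some frames are displayed for 16.7ms and others for 33.3ms even though the card produces them at a steady pace, and that uneven cadence is exactly the stutter gamers complain about.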

Because graphics cards and monitors traditionally exchange little more than basic display information, there’s no way to sync the frame output of the card with the refresh rate of the monitor. G-Sync and FreeSync solve this problem in the same way, although each uses slightly different technology to do so.

Image: adaptive sync controls when your monitor refreshes (Nvidia diagram)

With G-Sync and FreeSync, the graphics card and monitor communicate with one another, and the graphics card can control the refresh rate of the monitor. Your 60Hz monitor could become, say, a 49Hz, 35Hz or 59Hz screen, changing dynamically from moment to moment depending on how your graphics card is performing.

This eliminates both Vsync stutter and frame tearing, because the monitor only ever refreshes when it has been sent a fully drawn frame. The impact is obvious and incredibly impressive, and it’s particularly strong on mid-range machines with fluctuating frame rates. High-end machines benefit too, although not to the same extent.
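
In effect, the panel now waits for the frame rather than the other way round, within whatever refresh range it supports. A minimal sketch (the 30–60Hz range and the frame times are made-up illustration values, not from either vendor):

```python
# A hypothetical panel with a 30-60Hz variable refresh range
MIN_MS = 1000 / 60                # can't refresh faster than every ~16.7ms
MAX_MS = 1000 / 30                # or slower than every ~33.3ms

def refresh_interval(frame_ms):
    """With adaptive sync, the panel refreshes when the frame arrives,
    clamped to the range the hardware supports."""
    return min(max(frame_ms, MIN_MS), MAX_MS)

# fluctuating frame times from a mid-range card (made-up numbers)
frame_times = [22.0, 18.5, 25.0, 30.1, 15.0]

intervals = [round(refresh_interval(f), 1) for f in frame_times]
print(intervals)   # the 15.0ms frame is held to the panel's 16.7ms floor
```

Each refresh matches its frame, so nothing is repeated and nothing is torn; only frames that arrive outside the panel’s range get clamped.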


Nvidia was first to market with its G-Sync technology, with launch partners including AOC, Asus and Acer. The technology is impressive, but it has a drawback: to be G-Sync compatible, a screen needs G-Sync-specific hardware that’s rather expensive, unofficially adding around £75 to the price of any given monitor.

G-Sync monitors require a proprietary Nvidia scaler module to function, which means all G-Sync monitors have similar on-screen menus and options, and all carry a price premium. Monitor manufacturers building FreeSync displays, by contrast, are free to choose scalers from any manufacturer that produces compatible hardware.

FreeSync, which is an AMD technology, uses the Adaptive-Sync standard built into the DisplayPort 1.2a specification. Because it’s part of the DisplayPort standard ratified by the VESA consortium, any monitor with a DisplayPort 1.2a input is potentially compatible. That’s not to say it’s a free upgrade; specific scaler hardware is still required for FreeSync to work, but with multiple third-party scaler manufacturers (Realtek, Novatek and MStar) signed up to make FreeSync-compatible hardware, pricing should stay competitive.

While DisplayPort 1.2a is an open standard that can be used by anyone, Nvidia’s latest 900-series graphics cards don’t use it, with the firm saying it’s going to continue focusing on G-Sync instead. Some monitor manufacturers are sticking with Nvidia for now, too.

One clear difference between Nvidia G-Sync and AMD FreeSync is how they handle graphics cards that produce higher frame rates than a monitor can handle. G-Sync locks frame rates to the upper limit of the monitor while FreeSync (with in-game Vsync turned off) will allow the graphics card to produce a higher frame rate. This introduces tearing, but also means that input lag is at an absolute minimum, which is important for twitch gamers such as those who play FPS titles.
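
The difference above the panel’s ceiling boils down to whether the frame rate gets capped. A toy model of the trade-off (our numbers, not from either vendor):

```python
PANEL_MAX_HZ = 60
gpu_fps = 110   # the card renders faster than the panel can refresh

# G-Sync style: the frame rate is pinned to the panel's maximum refresh,
# so there's no tearing, but a finished frame may wait for the next refresh.
gsync_fps = min(gpu_fps, PANEL_MAX_HZ)

# FreeSync with in-game Vsync off: the card keeps rendering flat out; the
# panel refreshes at its maximum and can show parts of two frames (tearing),
# but fresher frame data reaches the screen, minimising input lag.
freesync_fps = gpu_fps
max_wait_ms = 1000 / PANEL_MAX_HZ   # worst-case wait under the capped model

print(gsync_fps, freesync_fps, round(max_wait_ms, 1))
```

Under the capped model a frame can sit waiting for up to one full refresh (~16.7ms on a 60Hz panel), which is the latency twitch gamers are trading tearing away to avoid.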


With such similar technology, your choice of monitor and graphics card may ultimately come down to your current situation. Older Nvidia cards and AMD cards (including APUs) can be updated to work with G-Sync and FreeSync monitors respectively, so your current setup may be ready without the need to buy a new graphics card.

AMD: According to AMD, the following existing GPUs will be able to use FreeSync for dynamic refresh rates in games after a software update: Radeon R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260. Other cards and chipsets will support FreeSync only for “video playback and power-saving purposes”; these include Radeon HD 7000, HD 8000, R7 and R9 series cards, plus APUs from the Kaveri, Kabini, Temash, Beema and Mullins lines.

Nvidia: Plenty of older Nvidia cards are compatible with G-Sync. The full list is as follows: GeForce GTX TITAN Black, TITAN, 980, 970, 780 Ti, 780, 770, 760, 750 Ti, 750, 745, 690, 680, 670, 660 Ti, 660 and 650 Ti BOOST.

The most important thing to take away is that even fairly old mid-range cards from both AMD and Nvidia support adaptive sync, so you don’t need to buy a new card to reap the benefits of either technology. The lineup of monitors supporting FreeSync is already enviable (on paper), and there are some great G-Sync monitors on the market right now from Acer, Asus (the ROG Swift, in particular, is staggering) and AOC.

The problem, of course, is the incompatibility of the two systems. If adaptive sync is important to you and you’re looking to buy a new graphics card or monitor, you should wait and see how the market develops, how much new G-Sync and FreeSync monitors cost, and exactly how different the two technologies end up being. It’s a confusing time for consumers, which is a shame because the technology itself is so incredibly useful.

If you’re buying a brand-new graphics card and have an adaptive sync monitor in mind, AMD looks to be in a strong position. Adaptive sync greatly benefits modest hardware, and those with mid-range cards will appreciate the lower cost overhead that a DisplayPort 1.2a monitor looks to have over a G-Sync one.

