G-Sync support coming to FreeSync monitors

Nvidia will now bring G-Sync support to select FreeSync screens:
https://www.theverge.com/circuitbreaker/2019/1/7/18171631/nvidia-g-sync-support-freesync-monitors

Nvidia announced at CES 2019 that it will bring G-Sync support to several FreeSync monitors, removing the need to buy a monitor specifically certified and branded by the leading GPU maker. Twelve FreeSync displays have been confirmed to get G-Sync compatibility through a driver that will be released on January 15th.


The new G-Sync Compatible screens:
Acer XFA240
Acer XG270HU
Acer XV273K
Acer XZ321Q
Agon AG241QG4
AOC G2590FX
Asus MG278Q
Asus VG258Q
Asus VG278Q
Asus XG248
Asus XG258
BenQ XL2740


It will also be possible to use more FreeSync screens:
https://www.engadget.com/2019/01/07/nvidia-freesync-g-sync-certification/?guccounter=1

It also notes that even if your FreeSync display isn't tested, it might still work. "For VRR monitors yet to be validated as G-SYNC Compatible, a new NVIDIA Control Panel option will enable owners to try and switch the tech on -- it may work, it may work partly, or it may not work at all," the company wrote.
 
So they're saying the lock-in attempt wasn't successful, and that they can now use a standard that's been built into almost every DisplayPort monitor for several years?
 
Nvidia says that gamers with monitors it hasn't tested, "or that have failed validation," will be able to manually enable VESA Adaptive Sync. Nvidia didn't make it clear whether this differs in practice from the automatically enabled mode above, but we expect the actual difference will depend on the display in question.
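
Side note on why "it may work, it may work partly, or it may not work at all": the refresh window a monitor can vary across is advertised in its EDID, in the Display Range Limits descriptor, and that range (plus panel quality) largely decides the experience. Here's a minimal Python sketch of reading it, assuming a Linux sysfs path for the raw EDID bytes; whether a given driver actually derives its VRR window from this descriptor rather than a vendor-specific block is an assumption and varies by vendor:

```python
# Minimal sketch: read a monitor's advertised vertical refresh range from
# its EDID. The 128-byte base EDID block holds four 18-byte descriptors at
# fixed offsets; a Display Range Limits descriptor (tag 0xFD) carries the
# min/max vertical rates in Hz - the window a VRR driver has to work within.

def vertical_refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the Display Range Limits descriptor,
    or None if the monitor doesn't include one."""
    if len(edid) < 128:
        raise ValueError("need at least the 128-byte EDID base block")
    for offset in (54, 72, 90, 108):          # the four descriptor slots
        d = edid[offset:offset + 18]
        # Display descriptors (unlike detailed timings) begin 0x00 0x00,
        # and byte 3 is the tag; 0xFD marks Display Range Limits.
        if d[0] == 0x00 and d[1] == 0x00 and d[3] == 0xFD:
            return (d[5], d[6])               # min/max vertical rate, Hz
    return None

if __name__ == "__main__":
    # Assumed Linux path; on other OSes the EDID comes from elsewhere.
    with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
        rng = vertical_refresh_range(f.read())
    if rng:
        print(f"Advertised refresh range: {rng[0]}-{rng[1]} Hz")
    else:
        print("No Display Range Limits descriptor in this EDID")
```

A wide gap between those two numbers is roughly what separates the "may work" monitors from the "may work partly" ones.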
 
The launch seems to be a bit of an attempt by NVIDIA to brand their "FreeSync"-enabled G-Sync monitors as the PREMIUM solution.

The aim then being to ultimately upsell to the hardware-based G-Sync solution with its "luxury NVIDIA" tax, showing punters that this truly is the product for the Elite gamers of this world.

From a consumer point of view, NVIDIA caving in and adopting what has become the de facto standard is a good thing. It looks like Intel announcing a short while ago that they would be adopting "FreeSync" for their upcoming graphics cards was the final nail in the coffin.



NB: I'm aware that FreeSync is AMD's branded name for an open standard. However, it is the de facto recognised name, just as "GPU" was the name that NVIDIA coined many years ago for what had previously been referred to as 3D graphics cards.
 
Apparently some new TVs have adaptive sync - maybe that has something to do with Nvidia's new stance?

NVIDIA has known hardware G-Sync was a dead-end since VESA announced Adaptive Sync in the first place.

They are holding on to what control of the ecosystem they have for as long as possible, by any means possible.
 
10 credits on them screwing things up in a way that breaks several months of upcoming drivers on any sub-2000-series card too.

Oh wait, they already did that :D
 

Robert Maynard

Volunteer Moderator
NVIDIA has known hardware G-Sync was a dead-end since VESA announced Adaptive Sync in the first place.

They are holding on to what control of the ecosystem they have for as long as possible, by any means possible.

Which would imply that G-SYNC was DOA - as adaptive sync forms an optional part of the DisplayPort 1.2a specification, released in January 2013.
 
Which would imply that G-SYNC was DOA - as adaptive sync forms an optional part of the DisplayPort 1.2a specification, released in January 2013.

It was always going to be DOA. It was just a means to make more money out of nothing. While Nvidia make good GPUs, their business ethics are not very good in my opinion.
 
Which would imply that G-SYNC was DOA - as adaptive sync forms an optional part of the DisplayPort 1.2a specification, released in January 2013.

Adaptive Sync was added to the DP 1.2a spec in mid-2014, well after the original spec was published and after G-Sync VRR was a thing.

NVIDIA deserves some credit for getting a viable desktop VRR implementation out first and accelerating the overall VRR push, but they had to have known the hardware they were pushing was unlikely to see broad adoption and that they would eventually have to switch to VESA's standard.
 

Robert Maynard

Volunteer Moderator
Adaptive Sync was added to the DP 1.2a spec in mid-2014, well after the original spec was published and after G-Sync VRR was a thing.

NVIDIA deserves some credit for getting a viable desktop VRR implementation out first and accelerating the overall VRR push, but they had to have known the hardware they were pushing was unlikely to see broad adoption and that they would eventually have to switch to VESA's standard.

That'll teach me to trust the Wikipedia article... :)

Credit where it's due - although I do wonder how long adaptive-sync was in draft for the standard before G-Sync was invented - but the usual disdain for attempting to create a locked ecosystem reliant on their hardware.

Reminds me of PhysX - except for the innovation bit - as they just bought that up.
 
That'll teach me to trust the Wikipedia article... :)

Credit where it's due - although I do wonder how long adaptive-sync was in draft for the standard before G-Sync was invented - but the usual disdain for attempting to create a locked ecosystem reliant on their hardware.

Reminds me of PhysX - except for the innovation bit - as they just bought that up.

It was certainly in draft, as the old R9 290, launched in late 2013, supports FreeSync. That indicates that FreeSync/Adaptive Sync was already on the table, as the graphics cards had to have the correct support designed in. Hence earlier AMD chips don't support FreeSync, and it also explains why only 10-series and 20-series NVIDIA cards will support it.

It's notable that NVIDIA's Pascal architecture was announced in 2014. Adaptive Sync support will have been baked into the chips during this design process in light of AMD launching FreeSync. NVIDIA knew back then that they had to have a plan B.

I could also speculate that, with NVIDIA's shares tanking and profit expectations being hit after gambling big on the crypto-mining boom, NVIDIA may be looking to trim loss-making products. Hence, by quietly shelving hardware G-Sync, they are cutting and running.

It's of note that HDMI 2.1 comes with variable refresh rate (VRR, the same idea as FreeSync) baked into the standard. Therefore, as new TVs and monitors launch with HDMI 2.1, they will automatically support it.
NVIDIA would therefore have no chance trying to compete against AMD and Intel, nor for that matter the games consoles, which both support FreeSync, though only Microsoft has so far formally announced support. I suspect Sony will only formally announce support once they have started selling FreeSync-compatible TVs.
 
I do wonder how long adaptive-sync was in draft for the standard before G-Sync was invented

It was certainly in draft, as the old R9 290, launched in late 2013, supports FreeSync. That indicates that FreeSync/Adaptive Sync was already on the table, as the graphics cards had to have the correct support designed in. Hence earlier AMD chips don't support FreeSync, and it also explains why only 10-series and 20-series NVIDIA cards will support it.

Embedded DisplayPort (eDP) has had variable refresh rate support in the standard for quite some time, as a power-saving feature for mobile use (refreshes are expensive, so doing them as infrequently as possible is a good way to extend battery life). These features were later co-opted for VESA Adaptive Sync in the DP 1.2a spec.

Any graphics hardware that had eDP support (almost all of them, as the same designs were used in mobile parts) would likely have been easy to adapt to DP 1.2a's Adaptive Sync.

Also, NVIDIA's mobile G-Sync, which has been a thing for years, has never needed a hardware module...it uses eDP.
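
To make the mechanism concrete, here's a toy Python sketch of the scheduling idea all of these share - whether branded G-Sync, FreeSync, or eDP power saving, the panel simply holds off its next refresh until a new frame arrives, within the window it advertises. The 48-144 Hz window and the frame-repeat fallback below are illustrative assumptions, not any vendor's actual algorithm:

```python
# Toy model of adaptive-sync scan-out: the panel refreshes when a new frame
# is ready, clamped to the refresh window it advertises. Illustrative only.

MIN_HZ, MAX_HZ = 48, 144        # assumed panel VRR window
MIN_INTERVAL = 1000.0 / MAX_HZ  # ~6.9 ms: can't refresh faster than this
MAX_INTERVAL = 1000.0 / MIN_HZ  # ~20.8 ms: must refresh at least this often

def scanout_intervals(frame_times_ms):
    """Map per-frame render times (ms) to actual panel refresh intervals."""
    out = []
    for t in frame_times_ms:
        if t < MIN_INTERVAL:
            out.append(MIN_INTERVAL)  # frame ready too soon: wait for the panel
        elif t <= MAX_INTERVAL:
            out.append(t)             # the sweet spot: refresh when it's ready
        else:
            # Frame too slow: re-show the old image to keep the panel fed
            # (roughly what low-framerate compensation automates), then
            # scan out the new frame when it finally lands.
            repeats, remainder = divmod(t, MAX_INTERVAL)
            out.extend([MAX_INTERVAL] * int(repeats))
            out.append(max(remainder, MIN_INTERVAL))
    return out

# 200 fps, 100 fps, 60 fps, and a 30 fps stutter:
print(scanout_intervals([5.0, 10.0, 16.7, 33.3]))
```

Within that middle branch nothing is quantised to a fixed clock, which is the whole point: no tearing and no vsync stutter, with no proprietary module required.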
 
Which of the initial 12 were IPS? The Acer XG270HU and Asus MG278Q are both TN panels.
You might have mistaken the Asus MG278Q for the MG279Q, as that model is IPS. I also misread it at first, though that's probably because I have the 279 and didn't know about the 278.
 