
These are the reasons why AMD FreeSync has overtaken NVIDIA G-Sync

When you go to buy a monitor, you will surely notice that the vast majority of them carry this “sticker”. FreeSync was not the first solution of its kind, but it has become the most widely used. We are going to explain why the vast majority of gaming monitors integrate AMD FreeSync, and why NVIDIA G-Sync has become something “residual”.

Both technologies do essentially the same thing: they synchronize the monitor’s refresh rate with the frame rate delivered by the graphics card. It may seem like a minor thing, but it eliminates issues like flickering, screen tearing and the like. In short, both aim to offer the user better image quality.
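The idea above can be sketched in a few lines of code. This is a minimal, hypothetical simulation (not any vendor’s API): with a fixed refresh rate, the panel scans out on its own clock and regularly catches the GPU mid-frame (tearing); with adaptive sync, every refresh waits for a completed frame. The numbers (60 Hz panel, ~45 FPS GPU) are illustrative assumptions only.

```python
def scanouts_with_torn_frames(frame_time_ms, refresh_ms, duration_ms=1000.0):
    """Fixed refresh: count scanouts that start while a frame is still
    being rendered, i.e. scanouts at risk of visible tearing."""
    torn = 0
    t = 0.0
    while t < duration_ms:
        # A scanout tears whenever the panel's refresh boundary does not
        # line up with a frame boundary from the GPU.
        if (t % frame_time_ms) > 1e-9:
            torn += 1
        t += refresh_ms
    return torn

def scanouts_with_adaptive_sync(frame_time_ms, duration_ms=1000.0):
    """Adaptive sync: the panel refreshes only when a frame is ready,
    so no scanout ever starts mid-frame."""
    return 0  # every refresh is aligned to a completed frame

# GPU at ~45 FPS (22.2 ms/frame) driving a 60 Hz panel (16.7 ms/refresh)
fixed = scanouts_with_torn_frames(1000 / 45, 1000 / 60)
adaptive = scanouts_with_adaptive_sync(1000 / 45)
print(f"fixed refresh: {fixed} torn scanouts/s, adaptive sync: {adaptive}")
```

The point of the toy model: tearing is purely a clock-mismatch problem, so letting the display follow the GPU (rather than the other way around) removes it by construction.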

Another big win for AMD

AMD has understood that in many respects it is difficult to compete head-on with NVIDIA. Faced with this challenge, it has opted to develop open tools and technologies that complement its hardware. Since it charges nothing for their use, they reach many more customers.

Although both technologies (AMD FreeSync and NVIDIA G-Sync) pursue the same goal, they differ in many ways. As a summary, here is a general comparison table of the two:

| Characteristic | AMD FreeSync | NVIDIA G-Sync |
| --- | --- | --- |
| Technology | Based on VESA Adaptive Sync | Proprietary, developed and owned by NVIDIA |
| Hardware | No additional hardware required | Requires a dedicated module in the monitor (except G-Sync Compatible) |
| Cost | Does not make the monitor more expensive, as no extra hardware is needed | Makes the monitor more expensive, since the module is manufactured only by NVIDIA |
| Compatibility | Works with both AMD and NVIDIA graphics cards | Works only with NVIDIA graphics cards (GTX 10 Series onward) |
| Frequency range | From 48 Hz up to 240 Hz | From 120 Hz up to 360 Hz |
| Versions | FreeSync, FreeSync Premium, FreeSync Premium Pro, FreeSync 2 HDR | G-Sync Compatible, G-Sync, G-Sync Ultimate |
| Availability | Virtually all monitors (gaming and office alike) already integrate it | Few monitors support it, due to the high additional cost |
| Certification | Simple, usually passed without problems | Difficult to pass, given the number of factors verified, some quite complex |

Looking at the table, one can already sense the reason for AMD’s great triumph with FreeSync. The fact that it is free for the user and compatible with NVIDIA graphics cards are the main factors behind its popularity.

We have to understand that the lowest certification NVIDIA grants is G-Sync Compatible. Monitors that receive this label have had to pass through the company’s laboratories and meet a series of minimum parameters. But be careful: it is not free. Manufacturers have to pay for the tests, and only if they pass them are they granted permission to use this guarantee “seal”.

AMD’s parameters for certifying monitors are not as strict as NVIDIA’s. They focus on fewer criteria, making the process faster and cheaper for both parties. Certainly G-Sync guarantees better monitor quality, but is that extra quality and cost really worth it for the average user? No.


Two philosophies, one goal

NVIDIA started this “race” by launching G-Sync on its own terms and with dedicated hardware. Faced with this, AMD responded with FreeSync, which in the end is nothing more than VESA Adaptive Sync technology, tweaked a bit and released as an open standard.

They are two different models. NVIDIA wants to market a select product aimed at users with medium or high purchasing power who demand quality. AMD, for its part, offers a “good, nice and cheap” product accessible to any user, regardless of budget.

But fundamentally, both pursue the same thing: offering the user a high-quality gaming experience. The aim is for the user to enjoy games without flickering, screen tearing and other such problems. In the end, it all comes down to synchronizing the refresh rate with the FPS that the graphics card delivers at any given moment.
