Why is the refresh rate important in gaming monitors?

The market is full of technologies related to our monitors and televisions, sold under trade names such as VRR, Adaptive Sync, AMD FreeSync and NVIDIA G-SYNC. On the surface they all serve the same purpose, but there are subtle differences between them. So let's take a look at variable refresh rate technologies.

The first video game systems were so rudimentary that they lacked VRAM; they took advantage of the visual persistence of cathode ray tube screens to generate the image in time with the electron beam. However, as the cost of video memory decreased, systems switched to rendering first into a frame buffer and then transmitting that buffer to the video output.

Why do we need variable refresh rates?

This process persists today, and because the image is not generated at the same frequency as the monitor's refresh rate, image artifacts such as tearing appear. Tearing occurs when there is a timing gap between the device that emits the video signal, the graphics card, and the one that displays the image, the screen or monitor.

The solution to the problems caused by this lack of synchronization? Do something the old VGA outputs did: hand control of the timing of each frame, in terms of horizontal and vertical synchronization, over to the device that outputs the video signal, the graphics card. That way the signal is fully synchronized and the derived problems disappear. This measure not only avoids screen tearing, the visual error described above, but also image stuttering, which consists of the last frame being repeated at high speed, and unwanted signal delay, or input lag.
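The mismatch described above can be sketched with a toy model. This is only an illustration of the concept, not how any driver or display actually works; the frame times and the 60 Hz refresh rate are hypothetical values chosen for the example:

```python
# Toy model: compare when a fixed-refresh display samples the frame buffer
# versus a variable-refresh display that waits for the GPU to finish.
# Illustrative only; real scanout and buffer swapping are more complex.

GPU_FRAME_TIMES_MS = [14.0, 19.5, 22.0, 16.5]  # hypothetical render times

def fixed_refresh_tears(frame_times_ms, refresh_hz=60):
    """Count frames whose completion lands mid-scanout on a fixed display."""
    period = 1000.0 / refresh_hz  # ~16.67 ms per refresh at 60 Hz
    t, tears = 0.0, 0
    for ft in frame_times_ms:
        t += ft
        # If the frame is not ready exactly on a refresh boundary, the
        # display shows a mix of the old and new frame: a visible tear.
        if (t % period) > 0.01:
            tears += 1
    return tears

def vrr_tears(frame_times_ms):
    """With VRR the display starts scanout when each frame is done: no tears."""
    return 0

print(fixed_refresh_tears(GPU_FRAME_TIMES_MS))  # most frames tear
print(vrr_tears(GPU_FRAME_TIMES_MS))            # 0
```

The point of the sketch is simply that a fixed-cadence display almost never lines up with variable GPU frame times, while a display that adapts its cadence to the GPU never samples a half-finished frame.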


What technologies exist for variable refresh rate?

However, instead of a single standard, several variable refresh rate technologies have been created to solve the problem. This has ended up increasing confusion among buyers, since it complicates not only the purchase of a monitor but also that of the graphics card we are going to use with our PC. Two types of standards have appeared at the same time: on the one hand, those that depend on the video interface used (VESA Adaptive Sync and HDMI VRR), and on the other, those that depend on a graphics card manufacturer, such as AMD FreeSync Premium and NVIDIA G-SYNC.

Let's review, therefore, each of the ones that currently exist so you can tell them apart.

VESA Adaptive Sync

The first of the variable refresh rate technologies we will cover is the one defined by VESA, the body that ensures compliance with computer monitor specification standards. Because there is still a separation, mostly bureaucratic, from the world of televisions, many television manufacturers do not adhere to the standards of the Video Electronics Standards Association most of the time.

Adaptive Sync was first included in version 1.2a of DisplayPort and has been retained in later versions of the standard. Its use therefore requires this video interface, and since DisplayPort is not found on conventional televisions, many games do not take advantage of it: developers have to make sure the features they use reach as many people as possible. Unfortunately, the new generation consoles lack DisplayPort outputs.


AMD FreeSync

AMD FreeSync technology is a blatant case of rebranding, since it is nothing other than the very Adaptive Sync we discussed in the previous section. Any Radeon graphics card or Ryzen APU can therefore use Adaptive Sync; AMD simply sells it under its own brand.

However, AMD allowed itself the luxury of making extended versions under the names FreeSync Premium and FreeSync Premium Pro, which add support for HDR and for Low Framerate Compensation, a technique based on inserting duplicate "ghost" frames when the frame rate falls below the monitor's minimum refresh rate. Implementing these technologies requires additional components in the monitor's circuitry, so they cannot be used on monitors that support only plain Adaptive Sync.

As a curiosity, there are some monitors and devices compatible with FreeSync that can apply it through their HDMI interface. There are also low-cost monitors that run at a 75 Hz refresh rate and are fully compatible with FreeSync, albeit with a problem: their minimum refresh rate is 48 Hz, so if the GPU outputs frames below that rate, image artifacts may appear.
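The frame-duplication idea behind Low Framerate Compensation can be sketched as follows. This is an assumed model of the logic, not AMD's actual implementation; the 48–75 Hz range represents the hypothetical low-cost panel just mentioned:

```python
# Sketch of the Low Framerate Compensation (LFC) idea: when the game's
# frame rate drops below the panel's minimum VRR rate, show each frame
# several times so the effective refresh rate stays inside the range.
# Illustrative model only, not AMD's actual algorithm.

def lfc_refresh_rate(fps, vrr_min=48, vrr_max=144):
    """Return (multiplier, effective Hz), or None if no multiple fits."""
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1        # repeat each frame one more time
    effective = fps * multiplier
    if effective > vrr_max:
        return None            # no integer multiple fits the panel's range
    return multiplier, effective

print(lfc_refresh_rate(30))          # (2, 60): each frame shown twice
print(lfc_refresh_rate(60))          # (1, 60): already inside the window
print(lfc_refresh_rate(40, 48, 75))  # None: the 48-75 Hz panel's problem
```

The last call shows why the narrow 48–75 Hz panels described above run into artifacts: at 40 fps there is no integer multiple of the frame rate that lands inside the supported range, so the compensation trick has nothing to fall back on.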



NVIDIA G-SYNC

In a rather cynical exercise, and seeing how its G-SYNC proposal could disappear completely, NVIDIA came up with a marketing move very similar to AMD's: giving its graphics cards' DisplayPort Adaptive Sync support the name G-SYNC Compatible, in order to mentally link the VESA protocol with its own technology. So Jensen Huang's company is as guilty as AMD, or more so.

And we say more, because a G-SYNC Compatible monitor does not give you access to all the features of the G-SYNC standard, which requires manufacturers to install a special module supplied by NVIDIA itself. That module makes the final price of the monitor more expensive and, obviously, only works with the brand's graphics cards. This created huge controversy, especially when VESA's Adaptive Sync appeared and it became clear that complicating the monitor's components was not necessary.

So G-SYNC and G-SYNC Compatible are not the same despite serving the same purpose. In any case, this forced NVIDIA to evolve its solution beyond what Adaptive Sync can offer; like its AMD rival, it adds HDR support, up to 1,000 nits in the Ultimate version, and improved input lag.


VRR or Variable Refresh Rate for HDMI

Under the acronym VRR we find the HDMI standard's own proposal: the same concept as Adaptive Sync, but for the HDMI port. This means that video game consoles can take advantage of monitors and televisions that implement version 2.1 of the standard. Keep in mind that if your monitor uses an older version of HDMI, you will not be able to use this variable refresh rate technology.

So it is the same functionality as Adaptive Sync, but designed for the HDMI output. The problem? While the VESA solution is an integral part of the base standard, the updated HDMI 2.1 requirements leave it as a completely optional feature for monitor manufacturers. This means that if the HDMI output controller does not support it, other solutions have to be used to implement it, which can cost graphics card performance, since the GPU has to apply something that the video controller would otherwise handle.

The post Why is the refresh rate important in gaming monitors? appeared first on HardZone.
