What ports do GPUs and motherboards use for monitors and TVs?

The first computers used modified television screens. In the middle of the Cold War there was a fear that the radio-frequency tuners of those sets, combined with a computer, could be exploited for espionage or to access the radio communications in use at the time. This led to computer terminal displays being sold without the ability to tune broadcast signals, which in turn opened the door to proprietary video input and output connectors.

Until the arrival of the PC AT in 1984, which brought the EGA standard, PC monitors were essentially modified NTSC television screens. The limitations of that format at higher definitions, however, made it necessary to adopt new monitor standards with a horizontal frequency above NTSC's 15.7 kHz. From that point the paths of televisions and monitors diverged for some 20 years, until the arrival of LCD monitors and televisions; even so, the fundamental difference remains the same: the monitor lacks a television tuner.

In this article we will go over the evolution of video input and output connectors for monitors, focusing on those found on PCs. We will therefore not cover video outputs such as S-Video, Component, Composite, SCART and many others that can be seen on the rear panel of many televisions.

Deprecated video outputs

The connectors in this section are ones you can still see on many computers, but with the appearance of more advanced technologies they have fallen into disuse. In other words, they have been replaced and will not evolve further, so if you find a monitor or a graphics card with only these connectors, believe us, it is very old and will not deliver acceptable performance today.

VGA port

VGA Pinout

Any PC from the 90s and the first half of the 2000s used VGA, an analog RGB video connector that sends the color component information over three separate channels. The connector also has pins for the monitor's horizontal and vertical synchronization, which at that time was controlled by the graphics card itself. Those synchronization pins give away that it is a port designed for monitors that use a cathode ray tube to generate the image.

Being a port for a PC monitor, it does not carry audio. When VGA first appeared on the PC, sound was handled by the internal speaker or, failing that, by the then-new sound cards, not by the monitors, which, as you may have deduced, had no ability to reproduce sound.

Due to its analog nature, VGA does not get along well with LCD panels, which is what led to the creation of a digital interface in order to achieve good image quality. The reason is that an LCD monitor fed through a VGA port has to pass the signal through an analog-to-digital converter, and that conversion loses information.
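As a very loose illustration of that loss, the following Python sketch quantizes a continuous "analog" level to a fixed number of steps and prints the error introduced; the 8-bit level count and the sample values are invented for the example and do not model any real VGA converter.

```python
# Toy illustration of why re-digitizing an analog signal loses information.
# The sample values and the 8-bit level count are assumptions for the example,
# not a model of a real VGA analog-to-digital converter.

def quantize(value, levels=256):
    """Map a 0.0-1.0 'analog' level to the nearest of `levels` discrete steps."""
    step = 1.0 / (levels - 1)
    return round(value / step) * step

analog_samples = [0.1234, 0.5001, 0.6667, 0.9999]  # pretend analog voltages

for v in analog_samples:
    d = quantize(v)
    print(f"analog={v:.4f}  digitized={d:.4f}  error={abs(v - d):.5f}")
```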

DVI (Digital Visual Interface) port

With the passage of time, LCD panels became cheaper and users began to buy them because they saved space on the desk compared to classic tube monitors. The different way these panels draw the image, however, required changes to the video outputs of PCs, and hence the birth of DVI.

The differences with VGA? To begin with, it assigns two pins to each of the three RGB channels and, despite being a digital port, it keeps the HSync and VSync pins, although they are hardly used. Instead, it relies on a set of data pins over which the monitor sends information about its resolution and refresh rate, so the signal is self-configured and the synchronization pins are not really needed. The first ports to appear were DVI-I, which keep the pins for analog monitors; over time DVI-D became the standard, intended purely for LCD screens since it lacks the pins for CRT screens.
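What the monitor actually sends over those data pins is its identification data (EDID), read through the connector's DDC pins. As a hedged illustration, the Python sketch below decodes the resolution and refresh rate from a single 18-byte detailed timing descriptor; the byte values describe a hypothetical 1920 x 1080 @ 60 Hz mode and the field layout follows the publicly documented EDID structure.

```python
# Decode resolution and refresh rate from one EDID detailed timing descriptor.
# The 18-byte example below describes a hypothetical 1920x1080 @ 60 Hz mode
# (148.5 MHz pixel clock); only the fields needed for the calculation are parsed.

descriptor = bytes([
    0x02, 0x3A,  # pixel clock in 10 kHz units, little endian (14850 -> 148.5 MHz)
    0x80,        # horizontal active, lower 8 bits
    0x18,        # horizontal blanking, lower 8 bits
    0x71,        # upper 4 bits of h. active | upper 4 bits of h. blanking
    0x38,        # vertical active, lower 8 bits
    0x2D,        # vertical blanking, lower 8 bits
    0x40,        # upper 4 bits of v. active | upper 4 bits of v. blanking
]) + bytes(10)   # remaining sync/size fields, not needed here

pixel_clock_hz = int.from_bytes(descriptor[0:2], "little") * 10_000
h_active = descriptor[2] | ((descriptor[4] & 0xF0) << 4)
h_blank  = descriptor[3] | ((descriptor[4] & 0x0F) << 8)
v_active = descriptor[5] | ((descriptor[7] & 0xF0) << 4)
v_blank  = descriptor[6] | ((descriptor[7] & 0x0F) << 8)

refresh_hz = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
print(f"{h_active} x {v_active} @ {refresh_hz:.1f} Hz")  # 1920 x 1080 @ 60.0 Hz
```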

There were two generations of the DVI video connector: on one hand Single Link (SL), with a maximum resolution of 1920 x 1200 pixels at 60 Hz, and on the other Dual Link (DL), with a maximum of 2560 x 1600 pixels, also at 60 Hz. By the way, DVI never supported variable refresh rate, and over time it was quickly replaced by HDMI and DisplayPort.
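Those single-link and dual-link ceilings follow from the maximum pixel clock of the TMDS link. A quick, hedged calculation, assuming the commonly quoted 165 MHz single-link limit and reduced-blanking timings, shows why 1920 x 1200 at 60 Hz is roughly the most a single link can do:

```python
# Rough check of the single-link DVI ceiling, assuming the commonly quoted
# 165 MHz maximum TMDS pixel clock and reduced-blanking totals
# (2080 x 1235 for a 1920 x 1200 @ 60 Hz mode). Illustrative, not a spec quote.

MAX_SINGLE_LINK_CLOCK_HZ = 165e6       # assumed single-link TMDS limit
h_total, v_total, refresh = 2080, 1235, 60

required_clock_hz = h_total * v_total * refresh
print(f"1920x1200@60 needs ~{required_clock_hz/1e6:.0f} MHz "
      f"(single-link limit {MAX_SINGLE_LINK_CLOCK_HZ/1e6:.0f} MHz)")
print("Dual link doubles the available TMDS pairs, "
      "which is what allows 2560 x 1600 @ 60 Hz.")
```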

Video outputs in use today

The video outputs we are going to talk about next are those we can find on graphics cards and PC monitors today. The fact that flat-screen televisions and monitors now use the same panels has completely universalized the video interfaces and, with them, the connectors.

HDMI port

The most used port today, and with almost twenty years of evolution behind it, HDMI has gone from 720p images in version 1.0 to 8K resolutions with version 2.1, roughly 36 times as many pixels as at its origin. Along the way we have seen additions such as support for various aspect ratios, variable refresh rate, HDR, and so on.
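That pixel-count jump is easy to verify with a two-line calculation (resolutions only, ignoring refresh rate and color depth):

```python
# Pixel-count ratio between the 720p of HDMI 1.0 and the 8K of HDMI 2.1.
pixels_720p = 1280 * 720       #    921,600 pixels
pixels_8k   = 7680 * 4320      # 33,177,600 pixels
print(f"8K has {pixels_8k / pixels_720p:.0f}x the pixels of 720p")  # 36x
```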

The HDMI port was born as a variant of DVI, although it differs in its design and shape as well as in the distribution of its pins. The differences with DVI? Since it was created as a video output for audiovisual content, it can carry audio and it also supports HDCP, the content protection system still used today by the audiovisual industry.

The HDMI cable is designed to deliver the signal without losses over distances of up to 5 meters; a much greater distance between output and input ends up degrading the quality of the signal, so it is important to keep the PC as close as possible to the monitor when using this video connector.

DisplayPort or DP

The other video connector most used on PC monitors and graphics cards is DisplayPort, which, like HDMI, has gone through several generations up to today. The big difference is that the two were born from different standardization bodies, which is why DisplayPort is seen more on monitors than on televisions. In fact, the presence of this port is often a good way to tell whether a screen is a television or a monitor.

The big difference between HDMI and DisplayPort, apart from their shape and pin layout, is that HDCP support is not mandatory on a DP interface. This means that video game consoles and video players do not use this type of connector, since that protection is essential for the playback of commercial audiovisual content. That is why DisplayPort has become a port associated with the PC. However, unlike DVI and VGA, and just like HDMI, DisplayPort can also carry audio.

DisplayPort over USB

USB C Alt DP

The newest of the video outputs is a variant of USB-C, which uses its enormous bandwidth to transmit the video signal while also carrying data over the USB 2.0 pins and even powering the monitor. It may become the most used port in the future, since it allows the screen to dispense with its power adapter and become fully portable; of course, this will require a bandwidth exceeding that of HDMI 2.1, something already available with Thunderbolt-based USB4.
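To put the bandwidth question in perspective, here is a rough back-of-the-envelope estimate in Python. The link rates used are the commonly quoted nominal figures and the video figure ignores blanking and link-encoding overhead, so treat it as orientation rather than specification.

```python
# Back-of-the-envelope estimate: raw data rate of uncompressed 8K at 60 Hz with
# 8 bits per RGB channel, ignoring blanking intervals and link-encoding overhead.
# The nominal link rates listed are assumptions (commonly quoted figures).

width, height, refresh, bits_per_pixel = 7680, 4320, 60, 24
video_gbps = width * height * refresh * bits_per_pixel / 1e9
print(f"Uncompressed 8K60 video: ~{video_gbps:.1f} Gbps")

nominal_links_gbps = {"HDMI 2.1": 48, "USB4 / Thunderbolt": 40}
for name, gbps in nominal_links_gbps.items():
    print(f"{name}: {gbps} Gbps nominal link rate")

# In practice both interfaces lose part of that to encoding overhead and, for
# 8K, lean on Display Stream Compression (DSC), so this is only a rough guide.
```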

For the moment it is a minority port; in any case, everything indicates that DisplayPort is closer to extinction than HDMI, since it has every chance of being replaced by USB-C ports capable of transmitting video.
