Today the vast majority of graphics cards are internal, even though most systems sold for gaming are laptops. Yet you will hardly find a single external graphics card on the market; it is as if they had disappeared. Why did this happen? We explain it to you.
The biggest problem with buying a gaming laptop is precisely its graphics capability: everyone knows that, due to power-consumption and temperature limits, the same monster GPUs found in tower computers cannot fit inside one. That would lead us to think there should be demand for external graphics cards; however, this is not the case, and they remain little more than an anecdote.
Why are external graphics cards nowhere to be seen?
The first problem with an external graphics card is its high power consumption, which means it needs its own integrated power supply, and not exactly a small one, since some models draw several hundred watts. That same consumption also makes it unfeasible to turn desktop cards into portable devices you can take anywhere. If we wanted to make them fully portable, we would have to add a battery as well, raising the total cost of the whole setup only to obtain lower performance.
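To see why a battery-powered external GPU makes little sense, here is a back-of-the-envelope sketch. The 300 W draw, one-hour session, and 90% conversion efficiency are assumed round numbers for illustration, not specs of any real product; the 99 Wh figure is the typical ceiling for laptop batteries, since airlines generally restrict carry-on batteries above 100 Wh.

```python
# Illustrative only: how large a battery a truly portable eGPU would need.
# GPU_DRAW_W, SESSION_HOURS and EFFICIENCY are assumed round numbers.
GPU_DRAW_W = 300        # assumed sustained draw of a desktop-class GPU
SESSION_HOURS = 1.0     # assumed length of one gaming session
EFFICIENCY = 0.9        # assumed conversion efficiency of the supply

needed_wh = GPU_DRAW_W * SESSION_HOURS / EFFICIENCY
print(f"Battery needed: {needed_wh:.0f} Wh")            # ~333 Wh
# Large gaming-laptop batteries top out around 99 Wh (airline limit):
print(f"Times a 99 Wh laptop battery: {needed_wh / 99:.1f}x")
```

In other words, one hour of desktop-class gaming would need several laptop batteries' worth of capacity, which is why the numbers simply never work out.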
Now, we may have a laptop and want to give it extra graphics power for certain applications. A few years ago, the graphics cards in these computers were not soldered to the motherboard; they used a module format called MXM. These were still laptop graphics cards, and you only had to swap one out for a more powerful model when needed. The problem? They were so expensive and rare that buying a tower with the equivalent GPU inside was better value.
In the end, the best option, not only for gaming but also for professional 3D rendering, is to buy a desktop computer, which does not have a laptop's limitations. However, the reason external graphics cards have never caught on with users is above all technical.
The problem is the interface
Our graphics cards today connect through PCI Express, an interface that runs many high-speed serial lanes side by side and can transmit tens of gigabytes per second. However, it achieves this using a large number of pins over a very short distance. Precisely one of the reasons peripherals use serial interfaces is that a wide parallel connector would be unwieldy: those of you who are older will remember the size of the classic printer connector, colloquially called LPT1, and just how wide it was.
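To put numbers on those "tens of gigabytes per second", here is a quick calculation from the published PCI Express figures: 8 GT/s per lane for PCIe 3.0 and 16 GT/s for PCIe 4.0, both with 128b/130b encoding. This is just arithmetic on the spec sheet, not a benchmark.

```python
# Usable one-direction bandwidth: transfer rate per lane (GT/s)
# times encoding efficiency, times lane count, converted to bytes.
def pcie_bandwidth_gbps(gt_per_s, encoding_efficiency, lanes):
    """Bandwidth in gigabytes per second for one direction."""
    return gt_per_s * encoding_efficiency * lanes / 8  # bits -> bytes

# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding
print(f"PCIe 3.0 x16: {pcie_bandwidth_gbps(8, 128/130, 16):.1f} GB/s")
# PCIe 4.0: 16 GT/s per lane, same encoding
print(f"PCIe 4.0 x16: {pcie_bandwidth_gbps(16, 128/130, 16):.1f} GB/s")
# External links such as Thunderbolt tunnel roughly an x4 connection
print(f"PCIe 3.0 x4:  {pcie_bandwidth_gbps(8, 128/130, 4):.1f} GB/s")
```

The gap between a full x16 slot (~31.5 GB/s on PCIe 4.0) and the roughly x4 link an external cable can carry is a big part of why an eGPU never performs like the same card in a tower.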
External PCI Express interfaces actually do exist, but they are based on optical links rather than copper, to keep power consumption from skyrocketing as cable length, and with it the copper's resistance, increases. Not for nothing is a graphics card's connection so short: if it were longer, more energy would be needed to transmit the data, and power draw would shoot up.
However, you only need to look at the size of the connector: in this case it is an x8 link, and as you can see it occupies a good part of the card's rear width. Although it could be placed on the side, that would require building a PCB or dedicated board for this type of interface, and it would only solve part of the problem, namely data transfer, while still leaving unsolved the power supply an external graphics card needs.
Keep things simple
The reason no one has bet on external graphics cards is that the current model, based on plugging the card directly into the PC's motherboard, allows costs to be cut: among other things, the card needs neither its own power supply nor its own enclosure. In other words, it keeps things much simpler, with no need to juggle two power supplies, two cases, and so on.
What’s more, the 6-, 8- and 16-pin auxiliary connectors used to power graphics cards became necessary as soon as it became clear, back in the AGP era, that the slot alone would fall short in terms of power delivery. As a historical curiosity, a few models even shipped with an integrated power supply of their own.
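Those connectors add up quickly. A short sketch using the published maxima (75 W from the PCIe slot, 75 W per 6-pin, 150 W per 8-pin, and up to 600 W for the 16-pin 12VHPWR connector) shows the kind of power budget a modern card works with:

```python
# Upper bound on board power: slot budget plus auxiliary connectors.
# Figures are the published per-connector maxima, not measured draw.
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150, "16-pin": 600}

def max_board_power(*connectors):
    """Maximum a card can draw from the slot plus its aux connectors."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in connectors)

print(max_board_power("8-pin", "8-pin"))  # dual 8-pin card -> 375 W
print(max_board_power("16-pin"))          # 12VHPWR card    -> 675 W
```

Delivering hundreds of watts like this over an external cable is exactly the problem an eGPU enclosure has to solve with its own power supply.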