Six myths about graphics cards that we should have overcome by now

The world of technology is full of myths, and graphics cards are among the components most affected by them. I am sure that as soon as you read this, some of those myths came to mind, and you may also have thought of others that you are not quite sure are true or false, since you have never been able to debunk them completely.

I am aware that many of our readers still struggle with this, and that some of these myths have become so entrenched that, for some people, they are “universal” truths when, in fact, they are totally false. To help you overcome these myths and get a more realistic and accurate view of graphics cards, today I want to share with you six myths that we must leave behind.

Before we start, I want to point out that some of the myths we are going to look at made some sense a few years ago, and were even true for certain generations of graphics cards, but that does not turn them into universal truths that still hold today.

In this sense, I want to highlight something very important: both AMD and NVIDIA have had clearly superior generations that made their graphics cards better than those of their rival, but that happened at a specific time and should not be interpreted as something absolute and permanent.

With this in mind, I think we are ready to start. If you are planning to buy a new graphics card and need help choosing, I recommend that you take a look at our buying guide. Now, make yourself comfortable; we have plenty of interesting things to cover.

1.-NVIDIA or AMD graphics cards are better only because of the brand

This myth is deeply tied to the idea of “status” that we attach to one brand or the other, and it is also shaped by the personal preferences of consumers, who turn either of these two companies into a kind of idol, to the point of defending it tooth and nail with unprecedented blindness.

Needless to say, this is a problem. Buying something because of a preconceived idea of status can lead us to make a bad purchase, and that level of absurd brand worship can have far worse consequences. Over the last two decades I have had conversations with fans of both brands, and I have met people who were incapable of making the slightest criticism of their favorite company, a problem that remains very much alive today.

NVIDIA graphics cards are not better simply for being from NVIDIA, and the same goes for AMD’s. Each company has good, less good, and bad generations, though luckily in recent years we have not seen a really bad generation of graphics cards from either of them. Therefore, we should not prejudge or get carried away by this myth: each manufacturer offers differentiated value, and each graphics card has its advantages and disadvantages, regardless of whether it is from AMD or NVIDIA.

2.-Graphics memory is the most important thing in a graphics card

Another of the most popular myths, and one of the most recurrent. Users with less technical knowledge are the most prone to being swayed by the “bigger numbers” effect. Yes, I am referring to the classic “this graphics card has twice the graphics memory”, a very simple phrase that leads people to assume that, for that reason alone, it must perform twice as well as the other card, when in fact it can perform even worse.

It is true that this problem is less common today than in previous generations, since manufacturers are choosing to cut costs and it is no longer usual to see low-power graphics cards with a lot of graphics memory, although there are still situations that can lead us astray. For example, a GeForce RTX 3060 has 12 GB of graphics memory, while a GeForce RTX 3070 Ti has 8 GB. We might think that the former performs better and is the superior card, but in reality the opposite is true.

That a graphics card has more memory does not mean it is better than another with less graphics memory. In the end, what matters is the architecture of the graphics card, its raw power, the generation it belongs to, which determines how advanced it is, and its hardware configuration. For example, the Radeon RX 6600 XT has only 8 GB of graphics memory, while the Radeon VII has 16 GB, but the former is more powerful, more efficient and also has dedicated hardware to accelerate ray tracing.

We could give many other examples, such as the GeForce RTX 2060, which has only 6 GB of graphics memory but is capable of outperforming the 11 GB GeForce GTX 1080 Ti in ray-traced games, thanks to the performance boost provided by its RT cores, and the value of its tensor cores and DLSS should also be taken into account. In the end, graphics memory is just one of many figures to consider when assessing a graphics card, and it is not the most important one.

3.-Graphics memory does not improve performance in games

We could say that this myth is the nemesis of the previous one, since it minimizes the real importance of graphics memory. As we have already said, it is true that graphics memory is not one of the most important aspects when choosing a graphics card, but that does not mean it is unimportant, or that it does not affect performance in games; in fact, the opposite is true.

The first thing we must be clear about is that running a game requires a minimum amount of graphics memory. If we do not meet that minimum, the game may run, but it will give us serious problems, among which we can highlight:

  • Texture errors, popping and graphical glitches.
  • Persistently low performance.
  • Stutters and freezes caused by the constant cycles of emptying and refilling the graphics memory.

Not having enough graphics memory to run a game is one of the most serious problems we can face, since the GPU will not be able to keep everything it needs stored in that memory, and it will have to constantly repeat work it had already done. On top of that, new work cycles will pile up, which will end up overloading the GPU and leaving performance very poor.
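If you want to check this before launching a demanding title, the sketch below simply reads how much VRAM is installed. It is only a minimal illustration: it assumes an NVIDIA card with the nvidia-smi tool available on the PATH, and the 6 GB minimum is a hypothetical placeholder, not a figure taken from any particular game.

```python
import subprocess

# Hypothetical minimum VRAM requirement, in MiB (check the specific game's published specs).
GAME_MIN_VRAM_MIB = 6 * 1024

def total_vram_mib() -> int:
    """Return the total VRAM of the first GPU in MiB, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"],
        text=True,
    )
    # nvidia-smi prints one line per GPU, e.g. "12288"; we only look at the first one.
    return int(out.splitlines()[0].strip())

if __name__ == "__main__":
    vram = total_vram_mib()
    if vram < GAME_MIN_VRAM_MIB:
        print(f"Only {vram} MiB of VRAM: expect texture errors, stutter and constant eviction.")
    else:
        print(f"{vram} MiB of VRAM meets the assumed {GAME_MIN_VRAM_MIB} MiB minimum.")
```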

The amount of graphics memory, and its bandwidth, can significantly affect a game’s performance, and can even prevent us from accessing certain quality settings if we do not reach a certain level. However, it is also true that once we have exceeded the optimal level, having a larger amount of graphics memory will not make any difference. In essence, it is the same thing that happens with RAM.
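For anyone who wants to put a number on the bandwidth side, the theoretical peak is simply the bus width in bytes multiplied by the effective data rate. A quick sketch, using bus widths and memory speeds taken from the public spec sheets of two cards mentioned earlier:

```python
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * data_rate_gbps

# GeForce RTX 3060: 192-bit bus, 15 Gbps GDDR6.
print(memory_bandwidth_gbs(192, 15.0))  # 360.0 GB/s
# Radeon RX 6600 XT: 128-bit bus, 16 Gbps GDDR6 (its Infinity Cache is not modeled here).
print(memory_bandwidth_gbs(128, 16.0))  # 256.0 GB/s
```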

4.-AMD graphics cards consume more power, and run hotter, than NVIDIA graphics cards

What can I say, it is a classic, and the truth is that it did make sense in some generations, but in recent years it has become a myth. If we just look at the consumption values listed in the specifications of each graphics card, we can see that the consumption of the GeForce RTX 30 series is higher than that of the Radeon RX 6000 series, and with the GeForce RTX 20 and the Radeon RX 5000 things were also very tight.

For example, the GeForce RTX 3080, which is the direct rival of the Radeon RX 6800 XT, has a TGP of 320 watts, while the latter has a TBP of 300 watts. AMD has managed to fine-tune power consumption considerably with its latest graphics cards, and it shows. However, if we bring performance into the equation and dig deeper by looking at things like ray tracing and image reconstruction and upscaling, things look better for NVIDIA, since it uses a more advanced architecture.
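To put those figures into perspective, here is a small sketch that computes the relative gap in rated board power from the TGP and TBP quoted above, plus a crude frames-per-watt helper. The FPS values at the end are hypothetical placeholders, purely for illustration, since real averages depend on the game, resolution and settings.

```python
# Rated board power figures cited above (in watts).
RTX_3080_TGP_W = 320
RX_6800_XT_TBP_W = 300

# Relative gap in rated power: (320 - 300) / 300, roughly 6.7 %.
gap = (RTX_3080_TGP_W - RX_6800_XT_TBP_W) / RX_6800_XT_TBP_W
print(f"Rated power gap: {gap:.1%}")

def fps_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Crude efficiency metric: average FPS divided by rated board power."""
    return avg_fps / board_power_w

# Hypothetical averages, purely for illustration; both work out to about 0.31 FPS per watt.
print(fps_per_watt(100.0, RTX_3080_TGP_W))
print(fps_per_watt(94.0, RX_6800_XT_TBP_W))
```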

Continuing with the previous example, the GeForce RTX 3080 performs much better than the Radeon RX 6800 XT in ray tracing, and supports second-generation DLSS, a technology clearly superior to AMD’s FSR 1.0 and FSR 2.0. These two factors are more than enough to justify the difference in consumption between the two, but the important thing is that there is no abysmal difference in consumption between the graphics cards of these two companies, and that this myth no longer makes any sense.

AMD graphics cards also do not run hotter just because they are from AMD; that is another misconception that stopped making sense a long time ago. It is true that the Sunnyvale firm has sometimes launched generations that reached very high temperatures, such as the Radeon R9 290, but NVIDIA has done the same with models such as the GeForce GTX 480, and today it is absurd to generalize and keep this myth alive.

5.-AMD graphics card drivers are a disaster: much worse than NVIDIA’s

To be honest, we must admit that AMD has made some significant mistakes with its graphics card drivers in recent years; in fact, we can recall the case in 2020 when faulty drivers caused many users of the Radeon RX 5700 and RX 5700 XT to run into serious bugs and black screens.

It is true that AMD still has room for improvement in certain areas, but its drivers are not a disaster, and they are not much worse than NVIDIA’s. In fact, something very curious happens: AMD has taken such care to optimize and improve its drivers over time that they have earned a reputation for “fine wine”, that is to say, some of its graphics cards have aged like fine wine thanks to the improvements introduced at the driver level.

AMD’s drivers have also evolved in terms of interface and advanced features, and the move to Radeon Software Adrenalin was a very important leap for the Sunnyvale company, giving it the foundation it needed to be in a much more competitive position against NVIDIA. On that basis, it has introduced numerous improvements and has polished the design and interface with great success.

Game-level support has also improved a lot over the years, and today we can find many titles that are deeply optimized to run at their best on AMD Radeon graphics cards, such as AC: Valhalla. I think that with all this on the table, it is quite clear that this is a myth we should forget.

6.-Only top-of-the-range graphics cards can play games in 4K

Many users still believe that playing in 4K is limited to the most powerful graphics cards, and that this resolution is something “new” that has only recently come into use, so only the most recent models are prepared to handle it. Nothing could be further from reality. In fact, the GeForce GTX 980 Ti was one of the first graphics cards that was genuinely capable of running games in 4K in a more than acceptable way.

I had that graphics card, and to give you an idea, it was capable of running games like Battlefield 4 and Far Cry 4 in 4K at full quality while maintaining a stable 30+ FPS. Other less demanding titles like Tomb Raider 2013 and CoD Advanced Warfare ran at 55 and 80 FPS in 4K, also at maximum quality.

Playing in 4K is not a goal we set ourselves two days ago; it has been possible for years, although newer games have pushed requirements up, so older graphics cards are no longer capable of offering good performance at that resolution in current titles. There is no arguing with that, but today a GeForce RTX 2070 or a Radeon RX 5700 is perfectly capable of running titles in 4K, and neither is a top-of-the-range model.

It is true that a more powerful graphics card will give us greater fluidity, but this does not mean that we cannot enjoy a solid 4K experience with a mid-high range, or even a mid-range, graphics card. For example, the GeForce RTX 3060 is capable of running Battlefield V in 4K while maintaining 60 FPS, and it achieves 82 FPS in DOOM Eternal and 55 FPS in Death Stranding, without having to resort to DLSS.
