Is a new graphics card shortage coming due to AI?

Demand for AI hardware, especially GPUs, has grown exponentially in the past year. This makes us wonder whether we will see another graphics card shortage, as happened with cryptocurrency mining not so long ago. To make matters worse, NVIDIA has presented products aimed at this market at its GTC conference, which makes many wonder whether the nightmare is about to return.

Like every March, NVIDIA holds its own conference, GTC, focused on the world of artificial intelligence and supercomputing. There it presents nothing for the PC, but rather software and hardware products and services related to AI and aimed at different industries. However, the boom in applications such as ChatGPT, Stable Diffusion and many others based on large language models and text-to-image generation has made demand in this area grow. And of course, Jensen Huang's company has to take advantage of that in some way. What matters to us, though, is how it affects us, and whether we are going to experience a graphics card shortage again.

NVIDIA jumps on the ChatGPT bandwagon and brings out its first dual graphics card in years

We recently told you that NVIDIA was going to capitalize on the new AI boom in two obvious ways. On the one hand, by selling graphics cards so that large companies and public institutions can set up their own servers to provide or use services based on deep or machine learning. On the other, by creating cloud servers so that small and medium-sized companies can access these resources.

In the first case, it has presented the NVIDIA H100 NVL, which consists of two graphics cards interconnected with each other via NVLink. They are not based on the same architecture as the RTX 40, but on the one designed for the supercomputing market, the H100. Its particularity is the total amount of VRAM: 188 GB of HBM3, which implies that the system reserves 2 GB per GPU. Its target market? Large language models in the style of ChatGPT.
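The memory figure above can be sanity-checked with a quick calculation. A minimal sketch, assuming each GPU physically carries 96 GB of HBM3 (an assumption inferred from the 2 GB reserved per GPU that the article mentions, not an official specification):

```python
# Sanity check of the H100 NVL memory figures quoted above.
# Assumption: 96 GB of HBM3 physically per GPU, with 2 GB reserved
# by the system on each one, as the article implies.

PHYSICAL_PER_GPU_GB = 96   # assumed physical capacity per GPU
RESERVED_PER_GPU_GB = 2    # reserved by the system, per the article
NUM_GPUS = 2               # the H100 NVL is a dual-GPU card

usable_per_gpu = PHYSICAL_PER_GPU_GB - RESERVED_PER_GPU_GB
total_usable = usable_per_gpu * NUM_GPUS

print(f"Usable VRAM per GPU: {usable_per_gpu} GB")  # 94 GB
print(f"Total usable VRAM:   {total_usable} GB")    # 188 GB
```

Under those assumptions, the numbers line up with the 188 GB total the article quotes.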

The second product is NVIDIA DGX Cloud, a cloud supercomputing service that gives access to NVIDIA servers so that the power of its GPUs can be used for AI-centric applications. These servers will be hosted on Microsoft Azure and Google Cloud. The idea is that any small or medium-sized business can rent such servers.

Will there be a shortage of gaming graphics cards?

As you can see, NVIDIA is not diverting its stock of RTX 40 cards toward the AI market; instead it is using its other chip, the H100. Of course, we have to start from the fact that both the latest GeForce cards for PC and this powerful chip come from the same foundries, using TSMC's N4 node. In theory, therefore, demand for one will end up affecting supply of the other. However, we must take into account a series of important points:

  • The margins on H100 graphics cards are considerably higher than on an RTX 40: one of NVIDIA’s next-gen HPC graphics cards costs almost 10 times what you pay for an RTX 4090.
  • Despite the rise in demand, it will not be as big as it was in the case of mining, so gaming graphics cards are safe.
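To put the first bullet in perspective, here is a rough illustration of the price gap. The RTX 4090 launch MSRP of $1,599 is a known figure; the "almost 10 times" multiple comes from the article itself, not from an official NVIDIA price list:

```python
# Rough illustration of the price gap described above.
# Assumptions: RTX 4090 launch MSRP of $1,599 (public figure);
# the ~10x multiple is the article's claim, not an official price.

RTX_4090_MSRP_USD = 1_599
HPC_PRICE_MULTIPLE = 10  # "almost 10 times", per the article

estimated_h100_price = RTX_4090_MSRP_USD * HPC_PRICE_MULTIPLE
print(f"Estimated HPC card price: ~${estimated_h100_price:,}")
```

With margins like that, every wafer NVIDIA allocates to H100 chips earns far more than the same wafer cut into gaming GPUs, which is why it has little incentive to divert RTX 40 stock.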

However, we must take into account that the H100 chip is quite large, which means that few units come out of each wafer and the defect rate is quite high. In any case, these cards are in enormous demand from the large multinationals that can afford this type of hardware. Just consider that training GPT-3 required some 10,000 graphics cards; we don’t know how many the new version will need, but surely several times more. And they will not be the only ones demanding such hardware. In any case, we will see whether TSMC and NVIDIA have the capacity to support this demand without affecting their older, though now less lucrative, market.
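The link between die size, units per wafer and defect rate can be sketched with a classic back-of-the-envelope yield model. The H100 die area of roughly 814 mm² is a published figure; the defect density and the smaller comparison die are illustrative assumptions, not TSMC data:

```python
# Back-of-the-envelope sketch of why a large die yields fewer good
# chips per wafer. Defect density and the small-die area are
# illustrative assumptions; ~814 mm^2 for the H100 is a public figure.
import math

WAFER_DIAMETER_MM = 300        # standard wafer size
DIE_AREA_H100_MM2 = 814        # H100 die area (published figure)
DIE_AREA_SMALL_MM2 = 295       # assumed mid-range gaming GPU die
DEFECT_DENSITY_PER_CM2 = 0.1   # assumed defects per cm^2

def gross_dies(die_area_mm2, wafer_d_mm=WAFER_DIAMETER_MM):
    """Approximate dies per wafer (ignores scribe lines/edge exclusion)."""
    wafer_area = math.pi * (wafer_d_mm / 2) ** 2
    # Classic approximation: usable area minus an edge-loss term.
    return int(wafer_area / die_area_mm2
               - math.pi * wafer_d_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, d0=DEFECT_DENSITY_PER_CM2):
    """Poisson yield model: exp(-defect_density * die_area)."""
    return math.exp(-d0 * die_area_mm2 / 100)  # d0 is per cm^2

for area in (DIE_AREA_H100_MM2, DIE_AREA_SMALL_MM2):
    good = gross_dies(area) * yield_rate(area)
    print(f"{area} mm^2 die: ~{gross_dies(area)} gross, "
          f"~{good:.0f} good dies per wafer")
```

Under these assumed numbers, the big die gets both fewer candidate dies per wafer and a lower yield on each one, which compounds exactly as the paragraph above describes.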
