Intel outdoes AMD and takes on NVIDIA in neural networks for gaming

In recent months we have seen the major graphics card manufacturers place special emphasis on AI super sampling technologies to improve gaming performance. Alongside its first dedicated gaming GPUs, Intel has created its own technology of this kind, called XeSS Upscaling, and has now revealed more details about it.

Intel XeSS does not require individual training per game

One of the most interesting things the Intel engineer revealed during the interview was that XeSS will not require per-game training to build its neural networks: it is a standalone library compatible with multiple titles simultaneously, an approach quite similar to NVIDIA DLSS 2.0, whose libraries can be moved between games without affecting their operation.

In addition, according to Vaidyanathan, the XeSS neural network technology for gaming will be open source. Curiously, he played dumb about NVIDIA DLSS, claiming he has no idea how it works precisely because NVIDIA's technology is not open. In any case, he confirmed that from day one of its launch, the goal is for XeSS to be generic enough to be used in any game, without the "fragility" of other super sampling techniques that need per-game training to work well.

If your GPU supports Shader Model 6.4, Intel XeSS will work

Intel XeSS gaming neural networks

The Intel engineer also explained why they expect XeSS to have a higher adoption rate than NVIDIA DLSS: Intel's technology will be available in two different variants. XMX-accelerated mode will be exclusive to Intel gaming GPUs, while DP4a is a special mode that runs on any GPU supporting Microsoft Shader Model 6.4 and is therefore compatible with NVIDIA Pascal and Turing and AMD RDNA 1 and 2 graphics cards. As the slide shows, DP4a mode has somewhat higher rendering latency than XMX mode, but it is still far faster than rendering the image at native 4K resolution.
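The reason DP4a works on so many GPUs is that it is a simple, widely supported instruction: a dot product of four 8-bit integer pairs accumulated into a 32-bit value, which is exactly the arithmetic low-precision neural-network inference relies on. Here is a minimal reference model of the operation in Python (illustrative only, not Intel's implementation):

```python
def dp4a(a: list, b: list, c: int) -> int:
    """Reference model of the DP4a instruction: the dot product of
    four signed 8-bit integers from a and b, accumulated into the
    32-bit value c. GPUs execute this in a single instruction."""
    assert len(a) == len(b) == 4
    for x in a + b:
        assert -128 <= x <= 127, "DP4a operands are 8-bit integers"
    return c + sum(x * y for x, y in zip(a, b))

# One accumulation step, as it might occur inside an int8
# neural-network inference kernel:
acc = dp4a([1, 2, 3, 4], [5, 6, 7, 8], 0)
print(acc)  # 1*5 + 2*6 + 3*7 + 4*8 = 70
```

Because the operands are only 8 bits wide, a GPU without dedicated matrix hardware (like the XMX units) can still run the network at usable speed, at the cost of the extra latency the slide shows.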

At the moment, AMD has not publicly said whether it intends to support XeSS on its graphics cards, and Intel has not yet provided a list of compatible GPUs, which is understandable since the technology is not yet available. The ironic part is that Intel did not wait for AMD to release its FSR technology before expressing interest in it.

On the other hand, it is also interesting to note (at least from the developer's perspective) that XeSS will have a single API for both versions, so nothing extra needs to be done to support both the XMX and DP4a variants.
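A single API over two backends usually means the library detects the hardware and dispatches internally. A rough sketch of that pattern follows; all names here are hypothetical, since Intel had not published the actual XeSS API at the time of the interview:

```python
# Illustrative sketch of one upscaling entry point with two backends.
# Names are hypothetical -- not Intel's actual XeSS API.

def detect_backend(gpu: dict) -> str:
    """Pick the fastest supported path: XMX on Intel Arc GPUs,
    otherwise DP4a on any GPU exposing Shader Model 6.4+."""
    if gpu.get("has_xmx"):
        return "XMX"
    if gpu.get("shader_model", 0.0) >= 6.4:
        return "DP4a"
    raise RuntimeError("GPU does not support this upscaler")

def upscale(frame: str, gpu: dict) -> str:
    # The game calls the same function regardless of hardware;
    # only the execution path underneath differs.
    backend = detect_backend(gpu)
    return f"{frame} upscaled via {backend}"

print(upscale("frame_0", {"has_xmx": False, "shader_model": 6.4}))
```

For developers this is the whole appeal: one integration covers Intel Arc, NVIDIA Pascal/Turing and AMD RDNA cards at once.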

No support for Tensor Cores or FP16/FP32, at least for now

The Intel engineer also confirmed that, at least initially, Intel's XeSS gaming neural network technology will not use NVIDIA's dedicated Tensor cores. Likewise, many users will want to know that, for now, there are no plans to support FP16 or FP32 operations, unlike AMD's FSR (which relies on them to guarantee broader support on older GPUs).

Another interesting detail revealed is that XeSS will offer multiple quality modes, just like FSR and DLSS. This gives users more flexibility: if the game developer implements it, the graphics settings will let us select the quality level we want, choosing the best balance between performance and image quality.
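Intel had not published XeSS's actual scale factors at the time, but upscalers of this kind typically map each quality mode to an internal render-resolution scale factor. A rough illustration, using the factors comparable upscalers employ (assumed here only for the example):

```python
# Illustrative mapping from quality mode to internal render resolution.
# The scale factors below are those used by comparable upscalers;
# XeSS's actual factors were not published at the time of writing.
SCALE = {
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
}

def render_resolution(output_w: int, output_h: int, mode: str) -> tuple:
    """Return the internal resolution the game renders at before the
    upscaler reconstructs the full output resolution."""
    s = SCALE[mode]
    return round(output_w / s), round(output_h / s)

# A 4K output in performance mode renders internally at 1080p,
# i.e. only a quarter of the pixels are actually shaded.
print(render_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

The lower the internal resolution, the higher the frame rate, and the more work the neural network has to do to reconstruct the missing detail.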

An unstoppable development: XeSS 2.0 and 3.0 are already on their way


During the interview, Karthik also confirmed that versions 2.0 and even 3.0 of the technology are already in development; Intel is clearly taking this seriously and wants the technology to evolve over time. The manufacturer will make the technology open source once it begins to mature. An open source approach to this AI-powered neural network technology could help drive XeSS's popularity and push the market towards a comprehensive solution that is present everywhere, but it could also mark the beginning of further market segmentation (remember that we now have DLSS, FSR and XeSS).

Finally, another interesting detail: Intel has confirmed that XeSS has already been trained with up to 64 samples per pixel, four times more than NVIDIA DLSS. Here is a comparison table between Intel XeSS, NVIDIA DLSS and AMD FSR:

| | AMD FidelityFX Super Resolution (FSR) | Intel Xe Super Sampling (XeSS) | NVIDIA Deep Learning Super Sampling (DLSS) |
|---|---|---|---|
| Scaling method | Spatial scaling | Neural networks | Neural networks |
| Motion vectors | No | Yes | Yes |
| History buffer | No | Yes | Yes |
| AI training | No | 64 samples per pixel | 16 samples per pixel |
| Implementation | Per game (officially) | Per game | Per game |
| Status | Released (1.0) | Not yet released | Released (2.2) |
| Source code | Open (MIT license) | Closed, announced to be open source | Closed |
| GPU support | AMD Navi, Polaris, Vega; NVIDIA 10, 16, 20, 30 series | XMX: Intel Arc; DP4a: NVIDIA Pascal+, Intel Xe-LP, AMD Navi 1X+ | GPUs with Tensor Cores (Volta, Turing, Ampere) |

This is where the information we have ends for now, but rest assured that in the coming weeks and months Intel will reveal many more official details about this promising technology.
