Video game NPCs will soon stop being dumb thanks to AI

It is the latest trend in the technology sector and it is picking up speed everywhere. Artificial intelligence is all the rage today, whether in tools that talk to us, generate images, or do many other things. NVIDIA has just presented Avatar Cloud Engine (ACE), a solution that brings artificial intelligence to the non-playable characters (NPCs) of video games.

One of the biggest backers of this emerging technology is NVIDIA. For all the talk about artificial intelligence, it has really only been mainstream for a short time. Its explosion has been enormous, and developers everywhere are rushing to integrate it and ship new features.

It now seems this technology will soon reach video games, built into those characters you come across in every game who contribute little or nothing.

“Smart” NPCs thanks to NVIDIA

Until now, these characters you cannot control have had one pre-recorded line and nothing more. You can talk to them as many times as you like; they deliver the same bland, flat message and that is it. That could soon change and open up a whole new world of possibilities.

NVIDIA ACE is the technology that gives these NPCs their intelligence. This tool, designed for software and game developers, makes it possible to create and integrate AI-powered voice models. On top of that, we will be able to hold conversations with characters who, until now, barely reacted to anything around them.

This solution has been made possible through a partnership with Convai, a company that develops conversational AI for games.

The solution is built on three components (a rough sketch of how they might fit together follows the list):

  • NVIDIA NeMo: lets developers build, customize, and deploy language models using proprietary data. The models can be personalized with a character's backstory and protected against negative or unsafe conversations using NeMo Guardrails.
  • NVIDIA Riva: automatic speech recognition and text-to-speech, used to enable live voice conversations.
  • NVIDIA Omniverse Audio2Face: automatically creates expressive facial animation for a character to match the voice track. Omniverse connectors link Audio2Face with Unreal Engine 5, letting developers add facial animation to MetaHuman characters.
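
Put together, these three pieces form a pipeline: the player's voice is transcribed, a language model constrained by the character's backstory produces an in-character reply, that reply is turned back into speech, and the voice track drives the facial animation. Below is a minimal Python sketch of that loop; every function in it is a hypothetical placeholder standing in for the components named above (Riva for speech, NeMo and Guardrails for the language model, Audio2Face for animation), not the real NVIDIA APIs.

```python
# Conceptual sketch of an ACE-style NPC dialogue loop.
# All functions are hypothetical placeholders, not NVIDIA's actual SDK.

from dataclasses import dataclass, field

def transcribe(audio: bytes) -> str:
    """Placeholder for speech recognition (Riva's role)."""
    return audio.decode("utf-8", errors="ignore")

def generate_reply(prompt: str) -> str:
    """Placeholder for a guarded, backstory-aware language model (NeMo's role)."""
    return "Stranger, the old mine has been sealed since the accident."

def synthesize_speech(text: str) -> bytes:
    """Placeholder for text-to-speech (Riva's role)."""
    return text.encode("utf-8")

def animate_face(character: str, voice_track: bytes) -> None:
    """Placeholder for voice-driven facial animation (Audio2Face's role)."""
    print(f"[{character}] lip-sync generated for {len(voice_track)} bytes of audio")

@dataclass
class NPC:
    name: str
    backstory: str                              # personalizes the language model
    history: list = field(default_factory=list)

def npc_respond(npc: NPC, player_audio: bytes) -> bytes:
    player_text = transcribe(player_audio)       # 1. player voice -> text
    prompt = (
        f"You are {npc.name}. Backstory: {npc.backstory}\n"
        f"Stay in character and refuse unsafe or off-topic requests.\n"
        f"Player: {player_text}"
    )
    reply_text = generate_reply(prompt)          # 2. text -> in-character reply
    npc.history.append((player_text, reply_text))
    reply_audio = synthesize_speech(reply_text)  # 3. reply -> voice track
    animate_face(npc.name, reply_audio)          # 4. voice track -> facial animation
    return reply_audio

miner = NPC(name="Old Miner", backstory="Worked the mine for forty years before it closed.")
npc_respond(miner, b"What happened to the mine?")
```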

The company has shown a demo of an AI-generated conversation with an NPC, in which the character uses generative AI to answer questions based on its backstory.

This tool is yet another NVIDIA innovation aimed at the video game industry. Now all that remains is for it to start being integrated into actual games.

Which games will integrate it?

The first title to officially adopt it appears to be S.T.A.L.K.E.R. 2: Heart of Chornobyl, which would be the first major game to use Audio2Face. In addition, Fort Solis, a game from the independent studio Fallen Leaf, will also use the technology.

It is not entirely clear whether this technology runs entirely in the cloud or locally on the player's machine. Note that NVIDIA graphics cards include Tensor Cores, cores dedicated to AI workloads. We imagine that, if it runs locally, its performance impact will not be as heavy as that of ray tracing.

We will see whether AMD comes up with an alternative to this technology. It should be noted that its cards do not have dedicated AI cores (at least for now).
