As part of its presentation at Computex 2023, NVIDIA introduced the Avatar Cloud Engine (ACE) for Games, a service designed to make non-player characters (NPCs) in games more intelligent. The service will let developers create their own AI models that generate more natural behavior for NPCs, including dialogue, speech and animation.
Image source: NVIDIA
According to NVIDIA, developers of middleware, development tools and games can use ACE for Games to build and deploy their own customized AI models for speech, conversation and animation in their software and games.
“Generative AI has the potential to revolutionize the interaction between players and game characters and dramatically increase immersion in games,” says John Spitzer, NVIDIA’s vice president of developer technologies. “Building on our AI expertise and decades of experience working with game developers, NVIDIA is at the forefront of generative AI in games.”
Built on NVIDIA Omniverse, ACE for Games provides optimized AI foundation models for speech, conversation and character animation, including:
- NVIDIA NeMo – for building, customizing and deploying language models using developers’ own data. Large language models can be tailored to characters’ lore and backstories, and are protected from “counterproductive or unsafe” conversations with NeMo Guardrails.
- NVIDIA Riva – for automatic speech recognition and text-to-speech, enabling natural, real-time spoken conversation in-game.
- NVIDIA Omniverse Audio2Face – for instantly creating expressive facial animation for a game character to match any speech track. Audio2Face includes Omniverse connectors for Unreal Engine 5, allowing developers to add facial animation directly to MetaHuman characters.
According to NVIDIA, developers can integrate the entire NVIDIA ACE for Games solution or just use the components they need.
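To give a rough idea of how these pieces could fit together, the sketch below wires a hypothetical NPC dialogue loop out of the three components. Every function and field name here is a placeholder invented for illustration (none of them are NVIDIA APIs), and the stubs only return dummy values.

```python
"""Conceptual sketch of an ACE-style NPC dialogue loop.

Illustration only: the stub functions stand in for Riva ASR/TTS, a NeMo
language model, and Audio2Face. The names are hypothetical, not NVIDIA APIs.
"""

def riva_transcribe(mic_audio: bytes) -> str:
    # Placeholder for Riva automatic speech recognition (speech -> text).
    return "What do you recommend for lunch?"

def npc_llm_reply(player_text: str, backstory: str) -> str:
    # Placeholder for a NeMo language model customized with the character's
    # backstory and kept on safe topics by NeMo Guardrails.
    return "Try the spicy ramen. It's the house specialty."

def riva_synthesize(text: str, voice: str) -> bytes:
    # Placeholder for Riva text-to-speech (text -> speech audio).
    return text.encode()

def audio2face_animate(face_rig: str, audio: bytes) -> None:
    # Placeholder for feeding audio to Audio2Face, which would drive facial
    # animation on, e.g., a MetaHuman rig in Unreal Engine 5.
    print(f"Animating {face_rig} with {len(audio)} bytes of speech audio")

def handle_player_utterance(mic_audio: bytes, character: dict) -> None:
    player_text = riva_transcribe(mic_audio)                     # 1. speech -> text
    reply = npc_llm_reply(player_text, character["backstory"])   # 2. text -> in-character reply
    reply_audio = riva_synthesize(reply, character["voice"])     # 3. reply -> speech audio
    audio2face_animate(character["face_rig"], reply_audio)       # 4. audio -> facial animation

jin = {"backstory": "Runs a ramen shop in a futuristic city.",
       "voice": "male-1", "face_rig": "Jin_MetaHuman"}
handle_player_utterance(b"\x00\x01", jin)
```

Because each step is a separate service call, a studio could, for example, use only Audio2Face with its existing dialogue system, or only the speech components without the animation layer.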
NVIDIA has teamed up with startup Convai to demonstrate how developers will soon be able to use NVIDIA ACE for Games to create NPCs. Focused on developing advanced conversational AI for virtual game worlds, Convai has integrated ACE modules into its end-to-end real-time avatar platform. In the demo, called Kairos, players interact with Jin, the owner of a ramen shop. Despite being an NPC, Jin responds to natural-language queries realistically and consistently with his backstory, all powered by generative AI.
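As a rough illustration of how a backstory-bound character like Jin can be kept on topic, NVIDIA's open-source NeMo Guardrails library exposes a small Python API. The config directory, persona and underlying model implied below are assumptions made for this sketch, not details of Convai's or NVIDIA's actual demo.

```python
from nemoguardrails import LLMRails, RailsConfig

# Assumes a guardrails config directory (hypothetical path) that defines the
# underlying language model, Jin's persona and backstory, and rails steering
# the conversation away from off-topic or unsafe requests.
config = RailsConfig.from_path("./configs/jin_ramen_shop")
rails = LLMRails(config)

# The player speaks in natural language; the guarded model answers in character.
response = rails.generate(messages=[
    {"role": "user", "content": "Jin, what's the story behind this place?"}
])
print(response["content"])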
The neural networks behind NVIDIA ACE for Games are optimized for a range of capabilities, with various size, performance and quality trade-offs. ACE for Games helps developers fine-tune models for their games and deploy them via NVIDIA DGX Cloud or directly on GeForce RTX PCs for real-time inference.
Finally, NVIDIA noted that game developers and startups are already using its generative AI technologies in their workflows:
- GSC Game World is adopting Audio2Face for its upcoming game S.T.A.L.K.E.R. 2: Heart of Chornobyl.
- Fallen Leaf, an indie game developer, uses Audio2Face to animate characters in Fort Solis, a third-person sci-fi thriller set on Mars.
- Charisma.ai, an artificial intelligence company that creates virtual characters, uses Audio2Face for animation in its conversation engine.