Editor’s Note: This post is part of the AI Decoded series, which demystifies AI by making the technology more accessible and introduces new hardware, software, tools and acceleration for RTX PC users.
Your digital characters will level up.
Non-player characters often play an important role in video game storytelling, but because they’re usually designed with a fixed purpose, they can easily become repetitive and boring, especially in vast worlds with thousands of characters.
Thanks to incredible advances in visual computing like ray tracing and DLSS, video games are more immersive and realistic than ever before, making bland encounters with NPCs particularly jarring.
Earlier this year, NVIDIA released its Avatar Cloud Engine (ACE) production microservices, giving game developers and digital creators an edge in building realistic NPCs. ACE microservices let developers integrate cutting-edge generative AI models into the digital avatars in their games and applications, so NPCs can dynamically interact and converse with players in-game and in real time.
Leading game developers, studios, and startups are already incorporating ACE into their titles, bringing new levels of personality and engagement to their NPCs and digital humans.
Bringing Avatars to Life with NVIDIA ACE
The process of creating an NPC begins with giving it a background and purpose, which drives the narrative and ensures contextually appropriate dialogue. Then, the ACE subcomponents work together to build the avatar’s interactivity and enhance responsiveness.
NPCs utilize up to four AI models to listen, process, generate and respond to dialogue.
The player’s voice is first sent to NVIDIA Riva, a technology that builds fully customizable, real-time conversational AI pipelines from GPU-accelerated multilingual speech and translation microservices, turning scripted chatbots into engaging, expressive assistants.
With ACE, Riva’s automatic speech recognition (ASR) capabilities process what’s said and use AI to provide highly accurate transcriptions in real time. Check out the Riva speech-to-text demo in 12 languages.
The transcript is then sent to an LLM, such as Google’s Gemma, Meta’s Llama 2 or Mistral, to generate a natural-language text response. That response can be translated with Riva’s neural machine translation and converted into spoken audio using Riva’s text-to-speech capabilities.
Finally, NVIDIA Audio2Face (A2F) generates facial expressions that can be synchronized with speech in many languages. This microservice lets digital avatars display dynamic, realistic emotions that can be streamed live or baked in during post-processing.
The AI network automatically animates the face, eyes, mouth, tongue and head to match the range and intensity of the selected emotion, and A2F can also infer emotion directly from an audio clip.
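To make the range-and-intensity idea concrete, here is a hypothetical sketch of how an emotion selection could scale a set of facial animation parameters. The parameter names, values and `blend` helper are invented for illustration and do not reflect the actual A2F interface:

```python
# Hypothetical emotion parameterization for facial animation, loosely modeled
# on the "range and intensity" controls described above. These names and
# values are illustrative only, not the actual Audio2Face API.

BASE_POSE = {"brow_raise": 0.1, "mouth_open": 0.2, "eye_widen": 0.1}

# Per-emotion offsets applied on top of the neutral base pose.
EMOTIONS = {
    "joy":   {"brow_raise": 0.6,  "mouth_open": 0.7, "eye_widen": 0.5},
    "anger": {"brow_raise": -0.4, "mouth_open": 0.3, "eye_widen": 0.8},
}

def blend(emotion: str, intensity: float) -> dict:
    """Blend an emotion's offsets into the base pose, scaled by intensity (0..1)."""
    intensity = max(0.0, min(1.0, intensity))  # clamp out-of-range intensities
    offsets = EMOTIONS[emotion]
    return {k: BASE_POSE[k] + intensity * offsets[k] for k in BASE_POSE}
```

A production system would drive many more parameters per frame, but the same principle applies: the chosen emotion selects a direction, and intensity scales how far the face moves from neutral.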
Each step happens in real time to ensure smooth interactions between player and character, and the tools are customizable, giving developers the flexibility to build the types of characters they need for immersive storytelling and world-building.
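The listen-process-generate-respond loop above can be sketched in a few lines. The stage functions below are hypothetical stand-ins for the actual microservices (Riva ASR, an LLM, Riva TTS and Audio2Face), included only to show how data flows through one NPC turn:

```python
# Minimal sketch of the four-stage ACE-style NPC dialogue loop. Each stage
# function is a hypothetical stand-in; a real deployment would call the
# corresponding microservice endpoint instead.

def transcribe(audio: bytes) -> str:
    """Stand-in for Riva automatic speech recognition (speech -> text)."""
    return audio.decode("utf-8")  # pretend the audio bytes are the transcript

def generate_reply(transcript: str, backstory: str) -> str:
    """Stand-in for the LLM step (e.g. Gemma, Llama 2 or Mistral)."""
    # The NPC's backstory conditions the response so dialogue stays in character.
    return f"[{backstory}] You said: {transcript}"

def synthesize(text: str) -> bytes:
    """Stand-in for Riva text-to-speech (text -> audio)."""
    return text.encode("utf-8")

def animate(audio: bytes) -> dict:
    """Stand-in for Audio2Face: derive facial animation data from speech audio."""
    return {"n_frames": max(1, len(audio) // 4), "emotion": "neutral"}

def npc_turn(player_audio: bytes, backstory: str) -> tuple[bytes, dict]:
    """One listen -> think -> speak -> animate turn of the NPC."""
    transcript = transcribe(player_audio)
    reply_text = generate_reply(transcript, backstory)
    reply_audio = synthesize(reply_text)
    animation = animate(reply_audio)
    return reply_audio, animation
```

In a real pipeline each stage runs as a streaming, GPU-accelerated service so the whole turn completes with low enough latency for live conversation.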
Born to Roll
At GDC and GTC, developers and platform partners showed demos powered by NVIDIA ACE microservices, ranging from interactive in-game NPCs to powerful digital human nurses.
Ubisoft is exploring new types of interactive gameplay with dynamic NPCs. The result of its latest research and development project, NEO NPCs are designed to interact in real time with the player, their environment and other characters, opening up dynamic new storytelling possibilities.
These NEO NPC capabilities were showcased through demos that focused on different aspects of NPC behavior, including environmental and situational awareness, real-time reactions and animations, dialogue memory, collaboration and strategic decision-making. Together, the demos highlighted the technology’s potential to push the boundaries of game design and immersion.
Ubisoft’s narrative team used Inworld AI technology to create two NEO NPCs, Bloom and Iron, each with their own backstory, knowledge base and unique conversation style. Inworld technology also gave the NEO NPCs unique knowledge of their surroundings and interactive responses through Inworld’s LLM. NVIDIA A2F provided real-time facial animation and lip sync for the two NPCs.
Inworld and NVIDIA wowed GDC with a new technology demo called Covert Protocol, showcasing NVIDIA ACE technology and the Inworld Engine. In the demo, players control a private investigator who accomplishes objectives based on the outcomes of conversations with NPCs in the field. Covert Protocol unlocks the mechanics of social simulation games with AI-powered digital characters that convey vital information, pose challenges and drive key narrative developments. This enhanced level of AI-driven interactivity and player agency opens up new possibilities for player-specific gameplay.
Built on Unreal Engine 5, Covert Protocol uses the Inworld Engine, including NVIDIA Riva ASR and A2F, and NVIDIA ACE to power the Inworld voice and animation pipeline.
The latest version of the NVIDIA Kairos tech demo, built in collaboration with Convai and shown at CES, uses Riva ASR and A2F to significantly improve NPC interactivity. Convai’s new framework enables NPCs to talk to each other and recognize objects, allowing them to pick up items and deliver them to desired areas. Additionally, NPCs can now guide players to destinations and traverse the world.
Digital Characters in the Real World
The technology used to create NPCs is also being used to animate avatars and digital humans. Beyond games, task-specific generative AI is also finding its way into areas such as healthcare and customer service.
At GTC, NVIDIA announced an expanded collaboration with Hippocratic AI on its healthcare agent solution, demonstrating the potential of generative AI healthcare agent avatars. Further efforts are underway to develop an ultra-low-latency inference platform to power real-time use cases.
“Our digital assistant provides helpful, timely, and accurate information to patients around the world,” said Munjal Shah, co-founder and CEO of Hippocratic AI. “NVIDIA ACE technology brings our digital assistant to life with cutting-edge visuals and realistic animations, helping us deepen our connections with patients.”
Hippocratic’s initial internal testing of its AI healthcare agent is focused on chronic disease management, health coaching, health risk assessment, social determinants of health surveys, pre-surgery outreach, and post-discharge follow-up.
UneeQ is an autonomous digital human platform focused on AI-powered avatars for customer service and interactive applications. UneeQ has integrated NVIDIA A2F microservices into its platform, combined with Synanim ML synthetic animation technology to create highly realistic avatars for enhanced customer experience and engagement.
“UneeQ combines NVIDIA animation AI with our proprietary Synanim ML synthetic animation technology to enable real-time, emotionally responsive digital human interactions, creating dynamic experiences powered by conversational AI,” said Danny Tomsett, founder and CEO of UneeQ.
AI in Games
ACE is one of many NVIDIA AI technologies that will take your gaming to the next level.
- NVIDIA DLSS is a groundbreaking graphics technology that uses AI to boost frame rates and improve image quality on GeForce RTX GPUs.
- NVIDIA RTX Remix allows modders to easily capture game assets, automatically enhance materials with generative AI tools, and quickly create stunning RTX remasters with full ray tracing and DLSS.
- Accessible through the new NVIDIA app beta, NVIDIA Freestyle allows users to customize the visual aesthetic of over 1,200 games through real-time post-processing filters with features like RTX HDR, RTX Dynamic Vibrance, and more.
- The NVIDIA Broadcast app provides AI-enhanced audio and video tools for livestreaming, including noise and echo removal, virtual background and AI green screen, auto frame, video noise removal, and eye contact, turning any room into a home studio.
Experience the latest and greatest AI-powered experiences on your NVIDIA RTX PCs and workstations, and stay up to date on what’s new and next with AI Decoded.
Subscribe to the AI Decoded newsletter to get weekly updates delivered straight to your inbox.