
Meta patent could lead to better lip-syncing in video games and VR apps

A new patent filed by Meta showcases technology that could lead to more realistic lip-syncing and facial animation for video games and VR applications. NPCs and player avatars with more lifelike facial movements open the door to more engaging and immersive games and experiences, including in VR.


Even though video game graphics have advanced over the decades, allowing for near-photorealistic characters in recent years, realistically representing those characters' speech and facial expressions has remained a challenge for developers. Players' brains are accustomed to seeing and reading other people's faces, so even the best facial animation on a video game character can easily slip into the uncanny valley, making characters seem lifeless or even downright disturbing. Meta's patent could enable the generation of realistic facial expressions and lip-syncing on the fly, saving developers time and increasing player immersion.



The patent details a variety of methods for translating a user's speech into realistic lip-sync and facial animation, giving developers flexibility in how they choose to use the system. One method described in the patent relies on a dataset of audio and video recordings of multiple people reading 50 phonetically balanced sentences, with the system tracking their facial expressions as they read. By tracking how individuals' faces and mouths move through each sentence, the system can then blend those movements to realistically animate a character, down to subtle motions like blinks and raised eyebrows.
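To make the blending idea more concrete, here is a minimal sketch of how facial poses tracked from such a dataset might be combined frame by frame. The blendshape names, data format, and weighted-average rule are illustrative assumptions, not details taken from Meta's patent.

```python
from dataclasses import dataclass


@dataclass
class FacePose:
    """Blendshape weights captured for one frame of tracked video (0.0 to 1.0 each)."""
    weights: dict[str, float]


def blend_poses(poses: list[FacePose], blend_weights: list[float]) -> FacePose:
    """Combine several captured poses into one output pose via a weighted average."""
    total = sum(blend_weights)
    combined: dict[str, float] = {}
    for pose, w in zip(poses, blend_weights):
        for name, value in pose.weights.items():
            combined[name] = combined.get(name, 0.0) + value * (w / total)
    return FacePose(combined)


# Example: mix a captured "ah" mouth shape with a subtle blink from the dataset.
ah_shape = FacePose({"jaw_open": 0.8, "lips_wide": 0.3})
blink = FacePose({"eye_blink_left": 1.0, "eye_blink_right": 1.0})
print(blend_poses([ah_shape, blink], [0.9, 0.1]).weights)
```

In practice the blend weights would come from analyzing the incoming audio, so the character's mouth shapes follow the speech while smaller movements like blinks are layered on top.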

With upcoming VR headsets such as the Meta Quest Pro and Pico Neo 4 Pro including built-in face and eye tracking, one of the other possibilities described in the patent may be the one fans of VR games and social experiences find most interesting. In addition to generating facial animation and lip-syncing from an existing dataset, the patent outlines how the system could do the same using cameras built into a VR headset. This feature could allow VR users to have avatars that accurately mimic their own facial expressions, providing a more realistic and immersive experience in multiplayer games and VR social apps like VRChat.
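As a rough illustration of how headset face tracking could drive an avatar, the sketch below smooths per-frame blendshape weights before they would be applied to an avatar rig. The weight names and frame format are assumptions for illustration; actual headsets expose their own SDKs and blendshape conventions.

```python
from typing import Iterable, Iterator


def smooth_face_frames(frames: Iterable[dict[str, float]],
                       smoothing: float = 0.3) -> Iterator[dict[str, float]]:
    """Exponentially smooth per-frame blendshape weights to avoid avatar jitter."""
    previous: dict[str, float] = {}
    for frame in frames:
        smoothed = {
            name: previous.get(name, value) * smoothing + value * (1 - smoothing)
            for name, value in frame.items()
        }
        previous = smoothed
        yield smoothed


# Example: two frames of raw tracking data, such as might come from headset cameras.
raw_frames = [
    {"jaw_open": 0.9, "brow_up": 0.0, "eye_blink_left": 0.0},
    {"jaw_open": 0.2, "brow_up": 0.4, "eye_blink_left": 1.0},
]
for weights in smooth_face_frames(raw_frames):
    print(weights)  # in a real app these would be applied to the avatar's face rig
```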

From Valve's Source engine, which used an animation system that modeled facial movements for each phoneme (the individual sounds that make up words) of dialogue, to Team Bondi's use of MotionScan technology to capture L.A. Noire's actor performances with a 32-camera high-definition rig, truly realistic in-game faces have always seemed just out of reach. Meta's patent could signal a major technological leap forward in that quest and help usher in a new era of high-quality, realistic facial animation in games and VR experiences.
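For comparison, here is a simplified sketch of that older phoneme-driven style of lip-sync, where each phoneme maps to a mouth shape (viseme). The phoneme labels and viseme names are assumptions for illustration, not Valve's actual data.

```python
# Simplified phoneme-to-viseme lip-sync in the spirit of phoneme-driven systems.
PHONEME_TO_VISEME = {
    "M":  "lips_closed",   # as in "mom"
    "AA": "mouth_open",    # as in "father"
    "IY": "smile_narrow",  # as in "see"
    "F":  "lip_bite",      # as in "fish"
    "OW": "lips_round",    # as in "go"
}


def viseme_timeline(phonemes: list[tuple[str, float]]) -> list[tuple[str, float]]:
    """Turn (phoneme, duration_in_seconds) pairs into (viseme, start_time) keyframes."""
    timeline, t = [], 0.0
    for phoneme, duration in phonemes:
        timeline.append((PHONEME_TO_VISEME.get(phoneme, "neutral"), t))
        t += duration
    return timeline


# "mom" is roughly the phoneme sequence M AA M.
print(viseme_timeline([("M", 0.08), ("AA", 0.15), ("M", 0.08)]))
```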

