In Virtual Reality and Mixed Reality News
October 11, 2022 – Last year Meta introduced its Presence Platform, the company’s suite of machine perception and AI capabilities that empowers developers to build mixed reality (MR) and natural interaction-based experiences. Today, with the launch of Meta Quest Pro at Connect 2022, the company is unlocking full-color mixed reality and introducing a new pillar to its Presence Platform’s core stack: Social Presence.
Social presence with Movement SDK
Meta has introduced Movement SDK, its newest addition to the Presence Platform. Movement SDK comprises Meta’s eye, face, and three-point body tracking capabilities, which enable avatars to mirror users’ movements in real time; facial movements are captured using Meta Quest Pro’s inward-facing sensors. The same underlying technology drives avatars in Meta Horizon Worlds and Meta Horizon Workrooms.
Eye and face tracking capabilities
Movement SDK includes eye and face tracking capabilities opened up by new technology in Meta Quest Pro. Inside the headset, there are five IR sensors directed towards the user’s face: three sensors pointed towards the eyes and upper face and two pointed towards the lower face. Face tracking is driven by a machine learning model that lets Meta Quest Pro detect a wide range of facial movements, according to the company.
In order to have avatars appear expressive but still feel natural, the abstracted facial movement data output through the Movement SDK is represented as linear blend shapes based on the Facial Action Coding System (FACS): a series of zero-to-one values that correspond to a set of generic facial movements (e.g. scrunching of the nose or furrowing of the eyebrows).
Meta stated that these signals will make it easier for developers to preserve the semantic meaning of users’ original movements when mapping signals from the Face Tracking API to a character rig, whether their character is humanoid or something else. Working with an internal team of artists, Meta has created an alien character called Aura, who can wink, move her mouth from side to side, and more. Aura will be available as a sample that developers can download from GitHub starting later this month.
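To make this concrete, the Python sketch below shows how a developer might map such FACS-style zero-to-one values onto the blend shapes of their own character rig. It is a rough illustration only: the function and blend shape names (get_face_expression_weights, aura_snout_scrunch, and so on) are hypothetical stand-ins, not the actual Movement SDK API.

```python
# Minimal sketch: mapping FACS-style face tracking weights onto a character rig.
# All names here are hypothetical illustrations, not the actual Movement SDK API.

# Each frame, face tracking conceptually yields a set of zero-to-one values,
# one per generic facial movement (FACS-style action units).
def get_face_expression_weights() -> dict:
    # Stand-in for a real per-frame query of the headset's face tracker.
    return {"nose_wrinkler": 0.8, "brow_lowerer": 0.3, "jaw_drop": 0.1}

# How a developer might map those generic movements onto the blend shapes of
# their own rig, humanoid or otherwise (e.g. an alien like Aura).
RIG_BLENDSHAPE_MAP = {
    "nose_wrinkler": "aura_snout_scrunch",  # non-human rigs can remap freely
    "brow_lowerer": "aura_brow_furrow",
    "jaw_drop": "aura_mouth_open",
}

def apply_face_weights(rig_blendshapes: dict) -> None:
    for action_unit, weight in get_face_expression_weights().items():
        target = RIG_BLENDSHAPE_MAP.get(action_unit)
        if target is not None:
            # Linear blend shapes: the 0-1 weight drives the shape directly,
            # preserving the semantic meaning of the user's movement.
            rig_blendshapes[target] = weight

rig = {}
apply_face_weights(rig)
print(rig)  # {'aura_snout_scrunch': 0.8, 'aura_brow_furrow': 0.3, 'aura_mouth_open': 0.1}
```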
In addition to social presence, developers can build new interactions and experiences using estimates of where someone is looking in virtual reality (VR) as an input, similar to hand tracking. According to Meta, this means that people will be able to interact with virtual content based on where they’re looking.
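As a rough sketch of what gaze-as-input could look like, the Python below casts a ray from an estimated gaze origin and direction and checks which virtual object the user is looking at. The gaze query and scene here are hypothetical placeholders, not Meta’s eye tracking API.

```python
# Minimal sketch of using an eye-gaze estimate as an input signal, similar to
# hand tracking: cast a ray from the gaze origin and see which object it hits.
# The gaze query and scene representation are hypothetical, not Meta's API.
import math

def gaze_ray():
    # Stand-in for a per-frame query returning an origin and a unit direction
    # estimating where the user is looking in VR.
    return (0.0, 1.6, 0.0), (0.0, 0.0, -1.0)

def pick_gazed_object(objects, radius=0.2):
    """Return the name of the object whose center lies closest to the gaze ray."""
    origin, direction = gaze_ray()
    best, best_dist = None, radius
    for name, center in objects.items():
        to_obj = tuple(c - o for c, o in zip(center, origin))
        t = max(0.0, sum(a * b for a, b in zip(to_obj, direction)))  # project onto ray
        closest = tuple(o + t * d for o, d in zip(origin, direction))
        dist = math.dist(closest, center)
        if dist < best_dist:
            best, best_dist = name, dist
    return best

print(pick_gazed_object({"button": (0.05, 1.6, -2.0), "lamp": (1.0, 1.6, -2.0)}))  # button
```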
From a privacy perspective, Meta stated that apps will never get access to raw image data (or pictures) of users’ eyes and faces from these features; instead, they can only access a set of numbers that estimate where someone is looking in VR and how their face is moving. Images of users’ eyes and faces are inaccessible to developers and to Meta, because they are deleted after processing and never leave the headset.
Body Tracking API
Available now, Movement SDK also includes Meta’s Body Tracking API, which uses three-point tracking based on the position of the controllers or hands relative to the headset. In order to maintain the sense of immersion, Meta stated that it used a large dataset of real human motions to learn and correct the errors that are commonly seen with simple inverse kinematics (IK) approaches.
The Body Tracking API works whether a player is using controllers or just their hands, and in both cases it provides a full simulated upper-body skeleton, including hands. The API also automatically handles cases where a player puts down their controllers and starts using their hands, without requiring any additional handling logic from developers.
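To illustrate why such corrections matter, here is a naive version of the problem in Python: from only a shoulder guess (derived from the headset pose) and a wrist position (from the controller or tracked hand), a simple two-bone IK solve places the elbow. The segment lengths, offsets, and “always bend downward” rule are illustrative assumptions; guesses like these are exactly what Meta says it corrects with a model trained on real human motion.

```python
# Minimal sketch of three-point upper-body tracking: from headset and hand
# poses alone, a simple inverse-kinematics (IK) solve guesses an elbow position.
# All numbers and names are illustrative assumptions, not Meta's actual method.
import math

UPPER_ARM, FOREARM = 0.28, 0.26  # assumed segment lengths in meters

def solve_elbow(shoulder, wrist):
    """Classic two-bone IK: pick an elbow position consistent with bone lengths."""
    dx, dy, dz = (w - s for w, s in zip(wrist, shoulder))
    d = min(math.sqrt(dx*dx + dy*dy + dz*dz), UPPER_ARM + FOREARM - 1e-6)
    # Law of cosines gives how far along the shoulder-wrist line the elbow sits.
    a = (UPPER_ARM**2 - FOREARM**2 + d*d) / (2 * d)
    h = math.sqrt(max(UPPER_ARM**2 - a*a, 0.0))
    ux, uy, uz = dx / d, dy / d, dz / d
    mid = (shoulder[0] + a*ux, shoulder[1] + a*uy, shoulder[2] + a*uz)
    # Naive choice: always bend the elbow downward. This is the kind of guess
    # that often looks wrong in practice and benefits from learned correction.
    return (mid[0], mid[1] - h, mid[2])

# Shoulder inferred from the headset pose via a fixed offset (another naive guess).
headset = (0.0, 1.7, 0.0)
right_shoulder = (headset[0] + 0.18, headset[1] - 0.25, headset[2])
right_wrist = (0.35, 1.1, -0.3)  # from the controller or tracked hand
print(solve_elbow(right_shoulder, right_wrist))
```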
The Body Tracking API works on both Meta Quest Pro and Meta Quest 2, and according to the company it will also work on future Meta Quest VR devices, with new improvements to body tracking in the coming years being made available through the same API.
Mixed reality functionality
The new mixed reality capabilities that the company unveiled today include: Scene Understanding, Full-Color Passthrough, and Shared Spatial Anchors.
Scene understanding
Meta’s scene understanding capabilities, which were released earlier this year, enable developers to create mixed reality experiences that can adapt to the user’s personal space for increased immersion. Digital content can interact with the physical world with support for occlusion, collision, nav meshes, and blob shadows—all of which improve the sense of immersion and realism in mixed reality.
The video below from Sir Aphino shows a basic example of how scene understanding enables interaction between a virtual character and the physical walls, ceiling, and other objects within the player’s room.
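As a rough illustration of how an app might consume scene data, the Python sketch below represents a captured room as labeled boxes and snaps a virtual object onto whatever physical surface it overlaps. The scene query and data layout are hypothetical stand-ins, not the actual Scene API.

```python
# Minimal sketch of consuming scene-understanding data: the app receives the
# user's room as labeled geometry (here, just axis-aligned boxes) and uses it
# for collision so virtual content respects physical surfaces.
# The scene query and data layout are hypothetical, not the actual API.
from dataclasses import dataclass

@dataclass
class SceneBox:
    label: str       # e.g. "floor", "wall", "ceiling", "desk"
    min_corner: tuple
    max_corner: tuple

def query_scene():
    # Stand-in for a real scene query of the user's captured room.
    return [SceneBox("floor", (-2, -0.1, -2), (2, 0.0, 2)),
            SceneBox("desk", (0.5, 0.0, -1.5), (1.5, 0.75, -0.5))]

def resolve_collision(pos, scene):
    """Push a virtual object up so it rests on whatever physical surface it overlaps."""
    x, y, z = pos
    for box in scene:
        (x0, y0, z0), (x1, y1, z1) = box.min_corner, box.max_corner
        if x0 <= x <= x1 and z0 <= z <= z1 and y < y1:
            y = y1  # snap onto the top of the physical surface
    return (x, y, z)

print(resolve_collision((1.0, 0.2, -1.0), query_scene()))  # lands on the desk top
```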
Full-Color Passthrough
Passthrough gives people in VR a real-time representation of the physical world around them, and it lets developers build experiences that blend virtual content with the physical world. The obvious next step for Meta’s Passthrough functionality is full-color Passthrough. While Meta Quest 2 enables mixed reality, albeit via a black-and-white Passthrough mode, Meta Quest Pro was designed for it. Meta Quest 2 apps built with Passthrough will also still work on Meta Quest Pro.
With Meta Quest Pro’s advanced sensors and four times the number of Passthrough pixels compared to Meta Quest 2, developers can now build experiences that let people engage with the virtual world while maintaining presence in their physical space in full color. This is thanks to the company’s investments in advanced stereoscopic vision algorithms that help deliver a higher-quality and more comfortable Passthrough experience, with better depth perception and fewer visual distortions for both close-up and room-scale mixed reality scenarios.
Shared Spatial Anchors
Meta is rolling out an update to its Spatial Anchors later this year, with the introduction of Shared Spatial Anchors. The update will enable developers to build local multiplayer (co-located) experiences by creating a shared world-locked frame of reference for multiple users. For example, developers can build games and apps where two or more people can collaborate or play together while in the same physical space and interact with the same virtual content.
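The idea behind a shared world-locked frame of reference can be sketched in a few lines of Python: each co-located device resolves the same anchor in its own coordinate system, then expresses virtual content relative to that anchor, so everyone sees it at the same physical spot. The example below uses 2D translations only and illustrative names; it is a conceptual sketch, not Meta’s Spatial Anchors API.

```python
# Conceptual sketch of a shared world-locked frame of reference.
# 2D translations only (no rotation); all names are illustrative.

def to_anchor_frame(pos_world, anchor_world):
    # Device A places content and stores it relative to the shared anchor.
    return (pos_world[0] - anchor_world[0], pos_world[1] - anchor_world[1])

def from_anchor_frame(pos_anchor, anchor_world):
    # Device B resolves the same anchor in *its* world frame and reconstructs
    # the content's position there.
    return (pos_anchor[0] + anchor_world[0], pos_anchor[1] + anchor_world[1])

# The two headsets booted in different places, so their world origins differ,
# but both resolved the shared anchor at the same physical point in the room.
anchor_in_device_a = (1.0, 2.0)
anchor_in_device_b = (-0.5, 4.0)

board = (1.5, 2.5)                                    # placed by device A
shared = to_anchor_frame(board, anchor_in_device_a)   # what gets shared
print(from_anchor_frame(shared, anchor_in_device_b))  # (0.0, 4.5) on device B
```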
At Connect, Meta showcased a preview of Magic Room, a mixed reality experience that it is building for Meta Horizon Workrooms and that leverages Shared Spatial Anchors. Magic Room lets any mix of people, some together in a physical room and some remote, collaborate as though they were all in the same room together.
Meta stated that as its hardware continues to advance, its Presence Platform will create more opportunities to bring mixed reality, natural interactions, and social presence into VR, and will enable developers to work towards delivering the company’s vision of the metaverse.
Image / video credit: Meta / YouTube
About the author
Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he has been covering XR industry news for the past seven years.