Snapchat launches Lens Studio 3.2, allowing developers to build LiDAR-powered AR Lenses for the iPhone 12 Pro

October 14, 2020 – Snapchat has today announced the launch of Lens Studio 3.2, which now lets augmented reality (AR) creators and developers build LiDAR-powered Lenses for the new iPhone 12 Pro.

The LiDAR Scanner on iPhone 12 Pro and iPhone 12 Pro Max enables immersive AR experiences that overlay more seamlessly onto the real world. It allows Snapchat’s camera to see a metric-scale mesh of a scene, understanding the geometry and meaning of surfaces and objects. This new level of scene understanding lets Lenses in the Snapchat app interact realistically with the surrounding world.

Plus, through the power of Apple’s A14 Bionic chipset and its ARKit platform for AR experiences, devices are able to render thousands of AR objects in real time, letting developers create deeply immersive environments for Snapchat users to explore.

“The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality,” said Eitan Pilipski, Snap’s SVP of Camera Platform. “We’re excited to collaborate with Apple to bring this sophisticated technology to our Lens Creator community.”  

The Lens Studio 3.2 update adds new search bars, making it easier for developers to find specific templates or Material Editor Nodes. Snapchat has also extended its Face Mesh and Hand Tracking capabilities to allow for hand segmentation and for tracking the entire head rather than just the face. Features include:

  • Head Mesh: Adds a new ‘skull’ property to the Face Mesh asset, which allows the tracking of a user’s whole head shape;
  • Hand Segmentation: Segments an image against a user’s hands and can occlude objects behind them. Hand segmentation applies its effect in 2D screen space, allowing hand gestures to control effects;
  • Behavior Script: Sets up different effects and interactions through a dropdown menu. Developers can also use the ‘Behavior Helper Script’ to choose triggers such as face or touch interactions, and respond to them with effects such as enabling objects (a minimal scripted equivalent is sketched after this list).
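The Behavior script exposes these trigger/response pairs through dropdowns, so no code is required. For creators who prefer to script a trigger directly, a minimal sketch using Lens Studio’s JavaScript API might look like the following; the ‘target’ input name is illustrative, not part of the Behavior script itself:

    // Minimal Lens Studio JavaScript sketch of a Behavior-style pair:
    // when the user taps the screen, enable a target scene object.
    // The "target" input name below is hypothetical.
    // @input SceneObject target

    // Bind a callback to the tap event on this script's scene object.
    var tapEvent = script.createEvent("TapEvent");
    tapEvent.bind(function (eventData) {
        if (script.target) {
            // Behavior-style response: enable the chosen object.
            script.target.enabled = true;
        }
    });

Attached to a scene object, this reproduces a ‘touch’ trigger with an ‘enable object’ response; the Behavior script’s dropdowns offer similar trigger/response wiring without hand-written code.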

Developers can use templates provided by Snapchat, which make it easier for beginners and experts alike to get acquainted with new Lens Studio features. Plus, through a new interactive preview mode in Lens Studio 3.2, developers can create Lenses and preview how they will behave in the world, even without access to an iPhone 12 Pro. Developers can also open Snapchat on Apple’s latest iPad Pro, which features the same LiDAR Scanner, to bring LiDAR-powered Lenses to life.

For more information on Snapchat’s Lens Studio 3.2 and to download the template to create LiDAR-powered Lenses, click here.

Video credit: Snapchat Lens Studio / YouTube

About the author

Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he has been covering XR industry news for the past seven years.