Apple unveils ARKit 3 with RealityKit and Reality Composer for Augmented Reality app development at WWDC19


In Augmented Reality News

June 3, 2019 – At WWDC19 today, Apple unveiled several technologies that the company states will make it easier and faster for developers to create new apps. Among them is a new version of its ARKit software, ARKit 3, along with RealityKit and Reality Composer, all tools designed to make it easier for developers to create AR experiences for consumer and business apps. Other technology announcements include SwiftUI – a development framework for building user interfaces – as well as new tools and APIs to help simplify the process of bringing iPad apps to Mac. The company also announced updates to Core ML and Create ML to allow for more powerful and streamlined on-device machine learning apps.

“The new app development technologies unveiled today make app development faster, easier and more fun for developers, and represent the future of app creation across all Apple platforms,” said Craig Federighi, Apple’s Senior Vice President of Software Engineering. “SwiftUI truly transforms user interface creation by automating large portions of the process and providing real-time previews of how UI code looks and behaves in-app. We think developers are going to love it.”

Some of ARKit 3’s features include:

  • Motion Capture – developers can now integrate people’s movement into apps;
  • People Occlusion – AR content will show up naturally in front of or behind people to enable more immersive AR experiences and green screen-like applications (see the code sketch after this list);
  • Face tracking and multi-camera support – the front camera can now track up to three faces, and the front and back cameras can be used simultaneously;
  • Collaborative Sessions – multiple users can jump into a shared AR experience.
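People Occlusion is exposed through ARKit's session configuration rather than a separate tool. As a minimal, hedged sketch (not Apple's sample code), assuming an existing ARSession and a device that supports the feature:

```swift
import ARKit

// Minimal sketch: opting in to ARKit 3's People Occlusion on a
// standard world-tracking session (iOS 13, A12-class hardware).
func enablePeopleOcclusion(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // Check hardware support before enabling the frame semantic.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        // Virtual content will now render behind people the camera sees.
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    session.run(configuration)
}
```

The opt-in design means existing ARKit apps keep their current behavior unless they explicitly request the new frame semantics.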

Apple’s RealityKit was built from the ground up for AR. It features photorealistic rendering, as well as environment mapping and support for camera effects like noise and motion blur for virtual content creation. RealityKit also features animation, physics and spatial audio, and developers can harness the capabilities of RealityKit with the new RealityKit Swift API.
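As a rough illustration of what that Swift API looks like in practice, the sketch below anchors a simple virtual cube to a detected horizontal surface. The scene setup is a minimal assumption for illustration, not a complete app:

```swift
import RealityKit
import UIKit

// Minimal sketch: using the RealityKit Swift API to pin a virtual
// cube to a horizontal surface detected in the real world.
let arView = ARView(frame: .zero)

// An anchor entity ties virtual content to a real-world feature.
let anchor = AnchorEntity(plane: .horizontal)

// A model entity carries the mesh and material for the object.
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .systemBlue, isMetallic: false)]
)
anchor.addChild(box)

// Adding the anchor to the scene makes RealityKit render and track it.
arView.scene.addAnchor(anchor)
```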

Reality Composer, a new app for iOS, iPadOS and Mac, allows developers to prototype and produce AR experiences even if they have no prior 3D experience. With a simple drag-and-drop interface and a library of high-quality 3D objects and animations, Reality Composer lets developers place, move and rotate AR objects to assemble an AR experience, which can be directly integrated into an app in Xcode or exported to AR Quick Look.

Apple’s SwiftUI provides a new user interface framework for building sophisticated app UIs. Using simple declarative code, developers can create full-featured user interfaces complete with animations. SwiftUI saves developers time by providing automatic functionality including interface layout, Dark Mode, Accessibility, right-to-left language support and internationalization. SwiftUI apps run natively, and because SwiftUI is the same API built into iOS, iPadOS, macOS, watchOS and tvOS, developers can, according to the company, “more quickly and easily build rich, native apps across all Apple platforms”.
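To make the declarative style concrete, here is a minimal, illustrative SwiftUI view; the view and its contents are hypothetical, not taken from Apple's announcement:

```swift
import SwiftUI

// Minimal sketch: a SwiftUI view declares *what* the UI shows, and the
// framework handles layout, Dark Mode and accessibility behavior.
struct GreetingView: View {
    @State private var name = ""

    var body: some View {
        VStack(spacing: 12) {
            TextField("Enter a name", text: $name)
                .textFieldStyle(RoundedBorderTextFieldStyle())
            Text("Hello, \(name.isEmpty ? "world" : name)!")
                .font(.headline)
        }
        .padding()
    }
}
```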

A new graphical UI design tool built into Xcode 11 also allows UI designers to assemble a user interface with SwiftUI without having to write any code. Swift code is automatically generated, and when this code is modified, the changes to the UI instantly appear in the visual design tool. Developers can now see automatic, real-time previews of how the UI will look and behave as they assemble, test and refine their code, allowing software developers and UI designers to collaborate more closely. Previews can run directly on connected Apple devices, including iPhone, iPad, iPod touch, Apple Watch and Apple TV, allowing developers to see how an app responds to Multi-Touch, or works with the camera and on-board sensors — live, as an interface is being built.
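Those previews are driven by ordinary Swift code in the same file. A minimal sketch of the pattern, reusing the hypothetical GreetingView from the earlier example:

```swift
import SwiftUI

// Minimal sketch: Xcode 11's canvas renders any PreviewProvider in the
// file and re-renders it live as the code changes.
struct GreetingView_Previews: PreviewProvider {
    static var previews: some View {
        Group {
            GreetingView()                          // default appearance
            GreetingView()
                .environment(\.colorScheme, .dark)  // preview Dark Mode
        }
    }
}
```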

Other announcements made by Apple today include:

  • New tools and APIs to bring iPad apps to Mac. With Xcode, developers can open an existing iPad project and simply check a single box to automatically add fundamental Mac and windowing features, and adapt platform-unique elements like touch controls to keyboard and mouse — providing a huge head start on building a native Mac version of their app. Mac and iPad apps share the same project and source code, so any changes made to the code translate to both the iPadOS and macOS versions of the app, saving developers time and resources by allowing one team to work on both versions of an app.
  • Core ML 3 supports the acceleration of more types of advanced, real-time machine learning models. With over 100 model layers now supported in Core ML, apps can use models to deliver experiences that understand vision, natural language and speech. Developers can also update machine learning models on-device using model personalization. With Create ML, a dedicated app for machine learning development, developers can build machine learning models without writing code. Create ML also supports training multiple models with different datasets, as well as new model types such as object detection, activity classification and sound classification.
  • With the introduction of watchOS 6 and the App Store on Apple Watch, developers can now build and design apps for Apple Watch that work completely independently, even without an iPhone.
  • Developers can also take advantage of the Apple Neural Engine on Apple Watch Series 4 using Core ML. A new streaming audio API means users can now also stream from third-party media apps with just their Apple Watch.
  • New anti-fraud features for Apple ID sign-in give developers confidence that new users are real people and not bots or farmed accounts. A new privacy-focused email relay service eliminates the need for users to disclose their personal email address, but still allows them to receive important messages from the app developer.
  • PencilKit – Lets developers add Apple Pencil support to their apps and includes the redesigned tool palette.
  • SiriKit – Adds support for third-party audio apps, including music, podcasts and audiobooks, so developers can now integrate Siri directly into their iOS, iPadOS and watchOS apps, giving users the ability to control their audio with a simple voice command.
  • MapKit – Provides developers with a number of new features such as vector overlays, point-of-interest filtering, camera zoom and pan limits, and support for Dark Mode (see the sketch after this list).
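As one concrete example from the list above, the MapKit additions are small configuration calls on an existing map view. A minimal, hedged sketch, with illustrative coordinates and category choices:

```swift
import MapKit

// Minimal sketch: configuring an MKMapView with the new iOS 13
// MapKit features. Assumes the map view already exists in a UI.
func configure(_ mapView: MKMapView) {
    // Show only selected categories of points of interest.
    mapView.pointOfInterestFilter = MKPointOfInterestFilter(
        including: [.cafe, .museum, .park]
    )

    // Constrain how far the camera can zoom in or out (in meters).
    mapView.cameraZoomRange = MKMapView.CameraZoomRange(
        minCenterCoordinateDistance: 500,
        maxCenterCoordinateDistance: 50_000
    )

    // Keep panning within a fixed region (here, a 10 km box).
    let center = CLLocationCoordinate2D(latitude: 37.33, longitude: -122.01)
    mapView.cameraBoundary = MKMapView.CameraBoundary(
        coordinateRegion: MKCoordinateRegion(
            center: center,
            latitudinalMeters: 10_000,
            longitudinalMeters: 10_000
        )
    )
}
```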

Video credit: Apple

About the author

Sam is the Founder and Managing Editor of Auganix. With a background in research and report writing, he has been covering XR industry news for the past seven years.
