Art Sh

Apple is getting ready for Mixed Reality development through iOS 16 APIs

If you’ve been following recent Apple announcements, you’ve probably noticed that Apple is quietly laying the foundation for future mixed reality development, without saying a word during the WWDC keynotes.

(Image source: Bloomberg)

Before Apple’s WWDC 2022 keynote kicked off, the media and tech world were speculating whether Apple would mention or even announce a mixed reality headset.

At the very least, a toolkit similar to the DTK (Developer Transition Kit) for M1 development was expected, given the many leaks and speculation covered in the media. But WWDC wrapped up without even a mention of Apple’s revolutionary project.
What’s more interesting? The SceneKit and RealityKit frameworks barely got any updates this year. Instead, we got M2-powered Macs, Stage Manager in iPadOS, and a revamped iOS.

The release date for Apple’s reality headset was initially planned for 2020, then moved to 2023, and now it could slip even further, to 2024.

Apple is the kind of company that invests significantly in its products and ecosystem before revealing any product details, and it makes sense: deep integration into its ecosystem and winning a share of AR developers early will enable them to build for the metaverse.

Despite no news of realityOS, the iPhone maker has been making significant enhancements to its APIs and frameworks to prepare developers for mixed reality.

Let’s cover some APIs announced during WWDC 2022. Some of these are well known and received a lot of limelight during WWDC 22. However, from an AR/VR development perspective, the role of these APIs wasn’t that obvious to the public.

Live Text API and PDFKit for Scanning Text from Video

Apple introduced the Live Text feature to extract text from images in iOS 15. They took the feature to the next level in iOS 16 by releasing a Live Text API to easily grab text from images and video frames. Released as part of the VisionKit framework, the DataScannerViewController class lets you configure various parameters for scanning. Under the hood, the Live Text API uses VNRecognizeTextRequest to detect text.
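
To give a feel for it, here’s a minimal sketch of presenting the scanner and reacting to taps on recognized text; the view controller and its wiring are my own illustration, not code from the WWDC sessions:

```swift
import UIKit
import VisionKit

final class TextScanViewController: UIViewController, DataScannerViewControllerDelegate {

    func presentScanner() {
        // Live Text scanning requires a recent device and camera access.
        guard DataScannerViewController.isSupported,
              DataScannerViewController.isAvailable else { return }

        // Configure the scanner to pick up text in the live camera feed.
        let scanner = DataScannerViewController(
            recognizedDataTypes: [.text()],
            qualityLevel: .balanced,
            recognizesMultipleItems: true,
            isHighlightingEnabled: true
        )
        scanner.delegate = self

        present(scanner, animated: true) {
            try? scanner.startScanning()
        }
    }

    // Fires when the user taps a recognized item in the camera view.
    func dataScanner(_ dataScanner: DataScannerViewController, didTapOn item: RecognizedItem) {
        if case let .text(text) = item {
            print("Recognized: \(text.transcript)")
        }
    }
}
```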

At first glance, the Live Text API seems a lot like Google Lens. However, think of the possibilities it’ll bring when Apple’s future headset is in front of your eyes. For starters, imagine turning your head to quickly extract information with your eyes. Yup, head-tracking was already possible in iOS 15 through the AirPods spatial awareness that leverages CMHeadphoneMotionManager. Now throw iOS 16’s new personalized spatial audio into the mix, and I can already see VR mechanics unfolding.
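
As a quick refresher, here’s roughly what reading that head-tracking data looks like (a minimal sketch; a real app also needs the NSMotionUsageDescription Info.plist key):

```swift
import CoreMotion

let headTracker = CMHeadphoneMotionManager()

if headTracker.isDeviceMotionAvailable {
    // Stream head orientation updates from connected AirPods.
    headTracker.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion else { return }
        // attitude describes the wearer's head orientation in radians.
        print("pitch: \(motion.attitude.pitch), yaw: \(motion.attitude.yaw)")
    }
}
```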

Two enhancements in the PDFKit framework — the ability to parse text fields and convert document pages into images — will matter a lot in building a rich AR experience.
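
As a sketch of the building blocks, reading text form fields and rendering a page to an image with PDFKit can look like this (using long-standing PDFKit calls that the new enhancements build on):

```swift
import PDFKit
import UIKit

func inspect(pdfURL: URL) {
    guard let document = PDFDocument(url: pdfURL),
          let page = document.page(at: 0) else { return }

    // Find text-entry form fields on the first page.
    for field in page.annotations where field.widgetFieldType == .text {
        print("Field \(field.fieldName ?? "?"): \(field.widgetStringValue ?? "")")
    }

    // Render the page into a UIImage, e.g. to feed into the Live Text API.
    let image = page.thumbnail(of: CGSize(width: 1024, height: 1448), for: .mediaBox)
    _ = image
}
```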

(Image source: WWDC 2022 video)

To ensure Apple’s mixed-reality device isn’t just a fancy gadget on your face, providing a toolset to interact with text, images and graphics is important.

With the introduction of two powerful image recognition functions, Apple is on the right path. A path that’ll lead to AR/VR apps with rich interactive interfaces connected to the real world.

Speech Recognition and Dictation
Let’s put text and images aside for a moment. iOS 16 has also rebuilt the Dictation feature by letting users switch between voice and touch.

So, you could be walking down a hall and want to quickly edit a text message on your phone. In iOS 16, you can simply use your voice to modify a piece of text.

(Image source: WWDC 2022 video)

More? The Speech framework has gotten a little enhancement: the ability to toggle automatic punctuation in SFSpeechRecognitionRequest through addsPunctuation. I’m optimistic this will give rise to rich communication apps, as it’s already found its way into live captions in FaceTime calls.
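
Enabling it is a one-liner. A minimal sketch, assuming speech-recognition permission has already been granted:

```swift
import Speech

func transcribe(fileURL: URL) {
    let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    request.addsPunctuation = true  // new in iOS 16

    recognizer?.recognitionTask(with: request) { result, _ in
        if let result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```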

From a mixed reality perspective, these are fantastic features. Using voice to enter text would minimise our dependency on keyboard-based input in the virtual world. Apple’s also making it easy to integrate Siri into our apps using the new App Intents framework.
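
Declaring an intent is now a matter of defining a type. Here’s a minimal sketch (StartRoomScanIntent is a hypothetical name for illustration):

```swift
import AppIntents

struct StartRoomScanIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Room Scan"

    @MainActor
    func perform() async throws -> some IntentResult {
        // Kick off the app's scanning flow here; Siri and Shortcuts
        // discover this intent automatically, no extension required.
        return .result()
    }
}
```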

RoomPlan API and Background Assets Framework
The Background Assets framework is another tool that hasn’t gotten much attention yet. It was introduced to handle downloads of large files across different app states, but I think the possibilities extend beyond this utility.

By downloading 3D assets from the cloud, we can quickly build and ship augmented reality apps with much smaller initial bundle sizes, which could be critical on a headset.

Similarly, the RealityKit framework didn’t get any significant changes. But Apple quietly unveiled a new RoomPlan API. Powered by ARKit 6, the Swift-only API provides out-of-the-box support for scanning rooms and building 3D models of them.
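
The end-to-end flow looks roughly like this; a minimal sketch with UI and error handling omitted:

```swift
import RoomPlan

final class RoomScanner: RoomCaptureSessionDelegate {
    let session = RoomCaptureSession()

    func start() {
        session.delegate = self
        session.run(configuration: RoomCaptureSession.Configuration())
    }

    // Called when the scan finishes (e.g. after session.stop()).
    func captureSession(_ session: RoomCaptureSession, didEndWith data: CapturedRoomData, error: Error?) {
        guard error == nil else { return }
        Task {
            // Turn the raw scan into a structured, parametric 3D room model...
            let room = try await RoomBuilder(options: [.beautifyObjects]).capturedRoom(from: data)
            // ...and export it as USDZ for RealityKit, Blender, and friends.
            let url = FileManager.default.temporaryDirectory.appendingPathComponent("room.usdz")
            try room.export(to: url)
        }
    }
}
```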

(Image source: https://developer.apple.com)

Now, you can deem the RoomPlan API an extension of the Object Capture API. But think of it in AR/VR terms: considering that Apple’s mixed reality headset would have LiDAR sensors and multiple cameras, RoomPlan is going to be a game-changer for developers. Expect a lot of AR apps that let you reconstruct and restyle houses.

While those were the major APIs that fit mixed reality use cases, Spatial is another new framework, one that enables working with 3D math primitives. It might prove its mettle in dealing with graphics in virtual space.
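
The API surface is small but expressive. A minimal sketch:

```swift
import Spatial

let point = Point3D(x: 1, y: 0, z: 0)
let rotation = Rotation3D(angle: .degrees(90), axis: .z)

// Rotate the point 90° around the z-axis.
let rotated = point.rotated(by: rotation)
print(rotated)  // approximately (0, 1, 0)
```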

Finally, Apple didn’t mention a single word about its virtual reality headset plans, but the new APIs it released this year will play a crucial role in plugging all the pieces together for metaverse development. I think these APIs demonstrate a trend of Apple shifting towards enabling its ecosystem for the future device.

It’s critical to prepare developers to build apps for the new ecosystem today. After all, for a product to get widespread adoption, there needs to be a mature ecosystem of apps and tools — which requires getting developers on board.

Top comments (7)

Partick Gr.

I'm currently using these APIs to build mobile apps, thank you for this overview! Very informative. Do you think more APIs/SDKs will be released soon? (2022)

Art Sh

I think they will gradually release new APIs until the device is announced, along with a much bigger SDK and tools release

Kayla Anderson

I can only imagine the cool games that they will also release in Apple Arcade that may include this technology!

Partick Gr.

Yes, this technology opens a complete new industry to create!

Kayla Anderson

I’m always amazed with the direction they take Apple products!

Kayla Anderson

I had no idea that Room Plan API existed. I just moved into a new home so this could be very helpful for me!

Art Sh

Yeah, I've used this API for a week, and it works like magic!