Introduction
Integrating 3D models into AR applications can significantly enhance the user experience by providing interactive and immersive elements. For iOS applications, using Apple's ARKit with the .usdz file format is often the preferred approach due to its native support and optimization for AR experiences. This introduction will guide you through the steps of integrating a 3D model into an iOS application using ARKit.
This is part four of a multi-part article series about integrating AR services into a React Native app via native modules. While React Native simplifies cross-platform app development, there are scenarios where leveraging native iOS capabilities becomes essential, especially when dealing with advanced features like augmented reality.
You can find the other parts here:
- Part One: Setting up a native module via Kotlin
- Part Two: Integrating ARCore
- Part Three: Setting up a native module via Objective-C
The full code for this part is here.
3D-Model integration
Unfortunately, using GLB files in iOS AR applications is a bit of a hassle, so it is easier to use another format for the iOS version, namely USDZ, instead of reusing the GLB file we already obtained for the Android version in part two of this series.
Download a model
Download any model you like from a site like this. I downloaded the pancakes model.
Xcode
The next step is to add the downloaded model to Xcode. Naturally, with all things Apple, this is far more annoying than it has any right to be:
- Right-click on the project's root, select New Group and name it Resources.
- Drag the .usdz file from Finder and drop it into the Resources group in Xcode's project navigator.
- When you add the file, make sure the options are set correctly in the dialog that appears. Ensure that Copy items if needed is checked. Also, ensure that the file is added to the app's target by checking RN3DWorldExplorer, so it's included in the build.
Update Info.plist
The last step is to add the following key to your Info.plist file (located at ios/RN3dWorldExplorer):
<key>NSCameraUsageDescription</key>
<string>This app needs access to the camera to create 3D AR experiences.</string>
This is needed because we access the user's camera for the 3D model, which Apple regards as privacy-sensitive data.
Coding
We will make several updates to our RCTARModule.m file in order to use ARKit. First, add the following imports to the top of the file to leverage the AR capabilities on iOS:
#import <ModelIO/ModelIO.h>
#import <SceneKit/ModelIO.h>
#import <ARKit/ARKit.h>
Now we will implement a new interface. Copy this code between the imports and the @implementation RCTARModule line:
@interface RCTARModule () <ARSCNViewDelegate>
@property (nonatomic, strong) ARSCNView *sceneView;
@property (nonatomic, strong) SCNNode *modelNode;
@end
In essence, this code prepares the RCTARModule class to handle AR functionality. It sets up an AR scene view (ARSCNView) to display AR content and a node (SCNNode) to hold our 3D model in the AR scene. The module is also prepared to handle ARSCNView events by conforming to the ARSCNViewDelegate protocol.
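On the JavaScript side, the surface this native module exposes can be described with a small TypeScript interface. This is only a sketch: the ARModuleInterface name and the mock implementation are assumptions for illustration, while showAR mirrors the method we export from the native module.

```typescript
// Sketch of the JS-facing surface of the native module.
// ARModuleInterface and mockARModule are hypothetical names;
// showAR mirrors the method exported from RCTARModule.m.
interface ARModuleInterface {
  showAR(filename: string): void;
}

// A trivial in-memory stand-in, handy for unit tests without a device.
const mockARModule: ARModuleInterface = {
  showAR(filename: string): void {
    if (!filename) {
      throw new Error("showAR requires a model filename");
    }
    console.log(`Would load model: ${filename}`);
  },
};

mockARModule.showAR("pancakes"); // logs "Would load model: pancakes"
```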
We will now update our showAR function to display the model. This will take several steps. Exchange the current code for this:
RCT_EXPORT_METHOD(showAR:(NSString *)filename)
{
  dispatch_async(dispatch_get_main_queue(), ^{
    RCTLogInfo(@"Loading model: %@", filename);
    [self initializeARView];
    [self loadAndDisplayModel:filename];
    [self presentARView];
  });
}
Now we log the passed-in filename parameter before calling three functions to set up the AR view, load and display the model, and finally present the view to the user. Don't be alarmed by the number of errors popping up; they will go away as we implement the functions.
initializeARView
Use this code to implement the function:
- (void)initializeARView {
  self.sceneView = [[ARSCNView alloc] initWithFrame:UIScreen.mainScreen.bounds];
  self.sceneView.delegate = self;

  // Configure AR session
  ARWorldTrackingConfiguration *configuration = [ARWorldTrackingConfiguration new];
  configuration.planeDetection = ARPlaneDetectionHorizontal;
  [self.sceneView.session runWithConfiguration:configuration];
}
In general, the initializeARView method sets up the AR environment for the application. It initializes an ARSCNView to render the AR content, sets the class as its delegate to handle AR-related events, configures the AR session to include horizontal plane detection, and finally starts the AR session with this configuration.
loadAndDisplayModel
Now implement the function with this code:
- (void)loadAndDisplayModel:(NSString *)filename {
  NSString *filePath = [[NSBundle mainBundle] pathForResource:filename ofType:@"usdz"];
  if (!filePath) {
    // Guard against a missing bundle resource; fileURLWithPath: crashes on nil
    RCTLogError(@"Model file %@.usdz not found in the app bundle", filename);
    return;
  }
  NSURL *fileURL = [NSURL fileURLWithPath:filePath];
  NSError *error = nil;
  SCNScene *scene = [SCNScene sceneWithURL:fileURL options:nil error:&error];
  if (!scene) {
    RCTLogError(@"Failed to load scene: %@", error.localizedDescription);
    return;
  }

  // Correctly set the modelNode property
  self.modelNode = [scene.rootNode.childNodes firstObject];
  if (self.modelNode) {
    self.modelNode.scale = SCNVector3Make(0.1, 0.1, 0.1);
    self.modelNode.position = SCNVector3Make(0, -1, -3); // Adjust as needed
    [self.sceneView.scene.rootNode addChildNode:self.modelNode];
  }
}
This method loads a 3D model from the application's main bundle and displays it in the AR scene. First, it constructs a file path for a 3D model file (in USDZ format) based on the provided filename. Then, it creates a URL (fileURL) pointing to this file. After that, it creates an SCNScene and adds the model node to the AR scene. Be aware that when you use your own model, you will probably want to adjust the scale and position, as it could be too near to or too far from the camera.
presentARView
Finally, add these two functions to the file:
- (void)presentARView {
  UIViewController *viewController = [UIViewController new];
  [viewController.view addSubview:self.sceneView];

  // Create a close button
  UIButton *closeButton = [UIButton buttonWithType:UIButtonTypeSystem];
  [closeButton setTitle:@"X" forState:UIControlStateNormal];
  [closeButton addTarget:self action:@selector(closeARView) forControlEvents:UIControlEventTouchUpInside];

  CGFloat buttonSize = 44.0;
  CGFloat padding = 16.0;
  closeButton.frame = CGRectMake(viewController.view.bounds.size.width - buttonSize - padding,
                                 padding,
                                 buttonSize,
                                 buttonSize);
  closeButton.autoresizingMask = UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleBottomMargin;
  closeButton.backgroundColor = [UIColor blueColor];
  closeButton.layer.cornerRadius = buttonSize / 2;
  closeButton.clipsToBounds = YES;
  [viewController.view addSubview:closeButton];

  // Assuming you have access to the root view controller or the current view controller
  UIViewController *rootViewController = RCTPresentedViewController();
  [rootViewController presentViewController:viewController animated:YES completion:nil];
}

- (void)closeARView {
  UIViewController *presentingController = self.sceneView.window.rootViewController;
  [presentingController dismissViewControllerAnimated:YES completion:nil];
}
The first half of the presentARView function adds a close button to the view and assigns the helper function closeARView to it. The latter part of the method presents the new view controller modally on top of the current view hierarchy. This is done by retrieving the application's currently presented view controller (RCTPresentedViewController()) and calling presentViewController:animated:completion: on it.
Update App.tsx
Now we finally move to our React Native code. Update the Button's onPress method:
await ARModule.showAR(Platform.OS === "ios" ? "pancakes" : "AR-model.glb");
If you use a different model, exchange pancakes for its file name. Don't forget to import { Platform } from "react-native" at the top of the file. Start the application and test the code.
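If you call showAR from more than one place, the platform check can be factored into a small helper. A minimal sketch, reusing the file names from this series (the modelForPlatform name is a hypothetical choice):

```typescript
// Hypothetical helper: choose the model file name per platform.
// iOS resolves the .usdz extension natively, so only the base name
// ("pancakes") is passed; Android expects the full "AR-model.glb".
function modelForPlatform(os: string): string {
  return os === "ios" ? "pancakes" : "AR-model.glb";
}

// Usage with react-native's Platform.OS:
// await ARModule.showAR(modelForPlatform(Platform.OS));
```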
Conclusion
Successfully integrating a 3D model into an iOS AR application involves several critical steps, from preparing the appropriate file format to coding the interaction with ARKit. While the process might seem daunting initially, especially due to platform-specific requirements like using .usdz files for iOS, the result is a compelling and immersive AR experience.
This concludes the series for now. Eventually there will be some articles about integrating VR functionality, such as 360° videos, for both platforms, but the setup process is far more complicated than anything explored in this series so far. On Android, for example, it involves downloading the Cardboard SDK and surgically integrating it into the Android portion of the app by generating .proto and .java files.
By now, you should have a basic understanding of integrating AR services into React Native apps. Your next steps would be the integration of advanced functionality, like moving the models around or scaling them with pinch gestures. The sky is the limit from here on.