Jesse McCoy

AR App

This is a short tutorial for building an iOS AR app that tracks a user's face.

To begin, import the proper modules, then create your outlets. You will also declare a variable initialized to an empty string, to be used later.
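A minimal sketch of this starting point, assuming a storyboard with a label outlet and an `ARSCNView` outlet (the names `faceLabel` and `sceneView` are illustrative):

```swift
import UIKit
import ARKit

class ViewController: UIViewController {

    // Outlets for the face-tracking scene view and a label
    // that will display the detected expression (names assumed)
    @IBOutlet weak var faceLabel: UILabel!
    @IBOutlet weak var sceneView: ARSCNView!

    // Empty string for now; it will hold the expression text later
    var action = ""
}
```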

Next, you will work in the viewDidLoad function. Set the scene view's delegate to self and its showsStatistics property to true. After this, add a guard to check that the device supports AR face tracking; if it doesn't, the user will be notified.
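One way to sketch this step; the guard here simply halts with `fatalError`, so in a shipping app you would present an alert to notify the user instead:

```swift
override func viewDidLoad() {
    super.viewDidLoad()

    sceneView.delegate = self
    sceneView.showsStatistics = true

    // Face tracking requires a TrueDepth camera; bail out
    // (or alert the user) if the device doesn't support it
    guard ARFaceTrackingConfiguration.isSupported else {
        fatalError("Face tracking is not supported on this device")
    }
}
```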

Next up are the viewWillAppear and viewWillDisappear functions. Here you set up and run the AR face-tracking configuration when the view appears, and pause the session when it disappears.
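These two lifecycle overrides might look like the following sketch:

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Start a face-tracking session whenever the view appears
    let configuration = ARFaceTrackingConfiguration()
    sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    // Pause the session when the view goes away
    sceneView.session.pause()
}
```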


The following step deals with the renderer functions. These set up the AR rendering and face geometry for the app, and keep the mesh updated from the face anchor.
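A sketch of the two delegate methods, assuming the delegate was set in viewDidLoad and an `expression(anchor:)` function like the one described in the final step:

```swift
extension ViewController: ARSCNViewDelegate {

    // Supply a wireframe face mesh for each detected face anchor
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard let device = sceneView.device else { return nil }
        let faceGeometry = ARSCNFaceGeometry(device: device)
        let node = SCNNode(geometry: faceGeometry)
        node.geometry?.firstMaterial?.fillMode = .lines
        return node
    }

    // Keep the mesh in sync with the face and re-run expression detection
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
        faceGeometry.update(from: faceAnchor.geometry)
        expression(anchor: faceAnchor)

        // UI updates must happen on the main thread
        DispatchQueue.main.async {
            self.faceLabel.text = self.action
        }
    }
}
```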


Finally, we will work on the expression function, which will identify your facial expressions.

First, set up variables for your expressions: mouth smile left, mouth smile right, cheek puff, tongue out, jaw left, and eye squint left. These variables read ARKit blend shape coefficients from the face anchor. Set your action variable to a waiting message, to indicate that the app is waiting for your facial expressions.

Lastly, check for these facial expressions. If the app detects one above its threshold, it sets your action variable to indicate which facial expression you are making.
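The whole function could be sketched as below. Blend shape coefficients run from 0.0 (neutral) to 1.0 (fully expressed); the thresholds and messages here are illustrative:

```swift
func expression(anchor: ARFaceAnchor) {
    // Read the relevant blend shape coefficients from the face anchor
    let smileLeft = anchor.blendShapes[.mouthSmileLeft]
    let smileRight = anchor.blendShapes[.mouthSmileRight]
    let cheekPuff = anchor.blendShapes[.cheekPuff]
    let tongue = anchor.blendShapes[.tongueOut]
    let jawLeft = anchor.blendShapes[.jawLeft]
    let eyeSquintLeft = anchor.blendShapes[.eyeSquintLeft]

    // Default message until an expression crosses its threshold
    self.action = "Waiting..."

    if (smileLeft?.floatValue ?? 0) + (smileRight?.floatValue ?? 0) > 0.9 {
        self.action = "You are smiling."
    }
    if (cheekPuff?.floatValue ?? 0) > 0.1 {
        self.action = "Your cheeks are puffed."
    }
    if (tongue?.floatValue ?? 0) > 0.1 {
        self.action = "Your tongue is out."
    }
    if (jawLeft?.floatValue ?? 0) > 0.1 {
        self.action = "You moved your jaw to the left."
    }
    if (eyeSquintLeft?.floatValue ?? 0) > 0.1 {
        self.action = "You are squinting your left eye."
    }
}
```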
