Tutorial: Control a SwiftUI View Using a Smile Through ARKit

In this tutorial, we will walk through detecting if a user is smiling, and using this logic to control a SwiftUI view.

Cole Dennis
4 min read · Nov 5, 2022

By tapping into ARKit, we can build logic into our app that changes our UI based on if a user is smiling or not:

[GIF: when the user smiles, the on-screen text changes from “not smiling” to “smiling”.]

We will be using the MVVM template I created in another article as a base; you can read that article here (LINK) or download the GitHub template here:

This template sets us up with a RealityKit-based starting project, with the logic broken out into ARModel and ARViewModel Swift files.

In order to build our project, we will need to do the following:

  • Add an ARSessionDelegate to our project
  • Add an ARFaceAnchor through the ARFaceTrackingConfiguration
  • Identify the Smile Data
  • Pass the Smile Data through to our SwiftUI View

Add an ARSessionDelegate to our project

We will want our ARView to receive updates every frame so it can respond as soon as the user starts or stops smiling. To enable this, we will make our ARViewModel act as our ARSessionDelegate.

Import ARKit, and modify the ARViewModel class so it subclasses UIViewController and conforms to the ObservableObject and ARSessionDelegate protocols:

import ARKit

class ARViewModel: UIViewController, ObservableObject, ARSessionDelegate {
}
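
Later steps reference a model property on the view model. Assuming the MVVM template from the earlier article, the class at this point looks roughly like the sketch below; the @Published wrapper is what lets SwiftUI refresh whenever the model struct mutates, and the exact names may differ in your copy of the template:

import ARKit
import RealityKit

class ARViewModel: UIViewController, ObservableObject, ARSessionDelegate {
    // Assumed from the template: publishing the model struct means every
    // mutation triggers objectWillChange, so observing SwiftUI views refresh.
    @Published private var model: ARModel = ARModel()

    // ARViewContainer returns this view from makeUIView(context:).
    var arView: ARView { model.arView }
}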

Next we will be adding a function to set the ARViewModel file as the session delegate:

func startSessionDelegate() {
    model.arView.session.delegate = self
}

Finally, we need to call this function so the delegate is actually registered. Call startSessionDelegate() from the makeUIView() function within the ARViewContainer struct:

struct ARViewContainer: UIViewRepresentable {
    var arViewModel: ARViewModel

    func makeUIView(context: Context) -> ARView {
        arViewModel.startSessionDelegate()
        return arViewModel.arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

Now the ARViewModel can respond to ARSession events as they occur! We will put this to use later in the project.

Add an ARFaceAnchor through the ARFaceTrackingConfiguration

Since we want to identify whether a user is smiling, we will run the ARFaceTrackingConfiguration in our ARView session. With this configuration, ARKit automatically adds an ARFaceAnchor to the session whenever it detects a face, and that anchor is what lets us recognize the user’s facial expressions.

Import ARKit and add the following to the ARModel init() to start the ARFaceTrackingConfiguration:

import ARKit

struct ARModel {
    private(set) var arView: ARView

    init() {
        arView = ARView(frame: .zero)
        arView.session.run(ARFaceTrackingConfiguration())
    }
}

If you run the project, you will notice that the front-facing camera is automatically activated! (Make sure your target’s Info.plist contains an NSCameraUsageDescription entry; iOS requires camera permission for an ARKit session.)
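
Note that face tracking is not supported on every device. If you want to guard against unsupported hardware instead of failing silently, you can check ARFaceTrackingConfiguration.isSupported before running the session; a defensive version of the init might look like this:

init() {
    arView = ARView(frame: .zero)
    // isSupported is false on devices that lack face-tracking hardware.
    if ARFaceTrackingConfiguration.isSupported {
        arView.session.run(ARFaceTrackingConfiguration())
    }
}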

Identify the Smile Data

Now that we are running the ARFaceTrackingConfiguration(), we want to identify when a user is smiling. This is accessible through the ARFaceAnchor automatically added to the session when ARKit recognizes a face.

The first step is to add to the ARModel some variables that will ultimately store our smile data. The smile data we are working with is broken into a right-side smile and a left-side smile:

var smileRight: Float = 0
var smileLeft: Float = 0

Next, we will add a function that reads the smile data from the ARFaceAnchor and sets the smileRight and smileLeft variables. Because ARModel is a struct, this needs to be a mutating function; it takes an ARFaceAnchor as input:

mutating func update(faceAnchor: ARFaceAnchor) {

}

The smile data is part of the .blendShapes property of the ARFaceAnchor. We will specifically access the values for the .mouthSmileRight and .mouthSmileLeft blend shapes, each of which is a numeric value from 0 to 1 describing how far that side of the mouth is smiling.

mutating func update(faceAnchor: ARFaceAnchor) {
    smileRight = Float(truncating: faceAnchor.blendShapes.first(where: { $0.key == .mouthSmileRight })?.value ?? 0)
    smileLeft = Float(truncating: faceAnchor.blendShapes.first(where: { $0.key == .mouthSmileLeft })?.value ?? 0)
}
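
Since blendShapes is a dictionary keyed by ARFaceAnchor.BlendShapeLocation, an equivalent and slightly shorter version subscripts the dictionary directly instead of searching it with first(where:):

mutating func update(faceAnchor: ARFaceAnchor) {
    // Same result as above: the NSNumber values become Floats, defaulting to 0.
    smileRight = faceAnchor.blendShapes[.mouthSmileRight]?.floatValue ?? 0
    smileLeft = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
}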

The next step is to add a function to our ARViewModel that runs whenever the session’s anchors update. Because ARKit adds the ARFaceAnchor automatically and updates it every frame, this delegate method will start firing as soon as a face is identified:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {

}

We only want to work with the ARFaceAnchor, so add a conditional cast inside this function to make sure we ignore any other anchor types:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    if let faceAnchor = anchors.first as? ARFaceAnchor {
    }
}

Finally, we will call our update(faceAnchor:) function here to pass through the ARFaceAnchor:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    if let faceAnchor = anchors.first as? ARFaceAnchor {
        model.update(faceAnchor: faceAnchor)
    }
}
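
One caveat: anchors.first assumes the face anchor is the first (or only) anchor in the update. If your session ever tracks other anchors as well, a slightly more robust version iterates over the whole array:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    // Handle every face anchor in the batch, wherever it sits in the array.
    for case let faceAnchor as ARFaceAnchor in anchors {
        model.update(faceAnchor: faceAnchor)
    }
}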

With this, we are now updating the values of the smileRight and smileLeft variables every frame! The only remaining step is to connect this to our SwiftUI view.

Pass the Smile Data through to our SwiftUI View

To simplify the data, we will add a computed property to our ARViewModel that checks whether at least one side of the user’s mouth is smiling past a threshold; 0.3 is an arbitrary cutoff that works well in practice.

Add the following computed property to the ARViewModel:

var isSmiling: Bool {
    model.smileLeft > 0.3 || model.smileRight > 0.3
}

Finally, we’re going to add a simple Text view to our ContentView struct that changes based on the status of the isSmiling property:
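
A minimal version might look like the sketch below, assuming ContentView owns the view model as a @StateObject and embeds the ARViewContainer; adjust the names to match your template:

import SwiftUI

struct ContentView: View {
    // Assumed setup from the template; rename to match your project.
    @StateObject var arViewModel = ARViewModel()

    var body: some View {
        VStack {
            ARViewContainer(arViewModel: arViewModel)
                .edgesIgnoringSafeArea(.all)
            Text(arViewModel.isSmiling ? "smiling" : "not smiling")
                .font(.largeTitle)
        }
    }
}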

That’s it! If you run the project, you should see that it responds when you start and stop smiling:

[GIF: when the user smiles, the on-screen text changes from “not smiling” to “smiling”.]
Smiling text on screen after the user has smiled.

I hope this tutorial was helpful! The full GitHub repository for this project can be found here:
