Tutorial: Use Head Tilt To Control a SwiftUI App Using ARKit / RealityKit

In this tutorial, we will build a demo SwiftUI experience that uses ARKit to detect when a user tilts their head, and updates a SwiftUI view in response:

Example of the ending experience where a user tilting their head can change the color of SwiftUI text on screen.

We are going to start with an MVVM setup so that the ARView lives in an “ARModel” struct and is easier to work with. The tutorial walking through this setup is below:

For easy access, the GitHub repository for this MVVM setup can be found here:

Adding Face Tracking

As we are specifically looking to trigger events based on the rotation of a user’s head, we will modify our ARModel to use ARFaceTrackingConfiguration(), which runs our session using the front-facing camera (you will also need to import ARKit in the ARModel file for this to work):

import Foundation
import RealityKit
import ARKit

struct ARModel {
    private(set) var arView: ARView

    init() {
        arView = ARView(frame: .zero)
        arView.session.run(ARFaceTrackingConfiguration())
    }
}
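Note that face tracking is not available on every device, so as a precaution you may want to guard the session start. A minimal sketch, using ARKit’s isSupported check:

    init() {
        arView = ARView(frame: .zero)

        // Face tracking requires hardware support (e.g. a TrueDepth camera);
        // only run the configuration when the device provides it.
        if ARFaceTrackingConfiguration.isSupported {
            arView.session.run(ARFaceTrackingConfiguration())
        }
    }

In a real app you might also surface a message to the user when face tracking is unavailable, but that is beyond the scope of this tutorial.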

There are a couple of things we need in order to build our Head Tilt Trigger logic:

  • We need to track the orientation of a user’s head relative to the device camera
  • We need to add an ARSessionDelegate to track updates in our AR Scene
  • We need to create a SwiftUI view that visualizes the head-tilt in action.

Tracking the orientation of the user’s head

We need to track the orientation of the user’s head in relation to their device camera to calculate when they tilt their head either left or right. The first step is to add an anchor to the user’s face. In our ARView, add a Face Anchor:

let faceAnchor = AnchorEntity(.face)
faceAnchor.name = "faceAnchor"
arView.scene.addAnchor(faceAnchor)

Next, we need to add an anchor to the camera itself, so that we can calculate the difference in rotation between the user’s head and the camera. Add a Camera anchor to our ARView:

let cameraAnchor = AnchorEntity(.camera)
cameraAnchor.name = "cameraAnchor"
arView.scene.addAnchor(cameraAnchor)

Now that we have anchors on the user’s face and camera, we need to track the difference in orientation between the two. Add a headTilt variable to our ARModel:

var headTilt: Float = 0

The final addition needed for our ARModel is a function that calculates the difference in rotation between the user’s face and the camera. First, we find each anchor using the names we set for them. Then we call the .orientation(relativeTo:) method to get the orientation of our face anchor relative to the camera anchor. Finally, we read the .axis property of the resulting rotation and take its .z component, which captures the roll of the user’s head as they tilt it left and right:

mutating func updateHeadTilt() {
    let faceAnchor = arView.scene.anchors.first(where: { $0.name == "faceAnchor" })
    let cameraAnchor = arView.scene.anchors.first(where: { $0.name == "cameraAnchor" })
    headTilt = faceAnchor?.orientation(relativeTo: cameraAnchor).axis.z ?? 0
}

Altogether, the final ARModel file should look like this:
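Assembled from the snippets above, the full ARModel looks roughly like this sketch (the exact placement of the anchor setup inside init is an assumption):

import Foundation
import RealityKit
import ARKit

struct ARModel {
    private(set) var arView: ARView
    var headTilt: Float = 0

    init() {
        arView = ARView(frame: .zero)

        // Anchor fixed to the user's face
        let faceAnchor = AnchorEntity(.face)
        faceAnchor.name = "faceAnchor"
        arView.scene.addAnchor(faceAnchor)

        // Anchor fixed to the device camera
        let cameraAnchor = AnchorEntity(.camera)
        cameraAnchor.name = "cameraAnchor"
        arView.scene.addAnchor(cameraAnchor)

        arView.session.run(ARFaceTrackingConfiguration())
    }

    mutating func updateHeadTilt() {
        let faceAnchor = arView.scene.anchors.first(where: { $0.name == "faceAnchor" })
        let cameraAnchor = arView.scene.anchors.first(where: { $0.name == "cameraAnchor" })
        headTilt = faceAnchor?.orientation(relativeTo: cameraAnchor).axis.z ?? 0
    }
}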

Adding the ARViewModel as a Delegate

In order for our SwiftUI view to receive updates when our ARView detects that the positions of our face and camera anchors have changed, we will need to set our ARViewModel class as the ARSessionDelegate for our ARModel’s session. To do this, we will update the class to subclass UIViewController and conform to the ARSessionDelegate protocol (we will also need to import ARKit in this file):

import ARKit

class ARViewModel: UIViewController, ObservableObject, ARSessionDelegate {
}

Next, we will add a function to set the ARViewModel as the Session Delegate:

func startSessionDelegate() {
    model.arView.session.delegate = self
}

Finally, on our ARViewContainer struct, we will need to run this startSessionDelegate() function. Add arViewModel.startSessionDelegate() to the makeUIView() function:

struct ARViewContainer: UIViewRepresentable {
    var arViewModel: ARViewModel

    func makeUIView(context: Context) -> ARView {
        arViewModel.startSessionDelegate()
        return arViewModel.arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

We have successfully set up the ARSessionDelegate! In order to actually use these updates, we will go back to the ARViewModel and add a delegate method that updates our headTilt variable whenever the session reports anchor updates in our AR scene:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    model.updateHeadTilt()
}

This runs the updateHeadTilt() function every time that our ARView registers changes to the position of our Anchors.

Connecting to the SwiftUI View

Time to connect all this data to our SwiftUI view. First, we need to expose the headTilt variable on our ARViewModel so that we can access it from our SwiftUI view:

var headTilt: Float {
    model.headTilt
}

To finish our logic, we will add two computed variables that check whether we are tilting our head left or right past a threshold. This threshold dictates how sensitive our head-tilt logic will be:

var tiltLeft: Bool {
    headTilt > 0.5
}

var tiltRight: Bool {
    headTilt < -0.5
}

This finishes up our ARViewModel, which should look like this:
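Combining the pieces above, the full ARViewModel looks roughly like this sketch. The @Published model property and the arView accessor are assumptions carried over from the MVVM starter project linked at the top:

import Foundation
import RealityKit
import ARKit

class ARViewModel: UIViewController, ObservableObject, ARSessionDelegate {
    // Assumed from the MVVM starter: publishing the model lets
    // SwiftUI re-render when headTilt changes.
    @Published private var model: ARModel = ARModel()

    var arView: ARView {
        model.arView
    }

    var headTilt: Float {
        model.headTilt
    }

    var tiltLeft: Bool {
        headTilt > 0.5
    }

    var tiltRight: Bool {
        headTilt < -0.5
    }

    func startSessionDelegate() {
        model.arView.session.delegate = self
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        model.updateHeadTilt()
    }
}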

In our SwiftUI view, we now need to visualize when a user is tilting their head left or right. I’ve created a very simple screen that asks the user what their favorite food is. We will create a ZStack in our ContentView with both the ARViewContainer and our SwiftUI view, so that our SwiftUI view is overlaid on top of the AR scene.

In order to visualize our head tilt, I’m adding conditional logic to the SwiftUI text .foregroundColor property based on the tiltLeft and tiltRight variables we set up in the ARViewModel:

.foregroundColor(arViewModel.tiltLeft ? .green : .secondary)

The final ContentView should look like this:
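As a sketch of that layout (the question text, the food options, and the exact view hierarchy are assumptions; only the ZStack structure and the .foregroundColor logic come from the steps above):

import SwiftUI
import RealityKit

struct ContentView: View {
    @StateObject private var arViewModel = ARViewModel()

    var body: some View {
        ZStack {
            // AR scene underneath, SwiftUI overlaid on top
            ARViewContainer(arViewModel: arViewModel)
                .ignoresSafeArea()

            VStack(spacing: 24) {
                Text("What's your favorite food?")
                    .font(.title)

                HStack(spacing: 40) {
                    Text("Pizza")
                        .foregroundColor(arViewModel.tiltLeft ? .green : .secondary)
                    Text("Tacos")
                        .foregroundColor(arViewModel.tiltRight ? .green : .secondary)
                }
                .font(.largeTitle)
            }
        }
    }
}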

If you run this on your device, you should see that when you tilt your head left and right, the corresponding text will turn green:

Example of the ending experience where a user tilting their head can change the color of SwiftUI text on screen.

And there you have it! I hope this helps you get started with creating head-tilt logic in your RealityKit and ARKit projects!

You can access the full GitHub repository for this project here:
