Tutorial: Integrate ARKit Image Recognition With SwiftUI
Use ARKit’s Image Detection to trigger changes in a SwiftUI View
In this tutorial, we will be using ARKit’s Image Detection feature to change a SwiftUI view when an image is recognized by ARKit.
First, we will be starting with an MVVM setup built on the Augmented Reality App template in Xcode. I wrote a tutorial walking through this setup here:
For easy access, the final template setup can be found on GitHub here:
Enabling Image Detection
In order to add image detection to our project, we first need to set up which image(s) we want our project to detect.
Adding an ARReferenceImage Resource
Go to the Assets section in your project’s Xcode setup:
We will be setting up an “AR Resource Group” to add our image to. Either right-click in the asset section or click the “+” icon at the bottom of this section, then go to “AR and Textures” and select “New AR Resource Group”:
This AR Resource Group is what we will be identifying in our project to pull reference images from.
Next, drag and drop your image into this AR Resources folder, and give it a name, width and height in the panel on the right:
You can add multiple images that can be recognized in a single project, but for this tutorial, we will only be identifying a single image.
You might notice that in my screenshot, there’s a yellow triangle warning on my reference image:
Xcode evaluates images that you add as AR reference images and will flag any that might not work as well as others. This doesn’t mean that the image WON’T work, just that it’s not as well suited for AR image recognition. Keep this in mind as you identify and work with images in your project.
Passing ARReferenceImage to ARWorldTrackingConfiguration
In the ARModel, we need to set up the configuration of our ARView session to detect the reference image that we added.
First, import ARKit into the file and add the below to the init() in the ARModel to get our tracker image(s) from the AR Resource group we added in Assets:
import ARKit

guard let trackerImage = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
    fatalError("Missing expected asset catalog resources.")
}
Now we will set up the AR configuration. The detectionImages feature we’ll use is part of ARWorldTrackingConfiguration, so we will be setting up a new instance of ARWorldTrackingConfiguration():
let configuration = ARWorldTrackingConfiguration()
Next, we’ll set up our configuration to use our AR reference image as a detection image:
configuration.detectionImages = trackerImage
Finally, we need to run this new configuration to finish the setup of the ARView session.
arView.session.run(configuration)
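Putting these pieces together, the init() in the ARModel might look roughly like this (a sketch that assumes the ARModel from the MVVM setup tutorial, where the ARView is created in init()):
init() {
    arView = ARView(frame: .zero)

    // Pull the reference image(s) from the "AR Resources" asset group
    guard let trackerImage = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
        fatalError("Missing expected asset catalog resources.")
    }

    // Configure image detection on a world tracking configuration and run the session
    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionImages = trackerImage
    arView.session.run(configuration)
}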
There are a few more steps before our project is actually looking for our reference image(s). We need to set up our ARViewModel as an ARSessionDelegate for the ARView session, so that the ARViewModel updates when there are changes in our session.
In the ARViewModel, we will import ARKit and modify our class so it subclasses UIViewController and conforms to the ARSessionDelegate protocol:
import ARKit

class ARViewModel: UIViewController, ObservableObject, ARSessionDelegate {
}
Now, we will add a startSessionDelegate() function to set our ARViewModel as the ARSessionDelegate for the ARView Session:
func startSessionDelegate() {
model.arView.session.delegate = self
}
The final step is to call this startSessionDelegate() function in the ARViewContainer, ahead of the return:
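For reference, here is a sketch of what that could look like, assuming the UIViewRepresentable setup from the earlier tutorial (the arViewModel property name is an assumption, and arView is the accessor the ARViewModel exposes for the model’s ARView):
import SwiftUI
import RealityKit

struct ARViewContainer: UIViewRepresentable {
    var arViewModel: ARViewModel

    func makeUIView(context: Context) -> ARView {
        // Make the ARViewModel the session delegate before handing the ARView to SwiftUI
        arViewModel.startSessionDelegate()
        return arViewModel.arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}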
Our ARViewModel is now updating when there are changes in the ARView! The final element is to actually DO something when an image is recognized.
Performing an action when an image is recognized
For this tutorial, we will be setting a Bool to true in our ARModel when an image is recognized, and using that Bool to change a SwiftUI view.
First, add a variable to the ARModel that we will be modifying when an image is recognized:
var imageRecognizedVar = false
Next, we’ll add a mutating function to modify our imageRecognizedVar variable. When ARKit recognizes an image, it adds an anchor to our ARView session. We will take advantage of this to identify when an ARImageAnchor is added to our scene, indicating our image has been recognized.
Create a mutating function that takes an input of an array of ARAnchors:
mutating func imageRecognized(anchors: [ARAnchor]) {
}
We will want to make sure our reference images are set up correctly; otherwise, we’ll never be able to actually recognize anything. Add this guard to the function:
guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
fatalError("Missing expected asset catalog resources.")
}
We will loop through the array of anchors to check whether any of them is an ARImageAnchor, and if so, set our imageRecognizedVar to true:
for anchor in anchors {
    guard anchor is ARImageAnchor else { continue }
    imageRecognizedVar = true
}
Note that this is where we could add logic in the future to check for a specific anchor image, but for this tutorial, we are only looking for one image.
The final ARModel should look like this:
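Here is a rough sketch of it, assuming the ARModel struct from the MVVM setup tutorial (where the ARView is created in init()):
import ARKit
import RealityKit

struct ARModel {
    private(set) var arView: ARView
    var imageRecognizedVar = false

    init() {
        arView = ARView(frame: .zero)

        // Pull the reference image(s) from the "AR Resources" asset group
        guard let trackerImage = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }

        // Configure image detection and run the session
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = trackerImage
        arView.session.run(configuration)
    }

    mutating func imageRecognized(anchors: [ARAnchor]) {
        // Make sure the reference images are set up correctly
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }

        // If any newly added anchor is an image anchor, our image was recognized
        for anchor in anchors {
            guard anchor is ARImageAnchor else { continue }
            imageRecognizedVar = true
        }
    }
}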
Now we need to finish setting up our ARViewModel. First, we will add the session(_ session: ARSession, didAdd anchors: [ARAnchor]) function from the ARSessionDelegate protocol, which runs when new anchors are added to the ARView session. Because our project adds an anchor when it recognizes our image, this function will run when the image is recognized. We will use it to call our imageRecognized() function in the ARModel, passing along the array of ARAnchors that were just added:
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
model.imageRecognized(anchors: anchors)
}
The final addition to our ARViewModel is to expose our imageRecognizedVar so that we can access it in our SwiftUI view:
var imageRecognizedVar: Bool {
    model.imageRecognizedVar
}
The final ARViewModel should look like this:
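Here is a rough sketch of the whole class (the @Published model property and the arView accessor are assumed from the MVVM setup tutorial):
import UIKit
import Combine
import ARKit
import RealityKit

class ARViewModel: UIViewController, ObservableObject, ARSessionDelegate {
    @Published private var model: ARModel = ARModel()

    // Expose the ARView for the ARViewContainer (from the MVVM setup)
    var arView: ARView {
        model.arView
    }

    // Expose the recognition flag for the SwiftUI view
    var imageRecognizedVar: Bool {
        model.imageRecognizedVar
    }

    // Make this view model the delegate of the ARView session
    func startSessionDelegate() {
        model.arView.session.delegate = self
    }

    // Runs whenever new anchors are added to the session, including the
    // ARImageAnchor that ARKit adds when it recognizes our image
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        model.imageRecognized(anchors: anchors)
    }
}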
Connecting to the SwiftUI View
The final step in this tutorial is to set up our SwiftUI view to change when the imageRecognizedVar is changed in our ARModel. I’ve added a simple VStack with a switch statement, and placed that inside a ZStack with our ARViewContainer:
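Here is a minimal sketch of what that could look like (ContentView, the text labels, and the arViewModel property name are just placeholder assumptions):
import SwiftUI

struct ContentView: View {
    @ObservedObject var arViewModel = ARViewModel()

    var body: some View {
        ZStack {
            // The AR camera view fills the screen
            ARViewContainer(arViewModel: arViewModel)
                .edgesIgnoringSafeArea(.all)

            // Simple overlay that switches on the recognition flag
            VStack {
                Spacer()
                switch arViewModel.imageRecognizedVar {
                case true:
                    Text("Image recognized!")
                case false:
                    Text("Looking for image...")
                }
            }
        }
    }
}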
With this switch statement, our SwiftUI view will update when the imageRecognizedVar variable in our ARModel is set to true by our imageRecognized() function.
If you run your project and point your camera at the image, you should see the SwiftUI text change as soon as the image is recognized!
I hope this tutorial was helpful! The full Github repository for the final code can be found here: