Tutorial: “Tap to Place” AR Content Using RealityKit and SwiftUI
In this tutorial, we will build the logic to place RealityKit-based augmented reality content in a user’s environment wherever they tap on the screen.

To start, we will be using the MVVM RealityKit template I’ve set up here:
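If you don’t have the template in front of you, the pieces we’ll be touching look roughly like this. This is a minimal sketch rather than the template’s exact code: I’m assuming ARModel is a struct that owns the ARView, ARViewModel is an ObservableObject that wraps it, and ARViewContainer is a UIViewRepresentable that displays it.
import SwiftUI
import RealityKit

// Owns the ARView and the logic that talks to it directly.
struct ARModel {
    private(set) var arView: ARView = ARView(frame: .zero)
}

// Exposes the model to SwiftUI views.
final class ARViewModel: ObservableObject {
    @Published private var model = ARModel()

    // Illustrative accessor so the container below can reach the view.
    var arView: ARView { model.arView }
}

// Bridges the UIKit-backed ARView into SwiftUI.
struct ARViewContainer: UIViewRepresentable {
    let arViewModel: ARViewModel

    func makeUIView(context: Context) -> ARView {
        arViewModel.arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}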
We will start by adding a function to ARModel that will perform the raycast. We will use the makeRaycastQuery(from:allowing:alignment:) method on ARView to build a raycast query:
guard let query = arView.makeRaycastQuery(from: location, allowing: .estimatedPlane, alignment: .any)
else { return }
We then use that query to perform a raycast and grab the first result:
guard let result = arView.session.raycast(query).first
else { return }
We can use the world transform we receive from the raycast to place an AnchorEntity:
let raycastAnchor = AnchorEntity(world: result.worldTransform)
We need something to actually place in this project, so we will make a simple sphere ModelEntity:
let sphereEntity = ModelEntity(mesh: .generateSphere(radius: 0.1), materials: [SimpleMaterial(color: .white, isMetallic: false)])
Finally, we add our sphere to the anchor, and add the anchor to the ARView’s scene:
raycastAnchor.addChild(sphereEntity)
arView.scene.anchors.append(raycastAnchor)
Putting it all together, the function should look like this:
mutating func raycastFunc(location: CGPoint) {
    guard let query = arView.makeRaycastQuery(from: location, allowing: .estimatedPlane, alignment: .any)
    else { return }
    guard let result = arView.session.raycast(query).first
    else { return }
    let sphereEntity = ModelEntity(mesh: .generateSphere(radius: 0.1), materials: [SimpleMaterial(color: .white, isMetallic: false)])
    let raycastAnchor = AnchorEntity(world: result.worldTransform)
    raycastAnchor.addChild(sphereEntity)
    arView.scene.anchors.append(raycastAnchor)
}
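One note: this assumes the AR session is already up and running. RealityKit’s ARView will configure and run a world-tracking session for you by default, but if your copy of the template sets up the session manually, a sketch of that setup inside ARModel (with plane detection enabled so tapped surfaces are recognized, and with import ARKit at the top of the file) might look like this:
init() {
    // Manual session setup: world tracking with plane detection enabled.
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]
    arView.session.run(configuration)
}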
Next, we will pass this function through to the ARViewModel:
func raycastFunc(location: CGPoint) {
    model.raycastFunc(location: location)
}
As a final step, we will connect this to the ContentView using the onTapGesture(count:coordinateSpace:perform:) modifier introduced in iOS 16. This lets us pass the screen location of each tap to our raycast function:
ARViewContainer(arViewModel: arViewModel)
    .edgesIgnoringSafeArea(.all)
    .onTapGesture(coordinateSpace: .global) { location in
        arViewModel.raycastFunc(location: location)
    }
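For context, the whole ContentView ends up looking something like this (assuming the template stores the view model as a @StateObject; yours may create it differently):
struct ContentView: View {
    @StateObject private var arViewModel = ARViewModel()

    var body: some View {
        ARViewContainer(arViewModel: arViewModel)
            .edgesIgnoringSafeArea(.all)
            .onTapGesture(coordinateSpace: .global) { location in
                arViewModel.raycastFunc(location: location)
            }
    }
}
Because the ARView fills the entire screen here, the .global coordinate space lines up with the view’s own coordinates; if the AR view were inset inside other layout, .local would be the safer choice.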
And that’s it! If you run the project now, you should see that you can tap the screen to place spheres around your environment.

The full GitHub repository for this project can be found here: