A Guide to JigSpace-Style Anchors in RealityKit
Tired of floaty AR objects? Learn how to create stable, 'sticky' anchors in RealityKit, just like the pros at JigSpace. A practical guide for developers.
Alex Carter
An experienced iOS and AR developer passionate about creating intuitive user experiences.
Ever used an app like JigSpace and marveled at how smoothly its 3D models snap into place? You move your phone, and the object glides perfectly across a tabletop, waiting for you to tap. When you do, it sticks. It doesn't jitter, it doesn't drift, it just... exists. It feels like magic. Then you go back to your own RealityKit project, and your carefully placed model seems to have a life of its own, subtly shifting and re-adjusting as you move. What gives?
That buttery-smooth experience isn't magic; it's a clever technique that sidesteps the default behavior of ARKit's anchors. Today, we're pulling back the curtain and showing you how to implement this "JigSpace-style" anchoring in your own RealityKit apps. Get ready to take your AR user experience from good to absolutely seamless.
The Problem with Standard Anchors
To understand the solution, we first need to grasp the problem. When you use ARKit to detect a horizontal plane, it gives you an ARPlaneAnchor. This is fantastic for understanding the environment, but the anchor is not static. ARKit is constantly refining its understanding of the world, and as you move your device and it gathers more data, it updates the position, size, and orientation of that ARPlaneAnchor.
If your 3D model is a direct child of an entity based on that ARPlaneAnchor, it will move every time the anchor is updated. This is what causes the subtle (and sometimes not-so-subtle) jittering. The system is trying to be helpful by giving you the most accurate placement, but for a user who has already decided "I want it right here," these micro-adjustments feel like instability.
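To make this concrete, here's the pattern that produces the drift. A minimal sketch, where 'model' stands in for whatever entity you've loaded:
// An anchor that targets a plane: RealityKit keeps re-aligning it
// as ARKit refines its plane estimate, and the model moves with it
let planeAnchor = AnchorEntity(plane: .horizontal)
planeAnchor.addChild(model)
arView.scene.addAnchor(planeAnchor)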
The JigSpace approach flips this on its head: it uses ARKit's real-time updates for a preview, but once the user commits, it locks the object in a fixed position in world space.
Deconstructing the Magic: Preview and Commit
The secret lies in a two-step process:
- The Preview Phase (The "Ghost"): Before the user taps, a semi-transparent preview of the model, or a "ghost," continuously follows a viable surface. This is achieved by constantly shooting a ray from the center of the screen to find a surface. This gives the user instant, clear feedback on where the object could be placed.
- The Commit Phase (The "Sticky" Placement): When the user taps the screen, the app performs one final raycast to get a precise location. It then creates a new, static anchor at that exact world transform and places the final, fully opaque model there. This anchor isn't tied to an ARPlaneAnchor, so it won't be updated by ARKit. It's locked in place. (A small sketch of this flow follows below.)
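It can help to think of this flow as a tiny state machine. Here's one way to model it; a sketch of our own, not anything taken from JigSpace's actual code:
// The three states our placement flow moves through
enum PlacementState {
    case searching   // No surface under the screen center yet; ghost hidden
    case previewing  // Ghost is tracking the surface in real time
    case placed      // User tapped; model is locked to a world-space anchor
}
We won't formalize this enum in the snippets that follow, but every piece of code maps onto one of these three states.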
Let's build this step-by-step.
Step 1: Building the "Ghost" Preview
First, we need something to show the user. Let's assume you have a 3D model loaded, for instance, a toy car. We'll create two versions: the real one and a ghost one with a semi-transparent material.
Creating a Ghost Material
A simple way to create a ghost material is to use an UnlitMaterial with a translucent color. This avoids complex lighting calculations and looks clean.
// Create a semi-transparent material for our ghost
let ghostMaterial = UnlitMaterial(color: .white.withAlphaComponent(0.5))
// Note: if the ghost renders fully opaque on your RealityKit version, declare
// the material with 'var' and set ghostMaterial.blending = .transparent(opacity: 0.5)

// Clone your model and apply the ghost material
// We'll assume 'originalCarEntity' is a ModelEntity that's already loaded,
// and that 'ghostCarEntity' is a stored property we can update every frame
ghostCarEntity = originalCarEntity.clone(recursive: true)
ghostCarEntity.model?.materials = [ghostMaterial]

// We'll also need a static anchor to hold the ghost in our scene
let ghostAnchor = AnchorEntity()
ghostAnchor.addChild(ghostCarEntity)
arView.scene.addAnchor(ghostAnchor)
ghostCarEntity.isEnabled = false // Start with it hidden
Continuous Raycasting
To make the ghost follow surfaces, we need to perform a raycast from the center of the screen on every frame. The ARView's session delegate is the perfect place for this. Make sure your view controller conforms to ARSessionDelegate and assigns itself as the session's delegate; without that assignment, the delegate method never fires.
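If you're working in a plain UIViewController, the wiring might look like this. A minimal sketch; PlacementViewController, the arView outlet, and the ghostCarEntity property are our own stand-in names:
import UIKit
import RealityKit
import ARKit

class PlacementViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!
    var ghostCarEntity: ModelEntity!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Without this assignment, session(_:didUpdate:) is never called
        arView.session.delegate = self
    }
}
With the delegate in place, the per-frame raycast goes in session(_:didUpdate:):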
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Use the center of the view, in the view's own coordinate space.
    // (arView.center is in the superview's coordinates, which only happens
    // to work when the ARView fills the screen from the origin.)
    let screenCenter = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)

    // Perform a raycast to find a horizontal plane
    guard let query = arView.makeRaycastQuery(from: screenCenter,
                                              allowing: .estimatedPlane,
                                              alignment: .horizontal),
          let result = arView.session.raycast(query).first else {
        // If no surface is found, hide the ghost
        ghostCarEntity.isEnabled = false
        return
    }

    // If we found a surface, show and position the ghost
    ghostCarEntity.isEnabled = true
    // Update the ghost's transform to the raycast result's world transform
    ghostCarEntity.transform.matrix = result.worldTransform
}
With this code, you'll now see a ghostly car gliding across any horizontal surfaces your camera sees. It's a bit jittery, but we'll fix that later. First, let's handle the placement.
Step 2: The "Sticky" Placement
Now for the satisfying part: tapping to place the object permanently. We'll add a tap gesture recognizer to our ARView.
override func viewDidLoad() {
    super.viewDidLoad()
    // ... other setup ...
    let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
    arView.addGestureRecognizer(tapGesture)
}

@objc func handleTap(_ sender: UITapGestureRecognizer) {
    // Get the tap location
    let tapLocation = sender.location(in: arView)

    // Perform a raycast from the tap location
    guard let query = arView.makeRaycastQuery(from: tapLocation,
                                              allowing: .estimatedPlane,
                                              alignment: .horizontal),
          let result = arView.session.raycast(query).first else {
        // User tapped on a non-viable surface
        return
    }

    // Create a NEW anchor at the exact world position
    // This is the key! We use AnchorEntity(world:) instead of AnchorEntity(plane:)
    let finalAnchor = AnchorEntity(world: result.worldTransform)

    // Add your REAL model to this new, static anchor
    // Assuming 'realCarEntity' is your fully-textured model
    finalAnchor.addChild(realCarEntity.clone(recursive: true))

    // Add the new anchor to the scene
    arView.scene.addAnchor(finalAnchor)

    // Optional: Hide the ghost after placing an object
    ghostCarEntity.isEnabled = false
}
And that's the core logic! By creating an AnchorEntity with a specific world transform, we're telling RealityKit, "Put this here and leave it alone." It's no longer tethered to the ever-changing ARPlaneAnchor, resulting in a perfectly stable placement.
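One design note on the clone(recursive: true) call above: cloning means each tap produces its own copy, so the user can place several cars side by side. If your app should only ever show a single placed instance, you could add realCarEntity itself to the new anchor instead; addChild reparents the entity, so no clone is needed.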
Refining the Experience
The basic functionality is there, but a few refinements can elevate the experience from functional to fantastic.
Smoothing the Ghost's Movement
The ghost's movement can be jarring as it snaps to new positions every frame. We can smooth this out with a simple linear interpolation (LERP). Instead of instantly setting the ghost's transform, we'll move it a fraction of the way towards the target transform each frame.
Update your session(_:didUpdate:) method:
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // ... raycast logic as before ...
    guard let result = arView.session.raycast(query).first else {
        // ...
        return
    }
    ghostCarEntity.isEnabled = true

    // The new target transform
    let targetTransform = Transform(matrix: result.worldTransform)

    // Smoothly move the ghost a fraction of the way towards the target.
    // The '0.1' factor controls the speed of the smoothing: lower values
    // feel floatier, higher values snap more quickly.
    let smoothingFactor: Float = 0.1
    ghostCarEntity.transform = Transform.lerp(from: ghostCarEntity.transform, to: targetTransform, t: smoothingFactor)
}
// Transform doesn't ship with a built-in lerp, so we add one ourselves
extension Transform {
    static func lerp(from start: Transform, to end: Transform, t: Float) -> Transform {
        // Interpolate translation and scale linearly, and rotation spherically
        let newTranslation = mix(start.translation, end.translation, t: t)
        let newRotation = simd_slerp(start.rotation, end.rotation, t)
        let newScale = mix(start.scale, end.scale, t: t)
        return Transform(scale: newScale, rotation: newRotation, translation: newTranslation)
    }
}
This small change makes the ghost feel like it's gliding gracefully over surfaces, which is much more visually appealing.
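One caveat worth knowing: a fixed factor like 0.1 ties the smoothing speed to the frame rate, so the ghost converges faster at 60 fps than at 30 fps. If you want consistent behavior, you can derive the factor from the time between frames instead. A sketch, with halfLife as a tunable constant we're introducing here:
// Produces a frame-rate-independent smoothing factor.
// 'halfLife' is how long (in seconds) the ghost takes to close
// half of its remaining distance to the target transform.
func smoothingFactor(deltaTime: Float, halfLife: Float = 0.1) -> Float {
    return 1 - powf(0.5, deltaTime / halfLife)
}
You can compute deltaTime by keeping the previous frame.timestamp around and subtracting it from the current one inside session(_:didUpdate:).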
Guiding the User
Don't forget to help your user! The ARCoachingOverlayView is essential for telling users to move their phone around to find a surface. If no surface is found, the ghost won't appear, and the user might not know why. The coaching overlay solves this problem elegantly.
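Setting it up takes only a few lines. A minimal sketch, reusing the arView from earlier:
import ARKit

let coachingOverlay = ARCoachingOverlayView()
coachingOverlay.session = arView.session
coachingOverlay.goal = .horizontalPlane // Matches our raycast alignment
coachingOverlay.activatesAutomatically = true
coachingOverlay.frame = arView.bounds
coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
arView.addSubview(coachingOverlay)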
Conclusion: It's All About Control
The "JigSpace-style" anchor isn't a secret API; it's a design pattern centered on taking control back from the system. By separating the real-time preview from the final, committed placement, you provide an experience that feels both responsive and incredibly stable.
To recap the key points:
- Use continuous raycasting to power a "ghost" preview object.
- Smooth the ghost's movement with LERP for a polished feel.
- On user tap, perform a final raycast to get a precise location.
- Create a static AnchorEntity(world:) at that exact transform for a rock-solid placement.
Give this technique a try in your next RealityKit project. Your users will notice the difference, even if they can't quite put their finger on why it just feels so... right.