Jon Vogel - 1 month ago
Swift Question

Displaying a dae file with ARKit and tracking an anchor in the scene

I'm trying out ARKit, and I set up an ARSCNView using this tutorial.

Then I set up tracking of horizontal 3D planes with the second part of this tutorial.

I created a single view application, then constrained an ARSCNView flush to the root view with an outlet to my ViewController.

Here is the code in the ViewController:
import UIKit
import ARKit

class ViewController: UIViewController {

    //MARK: Properties
    @IBOutlet weak var arScene: ARSCNView!

    //MARK: ARKit variables
    var realityConfiguration: ARWorldTrackingSessionConfiguration?

    //MARK: Lifecycle
    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        self.prepare()
    }

    //MARK: Actions

    //MARK: Overrides
}

extension ViewController {
    func prepare() {
        //Check to see if active reality is supported
        guard ARSessionConfiguration.isSupported else {
            //Custom alert function that just quickly displays a UIAlertController
            AppDelegate.alert(title: "Not Supported", message: "Active Reality is not supported on this device")
            return
        }
        //Set up the ARSessionConfiguration
        self.realityConfiguration = ARWorldTrackingSessionConfiguration()
        //Set up the ARSCNView
        guard let config = self.realityConfiguration else {
            return
        }
        //Run the ARSCNView and set its delegate
        self.arScene.session.run(config)
        self.arScene.delegate = self
    }
}

extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        return nil
    }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planAnchor = anchor as? ARPlaneAnchor else { return }

        let plane = SCNPlane(width: CGFloat(planAnchor.extent.x), height: CGFloat(planAnchor.extent.z))
        let planeNode = SCNNode(geometry: plane)
        planeNode.position = SCNVector3Make(planAnchor.center.x, 0, planAnchor.center.z)
        //SCNPlane geometry is vertical by default; rotate it to lie flat
        planeNode.transform = SCNMatrix4MakeRotation(-Float.pi / 2, 1, 0, 0)

        node.addChildNode(planeNode)
    }

    func renderer(_ renderer: SCNSceneRenderer, willUpdate node: SCNNode, for anchor: ARAnchor) {
        print("Will update Node on Anchor: \(anchor.identifier)")
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        print("Did update Node on Anchor: \(anchor.identifier)")
    }

    func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor) {
        print("Removed Node on Anchor: \(anchor.identifier)")
    }
}

I downloaded the Xcode 9 beta, followed Apple's tutorials, and realized my phone (iPhone 6) does not have the A9 chip required for ARWorldTrackingSessionConfiguration.

About halfway down in the first tutorial I linked, Apple says you can still create AR experiences without the A9 chip. However, they don't go into further detail. Has anyone else found a starting point, or is anyone willing to provide a code example of using a .dae file by:

  • Choosing an anchor point to display it

  • Tracking that anchor point

  • Actually displaying the .dae file

Answer

Number 1: The challenge, demand satisfaction...

There's not really anything to see — just a live camera view. (I figure that's not really worth a screenshot.)

The main reason you're not seeing any augmentations in your reality is that your code adds SceneKit content to the scene only when anchors are added to the ARSession... but you're not manually adding any anchors, and you haven't enabled plane detection so ARKit isn't automatically adding anchors. If you enable plane detection, you'll start getting somewhere...

self.realityConfiguration = ARWorldTrackingSessionConfiguration()
realityConfiguration?.planeDetection = .horizontal

But you still won't see anything. That's because your ARSCNViewDelegate implementation has conflicting instructions. This part:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    return nil
}

...means that no SceneKit nodes will be created for your anchors. Because there are no nodes, your renderer(_:didAdd:for:) function is never called, so the code inside that method never creates any SceneKit content.

If you turn on plane detection and delete / comment out the renderer(_: nodeFor:) method, the rest of your code should get you something like this:
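Concretely, the two fixes can be sketched like this (a sketch against the iOS 11 beta API, using the property names from the question; `ViewController` is the question's class):

```swift
import ARKit

extension ViewController {
    func prepareWithPlaneDetection() {
        // Fix 1: turn on plane detection before running the session,
        // so ARKit starts adding ARPlaneAnchors on its own.
        self.realityConfiguration = ARWorldTrackingSessionConfiguration()
        self.realityConfiguration?.planeDetection = .horizontal
        guard let config = self.realityConfiguration else { return }
        self.arScene.session.run(config)
        self.arScene.delegate = self
    }

    // Fix 2: returning a (non-nil) empty node here is equivalent to
    // deleting the method entirely -- either way ARKit has a node for the
    // anchor and will call renderer(_:didAdd:for:) with it.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        return SCNNode()
    }
}
```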

(Screenshot: the detected plane rendered as a white rectangle over the live camera view.)

(The pure white area is your SCNPlane. I had to unfold my iPad cover on the white table to get enough scene detail for plane detection to find anything. Also, check the background... there was actually a moment at WWDC today where the merch store wasn't packed full of people.)

Number 2: Grab a friend...

Apple's messaging is a little unclear here. When they say ARKit requires A9 or better, what they really mean is ARWorldTrackingSessionConfiguration does. And that's where all the best AR magic is. (There's even a UIRequiredDeviceCapabilities key for arkit that actually covers devices with world tracking support, so you can restrict your app on the App Store to being offered only to those devices.)
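For reference, that restriction is just an Info.plist entry (standard UIRequiredDeviceCapabilities usage; `arkit` is the capability string in question):

```xml
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>arkit</string>
</array>
```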

There's still some ARKit without world tracking, though. Run a session with the base class ARSessionConfiguration and you get orientation-only tracking. (No position tracking, no plane detection, no hit testing.)

What does that get you? Well, if you've played the current version of Pokémon GO, it works like that: because it tracks only device orientation, not position, you can't get closer to Pikachu or walk around behind him — the illusion that he occupies the real world holds as long as you only tilt or turn your device without moving it.

Number 3: If they don't, that's alright...

You load 3D content with SceneKit and place it in AR just like you load and place it in any other SceneKit app/game. There are plenty of resources out there for this, and lots of ways to do it. One of them you'll find in the Xcode template when you create a new AR project and choose SceneKit. The loading part goes something like this:

let scene = SCNScene(named: "ship.scn", inDirectory: "assets.scnassets")!
let ship = scene.rootNode.childNode(withName: "ship", recursively: true)!

Then to place it:

ship.simdPosition = float3(0, 0, -0.5) 
// half a meter in front of the *initial* camera position

The main difference to remember for placing things in AR is that positions are measured in meters (and your content needs to be designed so that it's sensibly sized in meters).
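That sizing point trips people up with .dae files, which DCC tools often export in centimeters or arbitrary units. A quick sketch of working out a corrective scale factor (plain arithmetic; `modelHeight` is a hypothetical bounding-box height you'd read from the loaded node):

```swift
// In ARKit/SceneKit, 1 unit = 1 meter, so rescale the model to a
// believable real-world size.
let modelHeight: Float = 200.0   // hypothetical: model is 200 units tall in its file
let desiredHeight: Float = 2.0   // we want it to appear 2 meters tall
let scale = desiredHeight / modelHeight
print(scale)                     // 0.01
// Then apply it to the loaded node:
// ship.scale = SCNVector3(scale, scale, scale)
```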