© Wallace Wang 2018
Wallace Wang, Beginning ARKit for iPhone and iPad, https://doi.org/10.1007/978-1-4842-4102-8_11

11. Plane Detection

Wallace Wang1 
(1)
San Diego, CA, USA
 

Placing virtual objects in mid-air is fine, but augmented reality works best when it also interacts with the real world. One of the most basic ways augmented reality interacts with the real world is through detecting horizontal or vertical planes. When ARKit can detect a flat surface, it can later place a virtual object so that it appears to be resting on that real flat surface, such as a table or floor.

Each time ARKit detects a plane, it places an anchor in that augmented reality view. This anchor, one per plane, contains information about the plane’s:
  • Orientation

  • Position

  • Size

As you move an iOS device’s camera around, ARKit constantly updates its information about a plane. Typically this means recognizing that a plane is larger than it initially appeared. For example, when you first point an iOS device’s camera at a floor, ARKit can only recognize the portion of the floor that the camera sees. As you move the camera around, ARKit detects other parts of the floor, forcing it to update its information on how large the plane really is.
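The orientation, position, and size that ARKit tracks for each plane are exposed through the ARPlaneAnchor class. As a minimal sketch (the helper function name is my own; the properties are ARKit’s):

```swift
import ARKit

// Hypothetical helper that logs what ARKit stores for a detected plane.
func describePlane(_ planeAnchor: ARPlaneAnchor) {
    // alignment: whether the detected surface is horizontal or vertical
    print("alignment:", planeAnchor.alignment == .horizontal ? "horizontal" : "vertical")
    // center: the plane's center point, relative to the anchor's transform
    print("center:", planeAnchor.center)
    // extent: the plane's estimated width (x) and depth (z), in meters
    print("extent:", planeAnchor.extent.x, "by", planeAnchor.extent.z)
}
```

We’ll use the center and extent properties later in this chapter to size a virtual plane.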

For this chapter, let’s create a new Xcode project by following these steps:
  1. Start Xcode. (Make sure you’re using Xcode 10 or greater.)

  2. Choose File ➤ New ➤ Project. Xcode asks you to choose a template.

  3. Click the iOS category.

  4. Click the Single View App icon and click the Next button. Xcode asks for a product name, organization name, organization identifier, and content technology.

  5. Click in the Product Name text field and type a descriptive name for your project, such as PlaneDetection. (The exact name does not matter.)

  6. Click the Next button. Xcode asks where you want to store your project.

  7. Choose a folder and click the Create button. Xcode creates an iOS project.

Now modify the Info.plist file to allow access to the camera and to use ARKit by following these steps:
  1. Click the Info.plist file in the Navigator pane. Xcode displays a list of keys, types, and values.

  2. Click the disclosure triangle to expand the Required Device Capabilities category to display Item 0.

  3. Move the mouse pointer over Item 0 to display a plus (+) icon.

  4. Click this plus (+) icon to display a blank Item 1.

  5. Type arkit under the Value category in the Item 1 row.

  6. Move the mouse pointer over the last row to display a plus (+) icon.

  7. Click on the plus (+) icon to create a new row. A popup menu appears.

  8. Choose Privacy – Camera Usage Description.

  9. Type AR needs to use the camera under the Value category in the Privacy – Camera Usage Description row.

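Viewed as source (right-click Info.plist ➤ Open As ➤ Source Code), the two entries you just added look roughly like this (Item 0 is typically armv7 for a Single View App; the raw key names are what Xcode stores behind the friendlier labels):

```xml
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>armv7</string>
    <string>arkit</string>
</array>
<key>NSCameraUsageDescription</key>
<string>AR needs to use the camera</string>
```
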
Now it’s time to modify the ViewController.swift file to use ARKit and SceneKit by following these steps:
  1. Click on the ViewController.swift file in the Navigator pane.

  2. Edit the ViewController.swift file so it looks like this:

    import UIKit
    import SceneKit
    import ARKit
    class ViewController: UIViewController, ARSCNViewDelegate {
        let configuration = ARWorldTrackingConfiguration()
        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view, typically from a nib.
        }
    }

To view augmented reality in our app, add a single ARKit SceneKit View (ARSCNView) so the user interface looks similar to Figure 11-1.
Figure 11-1

The user interface includes a single ARSCNView

After you’ve designed your user interface, you need to add constraints. To add constraints, choose Editor ➤ Resolve Auto Layout Issues ➤ Reset to Suggested Constraints at the bottom half of the menu under the All Views in Container category.

The next step is to connect the user interface items to the Swift code in the ViewController.swift file. To do this, follow these steps:
  1. Click the Main.storyboard file in the Navigator pane.

  2. Click the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor to display the Main.storyboard and the ViewController.swift file side by side.

  3. Move the mouse pointer over the ARSCNView, hold down the Control key, and Ctrl-drag under the class ViewController line.

  4. Release the Control key and the left mouse button. A popup menu appears.

  5. Click in the Name text field and type sceneView, then click the Connect button. Xcode creates an IBOutlet as shown here:

    @IBOutlet var sceneView: ARSCNView!

  6. Edit the viewDidLoad function so it looks like this:

        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view, typically from a nib.
            sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
            sceneView.delegate = self
            configuration.planeDetection = .horizontal
            sceneView.session.run(configuration)
        }

  7. Edit the viewWillAppear function so it looks like this:

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            sceneView.session.run(configuration)
        }

Notice that to detect a horizontal plane, we just need one line of code:
configuration.planeDetection = .horizontal
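As an aside, planeDetection is an option set, so ARKit (11.3 and later) can also look for horizontal and vertical planes at the same time. A sketch, shown here only for reference since this chapter sticks with .horizontal:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
// Detect floors/tables and walls together; this chapter uses .horizontal alone.
configuration.planeDetection = [.horizontal, .vertical]
```
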

Detecting a horizontal plane requires ARKit to identify enough feature points (those tiny yellow dots) on a flat, horizontal surface. To increase the odds that ARKit will detect a horizontal plane, aim your iOS device’s camera at a flat surface with plenty of texture or color variation, such as a bed, a rug or carpet, or a table. In comparison, a solid white floor will be much harder to detect since it offers far less detail for ARKit to track.

To detect if ARKit has identified a horizontal plane, we’ll need a didAdd renderer function. This function runs each time ARKit identifies a horizontal plane, which it represents as a plane anchor called ARPlaneAnchor that defines the position and orientation of the flat surface. Add the following didAdd renderer function in your ViewController.swift file:
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        print("plane detected")
    }

The first time ARKit identifies a horizontal plane, it assumes the horizontal plane is only as large as what it sees through the iOS device’s camera. As you move the camera around, ARKit spots additional parts of the horizontal plane. When that occurs, it updates its plane anchor information to store the larger dimensions of the horizontal plane.

Each time ARKit updates its ARPlaneAnchor information by realizing a horizontal plane may be larger, it runs a didUpdate renderer function. Add the following didUpdate renderer function in the ViewController.swift file:
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        print("updating plane anchor")
    }

In both of these renderer functions, an initial guard statement checks whether the anchor identifies a horizontal plane (ARPlaneAnchor). If not, the renderer function exits. If it does, each renderer function prints a message (“plane detected” or “updating plane anchor”).
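As an optional refinement, the guard can bind the plane anchor directly with as?, which avoids the force casts (as!) used later in this chapter. A sketch of the didAdd renderer function written that way:

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // Check and cast in one step; exit early if this is not a plane anchor.
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    print("plane detected with extent", planeAnchor.extent)
}
```

Either style works; the book’s version keeps the check and the cast as two separate steps.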

The entire ViewController.swift file should look like this:
import UIKit
import SceneKit
import ARKit
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        sceneView.delegate = self
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        print("plane detected")
    }
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        print("updating plane anchor")
    }
}
To test this project, follow these steps:
  1. Connect an iOS device to your Macintosh through its USB cable.

  2. Click the Run button or choose Product ➤ Run. The first time you run this app, it will ask permission to access the camera, so give it permission.

  3. Aim the iOS device’s camera at a horizontal plane such as the seat of a chair or the floor. The first time ARKit identifies a horizontal plane, the Xcode debug area displays the message “plane detected”.

  4. Move the iOS device around to capture more of the horizontal plane. Each time ARKit recognizes a new part of the horizontal plane, the Xcode debug area displays the message “updating plane anchor”.

  5. Click the Stop button or choose Product ➤ Stop.


Displaying Planes as Images

Once we can get ARKit to recognize horizontal surfaces, we can place an image on those surfaces. First, you need an image; try searching for “texture image public domain” in your favorite search engine. You can then download any texture image you want, such as a brick sidewalk, a wooden floor, or the rippling surface of water.

To add a .png or .jpg image to an Xcode project, simply drag and drop that image into the Navigator pane of your Xcode project, as shown in Figure 11-2.
Figure 11-2

Drag and drop an image into the Navigator pane

Once you’ve added a texture image for a plane, the next step is to create a textured plane and add it to the sceneView root node inside the didAdd renderer function like this:
        let planeNode = displayTexture()
        sceneView.scene.rootNode.addChildNode(planeNode)
To create the plane, we’ll need a function called displayTexture, which creates an SCNNode, defines that node as an SCNPlane with a width and height of 0.5 meters, and places it at the position (0, 0, -0.5). Most importantly, the SCNPlane needs to use the texture image you dragged into the Navigator pane, such as an image named water.jpg (change this name to the name of the image you dragged into your Xcode Navigator pane). The displayTexture function should look like this:
    func displayTexture() -> SCNNode {
        let planeNode = SCNNode()
        planeNode.geometry = SCNPlane(width: 0.5, height: 0.5)
        planeNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "water.jpg")
        planeNode.position = SCNVector3(0, 0, -0.5)
        return planeNode
    }
If you run this code, it will display water.jpg on a vertical plane, as shown in Figure 11-3.
Figure 11-3

Displaying an image on a plane

One problem is that the plane appears vertically, so we’ll need to rotate it 90 degrees around the x-axis to make it lie flat. Another problem is that if you look behind the plane, the image only appears on one side. To make the image appear on both sides of the plane, we need to define the plane as double-sided like this:
planeNode.geometry?.firstMaterial?.isDoubleSided = true
Then we need to rotate the plane around the x-axis by 90 degrees. Remember, SceneKit measures angles in radians, so we’ll first need to convert 90 degrees into radians like this:
let ninetyDegrees = GLKMathDegreesToRadians(90)
Then we can rotate the plane around its x-axis by defining its eulerAngles position like this:
planeNode.eulerAngles = SCNVector3(ninetyDegrees, 0, 0)
The entire displayTexture function should look like this:
    func displayTexture() -> SCNNode {
        let planeNode = SCNNode()
        planeNode.geometry = SCNPlane(width: 0.5, height: 0.5)
        planeNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "water.jpg")
        planeNode.position = SCNVector3(0, 0, -0.5)
        let ninetyDegrees = GLKMathDegreesToRadians(90)
        planeNode.eulerAngles = SCNVector3(ninetyDegrees, 0, 0)
        planeNode.geometry?.firstMaterial?.isDoubleSided = true
        return planeNode
    }
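GLKMathDegreesToRadians comes from the GLKit framework. If you prefer not to depend on GLKit, the conversion is easy to write yourself; here is a minimal, framework-free sketch (the helper name is my own):

```swift
import Foundation

// Hypothetical stand-in for GLKMathDegreesToRadians.
func degreesToRadians(_ degrees: Float) -> Float {
    return degrees * .pi / 180
}

let ninetyDegrees = degreesToRadians(90)  // about 1.5708 radians
```
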
The entire ViewController.swift file should look like this to display a horizontal plane with an image:
import UIKit
import SceneKit
import ARKit
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        sceneView.delegate = self
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
        let planeNode = displayTexture()
        sceneView.scene.rootNode.addChildNode(planeNode)
    }
    func displayTexture() -> SCNNode {
        let planeNode = SCNNode()
        planeNode.geometry = SCNPlane(width: 0.5, height: 0.5)
        planeNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "water.jpg")
        planeNode.position = SCNVector3(0, 0, -0.5)
        let ninetyDegrees = GLKMathDegreesToRadians(90)
        planeNode.eulerAngles = SCNVector3(ninetyDegrees, 0, 0)
        planeNode.geometry?.firstMaterial?.isDoubleSided = true
        return planeNode
    }
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        print("plane detected")
    }
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        print("updating plane anchor")
    }
}
If you run this code, you’ll see a horizontal plane displaying the texture image on both the top and bottom, as shown in Figure 11-4.
Figure 11-4

The plane appearing horizontally

Right now the plane appears at an arbitrary size and position. What we want is to make that plane appear where ARKit detects a horizontal plane, such as a floor or table top.

First, delete these two lines from the viewDidLoad function:
        let planeNode = displayTexture()
        sceneView.scene.rootNode.addChildNode(planeNode)
Now, in the didAdd renderer function, add the following two lines:
        let planeNode = displayTexture(anchor: anchor as! ARPlaneAnchor)
        node.addChildNode(planeNode)
The entire didAdd renderer function should look like this:
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let planeNode = displayTexture(anchor: anchor as! ARPlaneAnchor)
        node.addChildNode(planeNode)
        print("plane detected")
    }

Now, once the didAdd renderer function detects a horizontal plane, it receives the size and position of that horizontal plane as an ARPlaneAnchor. We take this size and position information and pass it to the displayTexture function that creates the actual plane.

That means we need to change the displayTexture function so it can accept a parameter like this:
    func displayTexture(anchor: ARPlaneAnchor) -> SCNNode {
Within the displayTexture function, we need to modify both the size and the position of the plane, which now come from the anchor parameter passed into the function. First, we define the plane’s size based on the anchor by changing the plane’s dimensions from arbitrary fixed values to the size of the anchor, like this:
     planeNode.geometry = SCNPlane(width: CGFloat(anchor.extent.x), height: CGFloat(anchor.extent.z))
Next, we need to define the horizontal plane’s position based on the anchor like this:
     planeNode.position = SCNVector3(anchor.center.x, anchor.center.y, anchor.center.z)
The entire modified ViewController.swift file should now look like this:
import UIKit
import SceneKit
import ARKit
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        sceneView.delegate = self
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }
    func displayTexture(anchor: ARPlaneAnchor) -> SCNNode {
        let planeNode = SCNNode()
        planeNode.geometry = SCNPlane(width: CGFloat(anchor.extent.x), height: CGFloat(anchor.extent.z))
        planeNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "water.jpg")
        planeNode.position = SCNVector3(anchor.center.x, anchor.center.y, anchor.center.z)
        let ninetyDegrees = GLKMathDegreesToRadians(90)
        planeNode.eulerAngles = SCNVector3(ninetyDegrees, 0, 0)
        planeNode.geometry?.firstMaterial?.isDoubleSided = true
        return planeNode
    }
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let planeNode = displayTexture(anchor: anchor as! ARPlaneAnchor)
        node.addChildNode(planeNode)
        print("plane detected")
    }
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        print("updating plane anchor")
    }
}

If you run this code, you’ll notice that when the app identifies a horizontal plane, it places a plane, displaying a texture image, in that location. However, one additional problem remains. ARKit constantly scans the real world and updates the ARPlaneAnchor for what it identifies as the horizontal plane, calling the didUpdate renderer function each time.

That means we need to add code inside this didUpdate renderer function so when it recognizes that the horizontal plane is larger, it will display a larger virtual plane in that area.

The didUpdate renderer function runs every time ARKit detects that the horizontal plane is larger. Each time, it needs to remove the currently displayed horizontal plane before adding an updated one. To do this, we use an enumerateChildNodes loop to remove the old horizontal plane like this:
        node.enumerateChildNodes { (childNode, _) in
            childNode.removeFromParentNode()
        }
After removing the old horizontal plane, we need to add a new one, so the entire didUpdate renderer function should look like this:
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        node.enumerateChildNodes { (childNode, _) in
            childNode.removeFromParentNode()
        }
        let planeNode = displayTexture(anchor: anchor as! ARPlaneAnchor)
        node.addChildNode(planeNode)
        print("updating plane anchor")
    }
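Removing and recreating the plane node works, but as an alternative sketch, you could resize the existing SCNPlane in place instead, which avoids rebuilding the node’s geometry and material on every update. This assumes the plane node added in didAdd is the node’s only child:

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          let planeNode = node.childNodes.first,
          let plane = planeNode.geometry as? SCNPlane else { return }
    // Grow the existing plane to ARKit's updated size estimate...
    plane.width = CGFloat(planeAnchor.extent.x)
    plane.height = CGFloat(planeAnchor.extent.z)
    // ...and re-center it within the anchor.
    planeNode.position = SCNVector3(planeAnchor.center.x, planeAnchor.center.y, planeAnchor.center.z)
}
```
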
The entire ViewController.swift file should now look like this:
import UIKit
import SceneKit
import ARKit
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        sceneView.delegate = self
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }
    func displayTexture(anchor: ARPlaneAnchor) -> SCNNode {
        let planeNode = SCNNode()
        planeNode.geometry = SCNPlane(width: CGFloat(anchor.extent.x), height: CGFloat(anchor.extent.z))
        planeNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "water.jpg")
        planeNode.position = SCNVector3(anchor.center.x, anchor.center.y, anchor.center.z)
        let ninetyDegrees = GLKMathDegreesToRadians(90)
        planeNode.eulerAngles = SCNVector3(ninetyDegrees, 0, 0)
        planeNode.geometry?.firstMaterial?.isDoubleSided = true
        return planeNode
    }
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let planeNode = displayTexture(anchor: anchor as! ARPlaneAnchor)
        node.addChildNode(planeNode)
        print("plane detected")
    }
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        node.enumerateChildNodes { (childNode, _) in
            childNode.removeFromParentNode()
        }
        let planeNode = displayTexture(anchor: anchor as! ARPlaneAnchor)
        node.addChildNode(planeNode)
        print("updating plane anchor")
    }
}
To test this code, follow these steps:
  1. Connect an iOS device to your Macintosh through its USB cable.

  2. Click the Run button or choose Product ➤ Run. The first time you run this app, it will ask permission to access the camera, so give it permission.

  3. Aim the iOS device’s camera at a horizontal plane such as the seat of a chair or the floor. The first time ARKit identifies a horizontal plane, the Xcode debug area displays the message “plane detected”. That’s when ARKit first displays the virtual plane showing the texture image you dragged and dropped into the Navigator pane of Xcode.

  4. Move the iOS device around to capture more of the horizontal plane. Each time ARKit recognizes a new part of the horizontal plane, the Xcode debug area displays the message “updating plane anchor”. Notice that the horizontal virtual plane keeps expanding in size.

  5. Click the Stop button or choose Product ➤ Stop.


Placing Virtual Objects on a Horizontal Plane

One way users can interact with augmented reality is by placing virtual objects on horizontal planes identified in the real world, such as a floor or table top. This involves first detecting the horizontal plane, then placing a virtual object on that detected horizontal plane.

Let’s create a new Xcode project by following these steps:
  1. Start Xcode. (Make sure you’re using Xcode 10 or greater.)

  2. Choose File ➤ New ➤ Project. Xcode asks you to choose a template.

  3. Click the iOS category.

  4. Click the Single View App icon and click the Next button. Xcode asks for a product name, organization name, organization identifier, and content technology.

  5. Click in the Product Name text field and type a descriptive name for your project, such as PlacingObjects. (The exact name does not matter.)

  6. Click the Next button. Xcode asks where you want to store your project.

  7. Choose a folder and click the Create button. Xcode creates an iOS project.

     
Now modify the Info.plist file to allow access to the camera and to use ARKit by following these steps:
  1. Click the Info.plist file in the Navigator pane. Xcode displays a list of keys, types, and values.

  2. Click the disclosure triangle to expand the Required Device Capabilities category to display Item 0.

  3. Move the mouse pointer over Item 0 to display a plus (+) icon.

  4. Click this plus (+) icon to display a blank Item 1.

  5. Type arkit under the Value category in the Item 1 row.

  6. Move the mouse pointer over the last row to display a plus (+) icon.

  7. Click on the plus (+) icon to create a new row. A popup menu appears.

  8. Choose Privacy – Camera Usage Description.

  9. Type AR needs to use the camera under the Value category in the Privacy – Camera Usage Description row.

     
Now it’s time to modify the ViewController.swift file to use ARKit and SceneKit by following these steps:
  1. Click on the ViewController.swift file in the Navigator pane.

  2. Edit the ViewController.swift file so it looks like this:

    import UIKit
    import SceneKit
    import ARKit
    class ViewController: UIViewController, ARSCNViewDelegate {
        let configuration = ARWorldTrackingConfiguration()
        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view, typically from a nib.
        }
    }


To view augmented reality in our app, add a single ARKit SceneKit View (ARSCNView) so it fills the entire user interface (see Figure 11-1).

After you’ve designed your user interface, you need to add constraints. To add constraints, choose Editor ➤ Resolve Auto Layout Issues ➤ Reset to Suggested Constraints at the bottom half of the menu under the All Views in Container category.

The next step is to connect the user interface items to the Swift code in the ViewController.swift file. To do this, follow these steps:
  1. Click the Main.storyboard file in the Navigator pane.

  2. Click the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor to display the Main.storyboard and the ViewController.swift file side by side.

  3. Move the mouse pointer over the ARSCNView, hold down the Control key, and Ctrl-drag under the class ViewController line.

  4. Release the Control key and the left mouse button. A popup menu appears.

  5. Click in the Name text field and type sceneView, then click the Connect button. Xcode creates an IBOutlet as shown here:

    @IBOutlet var sceneView: ARSCNView!

  6. Edit the viewDidLoad function so it looks like this:

        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view, typically from a nib.
            sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
            sceneView.delegate = self
            configuration.planeDetection = .horizontal
            sceneView.session.run(configuration)
        }

This app currently detects horizontal planes, but we also need it to accept tap gestures to place a virtual object on a detected horizontal plane. To place a tap gesture recognizer in our app, follow these steps:
  1. Click the ViewController.swift file in the Navigator pane.

  2. Add the following two lines to the end of the viewDidLoad function:

        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(tapResponse))
        sceneView.addGestureRecognizer(tapGesture)

  3. Type the following underneath the viewDidLoad function:

        @objc func tapResponse(sender: UITapGestureRecognizer) {
            let scene = sender.view as! ARSCNView
            let tapLocation = sender.location(in: scene)
            let hitTest = scene.hitTest(tapLocation, types: .existingPlaneUsingExtent)
            if hitTest.isEmpty {
                print("no plane detected")
            } else {
                print("found a horizontal plane")
            }
        }

  4. Connect an iOS device to your Macintosh through its USB cable.

  5. Click the Run button or choose Product ➤ Run.

  6. Point your iOS device’s camera at a flat, horizontal surface until you see plenty of yellow feature points appearing. Then tap on the screen. The debug area of Xcode should display the “found a horizontal plane” message.

  7. Point your iOS device’s camera at a wall. Then tap on the screen. The debug area of Xcode should display the “no plane detected” message.

  8. Click the Stop button or choose Product ➤ Stop.

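The types parameter of hitTest(_:types:) is an option set, and .existingPlaneUsingExtent only matches taps that land within a detected plane’s current estimated size. As a brief sketch of some alternative options (using the same scene and tapLocation values as in the tapResponse function above):

```swift
// Match any detected plane, ignoring its estimated extent:
let looseHits = scene.hitTest(tapLocation, types: .existingPlane)

// Fall back to individual feature points when no plane has been detected yet:
let fallbackHits = scene.hitTest(tapLocation, types: [.existingPlaneUsingExtent, .featurePoint])
```
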
This code shows that we can detect horizontal planes and recognize a tap gesture. This tap gesture will be used to place a virtual object when ARKit recognizes a horizontal plane. To do this, we must first modify the tapResponse function with two additional lines below the print("found a horizontal plane") line like this:
        guard let hitResult = hitTest.first else { return }
        addObject(hitResult: hitResult)

The first guard line makes sure that the user tapped on a horizontal plane. Then the second line calls an addObject function and passes it the location where the user tapped.

Next, we’ll need to create an addObject function like this:
    func addObject(hitResult: ARHitTestResult) {
    }
This function receives the location on the horizontal plane sent by the tapResponse function. Each time the user taps the screen on a horizontal plane, we’ll add an orange pyramid, so we can write the following code inside the addObject function:
    func addObject(hitResult: ARHitTestResult) {
        let objectNode = SCNNode()
        objectNode.geometry = SCNPyramid(width: 0.1, height: 0.2, length: 0.1)
        objectNode.geometry?.firstMaterial?.diffuse.contents = UIColor.orange
    }
Finally, we need to define the position of each pyramid. The x, y, and z positions where the user tapped are stored in a 4x4 matrix called worldTransform. These values live in the matrix’s last column (columns.3), so the last two lines in the addObject function look like this:
    objectNode.position = SCNVector3(hitResult.worldTransform.columns.3.x, hitResult.worldTransform.columns.3.y, hitResult.worldTransform.columns.3.z)
    sceneView.scene.rootNode.addChildNode(objectNode)
The first line retrieves the x, y, and z world coordinates of the tapped point, while the second line places the virtual object (the orange pyramid) where the user tapped. The entire ViewController.swift file should look like this:
import UIKit
import SceneKit
import ARKit
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        sceneView.delegate = self
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(tapResponse))
        sceneView.addGestureRecognizer(tapGesture)
    }
    @objc func tapResponse(sender: UITapGestureRecognizer) {
        let scene = sender.view as! ARSCNView
        let tapLocation = sender.location(in: scene)
        let hitTest = scene.hitTest(tapLocation, types: .existingPlaneUsingExtent)
        if hitTest.isEmpty {
            print("no plane detected")
        } else {
            print("found a horizontal plane")
            guard let hitResult = hitTest.first else { return }
            addObject(hitResult: hitResult)
        }
    }
    func addObject(hitResult: ARHitTestResult) {
        let objectNode = SCNNode()
        objectNode.geometry = SCNPyramid(width: 0.1, height: 0.2, length: 0.1)
        objectNode.geometry?.firstMaterial?.diffuse.contents = UIColor.orange
        objectNode.position = SCNVector3(hitResult.worldTransform.columns.3.x, hitResult.worldTransform.columns.3.y, hitResult.worldTransform.columns.3.z)
        sceneView.scene.rootNode.addChildNode(objectNode)
    }
}
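Why column 3? ARKit transforms are column-major 4x4 matrices: the first three columns hold rotation and scale, and the fourth column holds translation. The following standalone sketch uses plain arrays standing in for the matrix type, with a made-up translation value, just to illustrate the layout:

```swift
// A 4x4 transform stored as an array of columns, mirroring ARKit's
// column-major worldTransform. Columns 0-2 hold rotation/scale;
// column 3 holds the translation (x, y, z, 1).
// The translation (0.5, 0.0, -1.2) is a made-up example value.
let worldTransform: [[Float]] = [
    [1, 0, 0, 0],         // column 0
    [0, 1, 0, 0],         // column 1
    [0, 0, 1, 0],         // column 2
    [0.5, 0.0, -1.2, 1]   // column 3: the translation
]

// Pull the position out of the fourth column, just as
// hitResult.worldTransform.columns.3 does in the app.
let x = worldTransform[3][0]
let y = worldTransform[3][1]
let z = worldTransform[3][2]
print("position: (\(x), \(y), \(z))")
```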
To test this code, follow these steps:

1. Connect an iOS device to your Macintosh through its USB cable.

2. Click the Run button or choose Product ➤ Run. The first time you run this app, it asks for permission to access the camera, so grant it.

3. Aim the iOS device's camera at a horizontal plane, such as the seat of a chair or the floor, and move it slowly until yellow feature points appear on the surface.

4. Tap the screen. If ARKit has detected a horizontal plane where you tapped, the Xcode debug area displays the message "found a horizontal plane" and an orange pyramid appears at that spot, as shown in Figure 11-5. Otherwise, the debug area displays "no plane detected". Repeat Steps 3 and 4 to add more orange pyramids to the detected horizontal plane.
../images/469983_1_En_11_Chapter/469983_1_En_11_Fig5_HTML.jpg
Figure 11-5

Adding orange pyramids to a horizontal plane

5. Click the Stop button or choose Product ➤ Stop.

Detecting Vertical Planes

Detecting vertical planes is identical to detecting horizontal planes. Instead of defining planeDetection as .horizontal, you define it as .vertical like this:
      configuration.planeDetection = .vertical
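The planeDetection property is an option set, so an app that needs both kinds of surfaces can request both at once. This is a configuration fragment, not a full listing; note that vertical detection (and therefore the option-set form shown here) requires iOS 11.3 or later:

```swift
// Detect both horizontal surfaces (floors, tables) and
// vertical surfaces (walls, doors) in the same session.
configuration.planeDetection = [.horizontal, .vertical]
```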
Now your app can detect vertical planes such as walls instead of only horizontal planes like floors. To see how detecting vertical planes works, simply modify the PlaneDetection project by editing the viewDidLoad function so it detects vertical planes. The entire viewDidLoad function should look like this:
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        sceneView.delegate = self
        configuration.planeDetection = .vertical
        sceneView.session.run(configuration)
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(tapResponse))
        sceneView.addGestureRecognizer(tapGesture)
    }
In the tapResponse function, simply replace horizontal with vertical in the print statement so the entire tapResponse function looks like this:
    @objc func tapResponse(sender: UITapGestureRecognizer) {
        let scene = sender.view as! ARSCNView
        let tapLocation = sender.location(in: scene)
        let hitTest = scene.hitTest(tapLocation, types: .existingPlaneUsingExtent)
        if hitTest.isEmpty {
            print("no plane detected")
        } else {
            print("found a vertical plane")
            guard let hitResult = hitTest.first else { return }
            addObject(hitResult: hitResult)
        }
    }
Search the Internet for an image of your favorite painting and drag it into the Navigator pane, as shown in Figure 11-6.
../images/469983_1_En_11_Chapter/469983_1_En_11_Fig6_HTML.jpg
Figure 11-6

Adding an image of a painting

Most of the changes will need to occur in the addObject function. First, we need to create a plane node, define the plane's size, and display the painting image on that plane like this:
    let objectNode = SCNNode()
    objectNode.geometry = SCNPlane(width: 0.3, height: 0.3)
    objectNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "Mona_Lisa.jpg")

This code adds an image called Mona_Lisa.jpg to the plane, but you’ll need to replace this image name with the name of the image you dragged and dropped into the Navigator pane.

The entire addObject function should look like this:
    func addObject(hitResult: ARHitTestResult) {
        let objectNode = SCNNode()
        objectNode.geometry = SCNPlane(width: 0.3, height: 0.3)
        objectNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "Mona_Lisa.jpg")
        objectNode.position = SCNVector3(hitResult.worldTransform.columns.3.x, hitResult.worldTransform.columns.3.y, hitResult.worldTransform.columns.3.z)
        sceneView.scene.rootNode.addChildNode(objectNode)
    }
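One optional tweak: an SCNPlane renders only one side by default, so the painting disappears if you view it from behind. If you want the image visible from both sides, you can set the material's standard SceneKit isDoubleSided property when creating the node, for example:

```swift
// Render the painting's material on both faces of the plane.
objectNode.geometry?.firstMaterial?.isDoubleSided = true
```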
The entire ViewController.swift file should look like this:
import UIKit
import SceneKit
import ARKit
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        sceneView.delegate = self
        configuration.planeDetection = .vertical
        sceneView.session.run(configuration)
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(tapResponse))
        sceneView.addGestureRecognizer(tapGesture)
    }
    @objc func tapResponse(sender: UITapGestureRecognizer) {
        let scene = sender.view as! ARSCNView
        let tapLocation = sender.location(in: scene)
        let hitTest = scene.hitTest(tapLocation, types: .existingPlaneUsingExtent)
        if hitTest.isEmpty {
            print("no plane detected")
        } else {
            print("found a vertical plane")
            guard let hitResult = hitTest.first else { return }
            addObject(hitResult: hitResult)
        }
    }
    func addObject(hitResult: ARHitTestResult) {
        let objectNode = SCNNode()
        objectNode.geometry = SCNPlane(width: 0.3, height: 0.3)
        objectNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "Mona_Lisa.jpg")
        objectNode.position = SCNVector3(hitResult.worldTransform.columns.3.x, hitResult.worldTransform.columns.3.y, hitResult.worldTransform.columns.3.z)
        sceneView.scene.rootNode.addChildNode(objectNode)
    }
}
To test this code, follow these steps:

1. Connect an iOS device to your Macintosh through its USB cable.

2. Click the Run button or choose Product ➤ Run. The first time you run this app, it asks for permission to access the camera, so grant it.

3. Aim the iOS device's camera at a vertical plane, such as a wall or a door. Pick a vertical surface that isn't smooth and featureless but has plenty of distinctive details for the app to identify as tiny yellow feature points on the screen. The more feature points the app can identify on a vertical surface, the more likely it is to detect the vertical plane.

4. Tap the screen. If ARKit has detected a vertical plane where you tapped, the Xcode debug area displays the message "found a vertical plane" and the image of your painting appears on the plane, as shown in Figure 11-7. Otherwise, the debug area displays "no plane detected".
../images/469983_1_En_11_Chapter/469983_1_En_11_Fig7_HTML.jpg
Figure 11-7

Adding a plane to a vertical surface

5. Click the Stop button or choose Product ➤ Stop.

As you can see, detecting vertical planes is identical to detecting horizontal planes.

Summary

Detecting horizontal or vertical planes can be crucial when you want to add virtual objects to the real world. You simply need to use one line of code to detect horizontal or vertical planes:
configuration.planeDetection = .horizontal
or
configuration.planeDetection = .vertical

Once ARKit detects a horizontal or vertical plane, you can cover that plane with a virtual object or a texture image, such as water, bricks, or sand. As you move an iOS device's camera around, the size of a detected plane gradually grows as ARKit sees more of the surface.