© Wallace Wang 2018
Wallace Wang, Beginning ARKit for iPhone and iPad
https://doi.org/10.1007/978-1-4842-4102-8_9

9. Adding Touch Gestures to Augmented Reality

Wallace Wang
San Diego, CA, USA

Up until now, we’ve created augmented reality apps that simply display virtual objects on the screen. What gives augmented reality a greater sense of existing in front of your eyes is the ability to interact with those virtual objects.

Naturally, you won’t be able to touch or manipulate any virtual objects with your hand, but you can touch and manipulate them through the augmented reality view on your iOS device’s screen. This means you can tap, drag, and manipulate items on the screen with your fingertips. Interacting with augmented reality this way creates a greater sense of realism.

Some of the different kinds of gestures available include:
  • Tap—A brief touch on the screen before lifting the finger up

  • Long press—Press a finger on the screen and hold it there for a period of time

  • Swipe—Slide a finger quickly across the screen in a single direction (up, down, left, or right)

  • Pan—Press a finger on the screen and then slide it across the screen

  • Pinch—A two-finger gesture that moves the two fingertips closer or farther apart

  • Rotation—A two-finger gesture that moves the two fingertips in a circular motion

To interact with virtual objects in augmented reality, we must first detect a touch gesture such as a tap or a swipe. Once we can detect a touch gesture, our next step is to identify if the touch gesture touched a virtual object or not.

To detect a touch gesture in an augmented reality view, we need to create a new Xcode project by following these steps:
  1. Start Xcode. (Make sure you’re using Xcode 10 or greater.)

  2. Choose File ➤ New ➤ Project. Xcode asks you to choose a template.

  3. Click the iOS category.

  4. Click the Single View App icon and click the Next button. Xcode asks for a product name, organization name, organization identifiers, and content technology.

  5. Click in the Product Name text field and type a descriptive name for your project such as TouchGesture. (The exact name does not matter.)

  6. Click the Next button. Xcode asks where you want to store your project.

  7. Choose a folder and click the Create button. Xcode creates an iOS project.
Now modify the Info.plist file to allow access to the camera and to use ARKit by following these steps:
  1. Click the Info.plist file in the Navigator pane. Xcode displays a list of keys, types, and values.

  2. Click the disclosure triangle to expand the Required device capabilities category to display Item 0.

  3. Move the mouse pointer over Item 0 to display a plus (+) icon.

  4. Click this plus (+) icon to display a blank Item 1.

  5. Type arkit under the Value category in the Item 1 row.

  6. Move the mouse pointer over the last row to display a plus (+) icon.

  7. Click on the plus (+) icon to create a new row. A popup menu appears.

  8. Choose Privacy – Camera Usage Description.

  9. Type AR needs to use the camera under the Value category in the Privacy – Camera Usage Description row.
Now it’s time to modify the ViewController.swift file to use ARKit and SceneKit by following these steps:
  1. Click on the ViewController.swift file in the Navigator pane.

  2. Edit the ViewController.swift file so it looks like this:

    import UIKit
    import SceneKit
    import ARKit
    class ViewController: UIViewController, ARSCNViewDelegate {
        let configuration = ARWorldTrackingConfiguration()
        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view, typically from a nib.
        }
    }
To view augmented reality in our app, add a single ARKit SceneKit View (ARSCNView) so the user interface looks similar to Figure 9-1.
Figure 9-1

The user interface includes a single ARSCNView

After you’ve designed your user interface, you need to add constraints. To add constraints, choose Editor ➤ Resolve Auto Layout Issues ➤ Reset to Suggested Constraints at the bottom half of the menu under the All Views in Container category.

The next step is to connect the user interface items to the Swift code in the ViewController.swift file. To do this, follow these steps:
  1. Click the Main.storyboard file in the Navigator pane.

  2. Click the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor to display the Main.storyboard and the ViewController.swift file side by side.

  3. Move the mouse pointer over the ARSCNView, hold down the Control key, and Ctrl-drag under the class ViewController line.

  4. Release the Control key and the left mouse button. A popup menu appears.

  5. Click in the Name text field and type sceneView, then click the Connect button. Xcode creates an IBOutlet as shown here:

    @IBOutlet var sceneView: ARSCNView!

  6. Edit the viewDidLoad function so it looks like this:

        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view, typically from a nib.
            sceneView.delegate = self
            sceneView.showsStatistics = true
            sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        }

  7. Edit the viewWillAppear function so it looks like this:

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            sceneView.session.run(configuration)
        }

Recognizing Touch Gestures

There’s a three-step process to recognizing touch gestures in an app:
  • Create a function to handle the touch gesture when it’s recognized

  • Define a UIGestureRecognizer class that identifies the function to handle touch gestures

  • Add the Touch Gesture Recognizer to the ARKit Scene View

First, you need to create a function that will run when your app recognizes a touch gesture. Unlike other functions, this touch gesture function needs the @objc keyword in front of it. This exposes the function to the Objective-C runtime, because Apple’s gesture recognizer frameworks are built on Objective-C and use that runtime to call your function. The basic structure of a function to handle touch gestures looks like this, where handleTap is any arbitrary function name you choose:
    @objc func handleTap() {
    }
Second, you need to use the UITapGestureRecognizer class to define the function that will handle the tap gesture like this:
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))

If you want to detect other types of gestures such as rotation or pan, then you would need to use a different class such as UIRotationGestureRecognizer or UIPanGestureRecognizer.

The #selector keyword identifies the function that will handle the touch gesture. The constant name, such as tapGesture, is arbitrary.

Finally, the third step is to add the gesture recognizer to the augmented reality scene like this:
sceneView.addGestureRecognizer(tapGesture)

This code adds the touch gesture (such as tapGesture) to the scene (sceneView); both are arbitrary names that you can replace with anything else. This line then makes the scene (sceneView) able to recognize touch gestures and handle them through the @objc function you defined.

Detecting touch gestures can be fairly simple, so let’s see how that works by following these steps:
  1. Click the Main.storyboard file in the Navigator pane. The user interface appears.

  2. Click the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor to show the Main.storyboard and ViewController.swift file side by side.

  3. Move the mouse pointer over the ARSCNView, hold down the Control key, and Ctrl-drag under the class ViewController line in the ViewController.swift file.

  4. Release the Control key and the left mouse button. A window pops up.

  5. Click in the Name text field and type sceneView.

  6. Click the Connect button. Xcode creates an IBOutlet as follows:

    @IBOutlet var sceneView: ARSCNView!

  7. Edit the viewDidLoad function so it looks like this:

        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view, typically from a nib.
            sceneView.delegate = self
            sceneView.showsStatistics = true
            sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
            let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
            sceneView.addGestureRecognizer(tapGesture)
        }

  8. Underneath this viewDidLoad function, add the following two functions:

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            sceneView.session.run(configuration)
        }
        @objc func handleTap() {
            print ("tap detected")
        }

  9. Connect an iOS device to your Macintosh through its USB cable.

  10. Click the Run button or choose Product ➤ Run.

  11. Tap anywhere on the screen when the camera view appears. Notice that each time you tap the screen, Xcode prints “tap detected” in the bottom pane of the Xcode window.

  12. Click the Stop button or choose Product ➤ Stop.

Once the app recognizes taps anywhere on the augmented reality view, the next step is to detect when those tap gestures occur over a virtual object such as a sphere or a box.

Identifying Touch Gestures on Virtual Objects

Once you can identify touch gestures on an augmented reality view, the next step is to identify when those touch gestures occur on a virtual object. To do this, we need to modify our function that handles touch gestures. Currently, it looks like this:
    @objc func handleTap() {
    }
We need to modify this slightly so the function receives information about what the user actually tapped on the screen:
    @objc func handleTap(sender: UITapGestureRecognizer) {
    }
This information from the UITapGestureRecognizer can tell us whether the user tapped on a virtual object or not. First, we need to get the area or view of the tapped portion of the screen like this:
let areaTapped = sender.view as! SCNView
Once we know the area tapped, we need to get the actual coordinates of that area like this:
let tappedCoordinates = sender.location(in: areaTapped)
Now we need to determine if there are any virtual objects in the tapped area using a method called hitTest. This hitTest method does the hard work of identifying virtual objects within a specific set of coordinates:
let hitTest = areaTapped.hitTest(tappedCoordinates)
We need an if-else statement to respond depending on whether the hitTest identified a virtual object or not:
        if hitTest.isEmpty {
        } else {
        }
If the hitTest function fails to identify a virtual object, isEmpty should be true, so let’s put a simple print statement inside to verify this:
        if hitTest.isEmpty {
             print ("Nothing")
        } else {
        }
If the hitTest function identifies a virtual object, it stores this information in an array, so we need to retrieve the first item from this array:
let results = hitTest.first!
Now we’ll retrieve the name of the virtual object identified by the hitTest function and print it:
    let name = results.node.name
    print(name ?? "background")

The print statement on the second line prints either the name of the virtual object found or, if a name hasn’t been assigned to that object, the default text “background” instead. That’s because the hitTest method isn’t always precise, so if you tap near a virtual object like the pyramid or box, it might detect an unnamed background node as a virtual object.

The entire handleTap function should look like this:
    @objc func handleTap(sender: UITapGestureRecognizer) {
        let areaTapped = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaTapped)
        let hitTest = areaTapped.hitTest(tappedCoordinates)
        if hitTest.isEmpty {
            print ("Nothing")
        } else {
            let results = hitTest.first!
            let name = results.node.name
            print(name ?? "background")
        }
    }
To see how the hitTest function works, we need to create one or more virtual objects, give each virtual object a name, and place it in the augmented reality view. This means creating an addShapes function like this:
    func addShapes() {
        let node = SCNNode(geometry: SCNBox(width: 0.05, height: 0.05, length: 0.05, chamferRadius: 0))
        node.position = SCNVector3(0.1,0,-0.1)
        node.geometry?.firstMaterial?.diffuse.contents = UIColor.blue
        node.name = "box"
        sceneView.scene.rootNode.addChildNode(node)
        let node2 = SCNNode(geometry: SCNPyramid(width: 0.05, height: 0.06, length: 0.05))
        node2.position = SCNVector3(0.1,0.1,-0.1)
        node2.geometry?.firstMaterial?.diffuse.contents = UIColor.red
        node2.name = "pyramid"
        sceneView.scene.rootNode.addChildNode(node2)
    }
Modify the entire ViewController.swift file so it looks like this:
import UIKit
import SceneKit
import ARKit
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.delegate = self
        sceneView.showsStatistics = true
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        sceneView.addGestureRecognizer(tapGesture)
        addShapes()
    }
    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(configuration)
    }
    func addShapes() {
        let node = SCNNode(geometry: SCNBox(width: 0.05, height: 0.05, length: 0.05, chamferRadius: 0))
        node.position = SCNVector3(0.1,0,-0.1)
        node.geometry?.firstMaterial?.diffuse.contents = UIColor.blue
        node.name = "box"
        sceneView.scene.rootNode.addChildNode(node)
        let node2 = SCNNode(geometry: SCNPyramid(width: 0.05, height: 0.06, length: 0.05))
        node2.position = SCNVector3(0.1,0.1,-0.1)
        node2.geometry?.firstMaterial?.diffuse.contents = UIColor.red
        node2.name = "pyramid"
        sceneView.scene.rootNode.addChildNode(node2)
    }
    @objc func handleTap(sender: UITapGestureRecognizer) {
        let areaTapped = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaTapped)
        let hitTest = areaTapped.hitTest(tappedCoordinates)
        if hitTest.isEmpty {
            print ("Nothing")
        } else {
            let results = hitTest.first!
            let name = results.node.name
            print(name ?? "background")
        }
    }
}
To test this app, follow these steps:
  1. Connect an iOS device to your Macintosh through its USB cable.

  2. Click the Run button or choose Product ➤ Run.

  3. Tap anywhere on the screen. The bottom debug pane of Xcode should display “Nothing” or “background”.

  4. Tap on the pyramid. The bottom debug pane of Xcode should display “pyramid”.

  5. Tap on the box. The bottom debug pane of Xcode should display “box”. Notice each time you tap on the pyramid or box, your app identifies it by name. Each time you tap away from the pyramid and box, the app identifies it as “Nothing” or “background”.

  6. Click the Stop button or choose Product ➤ Stop.

Identifying Swipe Gestures on Virtual Objects

Besides detecting taps on a virtual object, you may also want to allow the user to swipe across a virtual object. Swiping can involve one or more fingertips that move up, down, left, or right. For each swipe gesture you want to detect (up, down, left, or right), you have to define a separate UISwipeGestureRecognizer like this:
        let swipeRightGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        let swipeLeftGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        let swipeUpGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        let swipeDownGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))

Each swipe gesture recognizer can target the same function to handle the swipe. In these examples, all four gestures are handled by a function called handleSwipe.

By default, each swipe gesture only requires a single fingertip. If you want, you can require two or more fingertips by defining a numberOfTouchesRequired property, such as:
        let swipeRightGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipeRightGesture.direction = .right
        swipeRightGesture.numberOfTouchesRequired = 2
If you don’t define the numberOfTouchesRequired property, Xcode defaults to 1 for a single fingertip to swipe. Most importantly, you must define a direction for each swipe gesture and add it to the scene. So if you want to detect all four swipe gestures, you need to define each swipe gesture separately like this:
        let swipeRightGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipeRightGesture.direction = .right
        sceneView.addGestureRecognizer(swipeRightGesture)
        let swipeLeftGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipeLeftGesture.direction = .left
        sceneView.addGestureRecognizer(swipeLeftGesture)
        let swipeUpGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipeUpGesture.direction = .up
        sceneView.addGestureRecognizer(swipeUpGesture)
        let swipeDownGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipeDownGesture.direction = .down
        sceneView.addGestureRecognizer(swipeDownGesture)

Once we’ve defined the four swipe gestures, the next step is to write the handleSwipe function to respond to the swipe. For this example, we’ll detect both the direction of the swipe gesture and the object the swipe gesture occurred on, such as a virtual object.

The handleSwipe function needs to receive the UISwipeGestureRecognizer data like this:
    @objc func handleSwipe(sender: UISwipeGestureRecognizer) {
    }
Now we need to identify the swiped area by receiving the view, getting the coordinates of the touched area, and then using the hitTest method to identify where in the view the user swiped:
    @objc func handleSwipe(sender: UISwipeGestureRecognizer) {
        let areaSwiped = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaSwiped)
        let hitTest = areaSwiped.hitTest(tappedCoordinates)
    }
Once we get the area swiped, we can respond by determining what the user may have swiped on and the direction the user swiped like this:
    @objc func handleSwipe(sender: UISwipeGestureRecognizer) {
        let areaSwiped = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaSwiped)
        let hitTest = areaSwiped.hitTest(tappedCoordinates)
        if hitTest.isEmpty {
            print ("Nothing")
        } else {
            let results = hitTest.first!
            let name = results.node.name
            print(name ?? "background")
        }
        switch sender.direction {
        case .up:
            print("Up")
        case .down:
            print("Down")
        case .right:
            print("Right")
        case .left:
            print("Left")
        default:
            break
        }
    }
To test this app, follow these steps:
  1. Connect an iOS device to your Macintosh through its USB cable.

  2. Click the Run button or choose Product ➤ Run.

  3. Swipe anywhere on the screen. The bottom debug pane of Xcode should display the direction of your swipe. If you swiped over the box or pyramid, then you should also see the name of the virtual object as well.

  4. Swipe on the pyramid. The bottom debug pane of Xcode should display “pyramid”.

  5. Swipe on the box. The bottom debug pane of Xcode should display “box”. Notice each time you swipe across the pyramid or box, your app identifies it by name. Each time you swipe away from the pyramid and box, the app identifies it as “Nothing” or “background”.

  6. Click the Stop button or choose Product ➤ Stop.

Identifying Virtual Objects with Pan Gestures

Another touch gesture that iOS can recognize is the pan, which essentially means placing one or more fingertips on the screen and sliding them around. A pan gesture is similar to a swipe, except that a swipe slides in a single direction (up, down, left, or right), while a pan can follow any path across the screen, straight or squiggly, and reports its movement continuously.

Creating a Pan Gesture Recognizer involves defining a function to handle pan gestures and then assigning the pan gesture to the scene like this:
        let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePan))
        sceneView.addGestureRecognizer(panGesture)
To further customize the pan gesture, we can define the minimum number of fingertips to start panning along with a maximum number. So if we wanted to detect one, two, or three fingers in a pan gesture, we could define a minimumNumberOfTouches as 1 and a maximumNumberOfTouches as 3 like this:
        panGesture.maximumNumberOfTouches = 3
        panGesture.minimumNumberOfTouches = 1

If we don’t define either property, Xcode simply allows any number of fingertips to initiate a pan gesture with a minimum of at least one fingertip.

The function to handle the pan gesture can respond to three parts of a pan gesture:
  • Location—Where the pan gesture currently is on the iOS device screen, where the upper-left corner is the origin (0, 0)

  • Velocity—How fast the pan gesture moves in the x- and y-axes, measured in points per second

  • Translation—How far the pan gesture moves in the x- and y-axes

We can retrieve the location, velocity, and translation information in a function that handles pan gestures like this:
    @objc func handlePan(sender: UIPanGestureRecognizer) {
        let location = sender.location(in: view)
        let velocity = sender.velocity(in: view)
        let translation = sender.translation(in: view)
    }
Then we can retrieve the x and y values of each item like this:
        print(location.x, location.y)
        print(velocity.x, velocity.y)
        print(translation.x, translation.y)
For our app, we’ll only use the translation property to identify how far the pan gesture moves. We need to identify whether the pan gesture occurs over a virtual object or not, so we need to get the coordinates panned over like this:
        let areaPanned = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaPanned)
        let hitTest = areaPanned.hitTest(tappedCoordinates)
Then we need to check if the pan gesture occurred over the box or pyramid like this:
        if hitTest.isEmpty {
            print ("Nothing")
        } else {
            let results = hitTest.first!
            let name = results.node.name
            print(name ?? "background")
            if sender.state == .began {
                print("Gesture began")
            } else if sender.state == .changed {
                print("Gesture is changing")
                print(translation.x, translation.y)
            } else if sender.state == .ended {
                print("Gesture ended")
            }
        }
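If you want the pan gesture to actually drag a virtual object rather than just print its name, one common pattern is to apply the incremental translation to the hit node’s position and then reset the translation after each callback. The following is an illustrative sketch only, not part of this chapter’s project; the handler name handlePanToMove and the 0.001 points-to-meters scale factor are arbitrary choices:

```swift
// Add to ViewController and register in viewDidLoad with:
//     let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePanToMove))
//     sceneView.addGestureRecognizer(panGesture)
@objc func handlePanToMove(sender: UIPanGestureRecognizer) {
    let areaPanned = sender.view as! SCNView
    let coordinates = sender.location(in: areaPanned)
    // Only react when the finger is over a virtual object
    guard let result = areaPanned.hitTest(coordinates).first else { return }
    let translation = sender.translation(in: areaPanned)
    let node = result.node
    // Convert screen points to scene units (0.001 is an arbitrary factor)
    node.position.x += Float(translation.x) * 0.001
    node.position.y -= Float(translation.y) * 0.001  // screen y increases downward
    // Reset so the next callback reports only the movement since this one
    sender.setTranslation(.zero, in: areaPanned)
}
```

Resetting the translation with setTranslation(_:in:) is what makes each callback report an incremental delta rather than the total distance from where the pan began.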
The entire ViewController.swift code should look like this:
import UIKit
import SceneKit
import ARKit
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        sceneView.delegate = self
        sceneView.showsStatistics = true
        sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin, ARSCNDebugOptions.showFeaturePoints]
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        sceneView.addGestureRecognizer(tapGesture)
        let swipeRightGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipeRightGesture.direction = .right
        sceneView.addGestureRecognizer(swipeRightGesture)
        let swipeLeftGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipeLeftGesture.direction = .left
        sceneView.addGestureRecognizer(swipeLeftGesture)
        let swipeUpGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipeUpGesture.direction = .up
        sceneView.addGestureRecognizer(swipeUpGesture)
        let swipeDownGesture = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipeDownGesture.direction = .down
        sceneView.addGestureRecognizer(swipeDownGesture)
        let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePan))
        sceneView.addGestureRecognizer(panGesture)
        addShapes()
    }
    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(configuration)
    }
    func addShapes() {
        let node = SCNNode(geometry: SCNBox(width: 0.05, height: 0.05, length: 0.05, chamferRadius: 0))
        node.position = SCNVector3(0.1,0,-0.1)
        node.geometry?.firstMaterial?.diffuse.contents = UIColor.blue
        node.name = "box"
        sceneView.scene.rootNode.addChildNode(node)
        let node2 = SCNNode(geometry: SCNPyramid(width: 0.05, height: 0.06, length: 0.05))
        node2.position = SCNVector3(0.1,0.1,-0.1)
        node2.geometry?.firstMaterial?.diffuse.contents = UIColor.red
        node2.name = "pyramid"
        sceneView.scene.rootNode.addChildNode(node2)
    }
    @objc func handleTap(sender: UITapGestureRecognizer) {
        let areaTapped = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaTapped)
        let hitTest = areaTapped.hitTest(tappedCoordinates)
        if hitTest.isEmpty {
            print ("Nothing")
        } else {
            let results = hitTest.first!
            let name = results.node.name
            print(name ?? "background")
        }
    }
    @objc func handleSwipe(sender: UISwipeGestureRecognizer) {
        let areaSwiped = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaSwiped)
        let hitTest = areaSwiped.hitTest(tappedCoordinates)
        if hitTest.isEmpty {
            print ("Nothing")
        } else {
            let results = hitTest.first!
            let name = results.node.name
            print(name ?? "background")
        }
        switch sender.direction {
        case .up:
            print("Up")
        case .down:
            print("Down")
        case .right:
            print("Right")
        case .left:
            print("Left")
        default:
            break
        }
    }
    @objc func handlePan(sender: UIPanGestureRecognizer) {
//        let location = sender.location(in: view)
//        print(location.x, location.y)
//        let velocity = sender.velocity(in: view)
//        print(velocity.x, velocity.y)
        let translation = sender.translation(in: view)
        let areaPanned = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaPanned)
        let hitTest = areaPanned.hitTest(tappedCoordinates)
        if hitTest.isEmpty {
            print ("Nothing")
        } else {
            let results = hitTest.first!
            let name = results.node.name
            print(name ?? "background")
            if sender.state == .began {
                print("Gesture began")
            } else if sender.state == .changed {
                print("Gesture is changing")
                print(translation.x, translation.y)
            } else if sender.state == .ended {
                print("Gesture ended")
            }
        }
    }
}
Modify the code in your ViewController.swift file to match this code. Then to see the app run, do the following:
  1. Connect an iOS device to your Macintosh through its USB cable.

  2. Click the Run button or choose Product ➤ Run.

  3. Place one finger over the box or pyramid on the screen and slide it around. Notice that the debug area in Xcode identifies the area panned over; when the pan gesture starts, stops, and changes; and what the translation values are.

  4. Click the Stop button or choose Product ➤ Stop.

Identifying Long Press Gestures on Virtual Objects

A tap gesture occurs when the user presses a fingertip on the screen and releases it. A long press gesture is similar except that it allows the user to press and hold one or more fingertips on the screen, then release. You can modify four properties of a long press gesture:
  • minimumPressDuration—Defines how long the user must press on the screen. The default is 0.5 seconds.

  • numberOfTouchesRequired—Defines how many fingertips the user must press on the screen. The default is 1.

  • numberOfTapsRequired—Defines how many times the user must press and lift one or more fingertips to initiate a long press gesture. The default is 0.

  • allowableMovement—Defines how far the user can slide one or more fingertips to initiate a long press gesture. The default is 10 points.
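All four of these properties can be set on the same recognizer before adding it to the scene. The specific values below are illustrative choices, not requirements of this chapter’s project:

```swift
let longPressGesture = UILongPressGestureRecognizer(target: self, action: #selector(handleLongPress))
longPressGesture.minimumPressDuration = 1      // hold for at least 1 second (default 0.5)
longPressGesture.numberOfTouchesRequired = 2   // require two fingertips (default 1)
longPressGesture.numberOfTapsRequired = 0      // no preliminary taps needed (the default)
longPressGesture.allowableMovement = 20        // allow up to 20 points of drift (default 10)
sceneView.addGestureRecognizer(longPressGesture)
```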

To define a long press gesture and any of its four properties, you just need to define a function to handle the long press gesture and then add the long press gesture to a scene like this:
        let longPressGesture = UILongPressGestureRecognizer(target: self, action: #selector(handleLongPress))
        longPressGesture.minimumPressDuration = 1
        sceneView.addGestureRecognizer(longPressGesture)
This code sets the minimum press duration to 1 second; if you omit this line, the default minimum press duration of 0.5 seconds is used instead. This code also names a function, handleLongPress, to respond to the long press gesture, so we need to create this function like this:
    @objc func handleLongPress(sender: UILongPressGestureRecognizer) {
    }
To determine where the user pressed, we need to identify the area like this:
        let areaPressed = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaPressed)
        let hitTest = areaPressed.hitTest(tappedCoordinates)
Then we need to use the hitTest function to identify if the user pressed on the box or pyramid like this:
        if hitTest.isEmpty {
            print ("Nothing")
        } else {
            let results = hitTest.first!
            let name = results.node.name ?? "background"
            print("Long press on \(name)")
        }
The entire handleLongPress function should look like this:
    @objc func handleLongPress(sender: UILongPressGestureRecognizer) {
        let areaPressed = sender.view as! SCNView
        let tappedCoordinates = sender.location(in: areaPressed)
        let hitTest = areaPressed.hitTest(tappedCoordinates)
        if hitTest.isEmpty {
            print("Nothing")
        } else {
            let results = hitTest.first!
            let name = results.node.name ?? "background"
            print("Long press on \(name)")
        }
    }
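One caveat worth noting: a long press recognizer calls its handler repeatedly, once when the press begins, again as the finger moves, and again when it ends. If you only want to react once per press, you can check the recognizer's state property. This variation of handleLongPress is a sketch of that idea:

```swift
@objc func handleLongPress(sender: UILongPressGestureRecognizer) {
    // Only respond when the long press first begins,
    // not for every .changed or .ended event that follows.
    guard sender.state == .began else { return }
    let areaPressed = sender.view as! SCNView
    let tappedCoordinates = sender.location(in: areaPressed)
    let hitTest = areaPressed.hitTest(tappedCoordinates)
    if let result = hitTest.first {
        let name = result.node.name ?? "background"
        print("Long press began on \(name)")
    }
}
```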
Add the handleLongPress function to your ViewController.swift file and define the long press recognizer like this:
        let longPressGesture = UILongPressGestureRecognizer(target: self, action: #selector(handleLongPress))
        sceneView.addGestureRecognizer(longPressGesture)
Then to see the app run, do the following:
  1. 1.

    Connect an iOS device to your Macintosh through its USB cable.

     
  2. 2.

    Click the Run button or choose Product ➤ Run.

     
  3. 3.

    Press one finger over the box or pyramid on the screen. Notice that the debug area displays “Long press on box” or “Long press on pyramid”.

     
  4. 4.

    Click the Stop button or choose Product ➤ Stop.

     

Adding Pinch and Rotation Gestures

Up until now, we’ve added gesture recognizers programmatically by writing at least two lines of code. First, we’ve created a constant to represent the gesture recognizer and specified a function to handle the gesture like this:
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
Second, we added the gesture recognizer to the view like this:
sceneView.addGestureRecognizer(tapGesture)
A second way to add gesture recognizers to an app is to use the Object Library and drag a gesture recognizer directly on a scene. If you click on the Object Library and search for “gesture recognizer,” Xcode displays a list of all available gesture recognizers, as shown in Figure 9-2.
../images/469983_1_En_9_Chapter/469983_1_En_9_Fig2_HTML.jpg
Figure 9-2

Viewing a list of gesture recognizers in the Object Library window

Dragging and dropping a gesture recognizer on to a view is equivalent to writing code to add a gesture recognizer to a view:
sceneView.addGestureRecognizer(tapGesture)
Now if you open the Assistant Editor, you can Ctrl-drag from the gesture recognizer to the ViewController.swift file. If you Ctrl-drag to create an IBOutlet, this is equivalent to declaring a gesture recognizer like this:
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
Of course, every gesture recognizer needs a function that contains code to handle the gesture, so you also need to Ctrl-drag to create an IBAction method. By dragging and dropping a gesture recognizer on a view, you can avoid writing code. Let’s see how this works with both the rotation and pinch gesture recognizers by following these steps:
  1. 1.

    Make sure your TouchGestures project is open.

     
  2. 2.

    Click the Main.storyboard file in the Navigator pane.

     
  3. 3.

    Click the Object Library icon to display the Object Library window.

     
../images/469983_1_En_9_Chapter/469983_1_En_9_Fig3_HTML.jpg
Figure 9-3

The Object Library icon

  4. 4.

    Type Rotation in the Object Library window. The Object Library window displays the Rotation Gesture Recognizer.

     
../images/469983_1_En_9_Chapter/469983_1_En_9_Fig4_HTML.jpg
Figure 9-4

The Rotation Gesture Recognizer in the Object Library window

  5. 5.

    Drag the Rotation Gesture Recognizer and drop it over the ARSCNView. Xcode displays the Rotation Gesture Recognizer in the Document Outline, as shown in Figure 9-5. (You can toggle between hiding or showing the Document Outline by choosing Editor ➤ Show/Hide Document Outline.)

     
../images/469983_1_En_9_Chapter/469983_1_En_9_Fig5_HTML.jpg
Figure 9-5

The Document Outline pane displays all user interface items

  6. 6.

    Repeat Steps 3-5, except drag and drop the Pinch Gesture Recognizer on to the ARSCNView so the Pinch Gesture Recognizer also appears in the Document Outline window.

     
  7. 7.

    Click the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor. Xcode displays the Document Outline, Main.storyboard file, and the ViewController.swift file side by side.

     
  8. 8.

    Click on the Rotation Gesture Recognizer in the Document Outline and Ctrl-drag to the bottom of the ViewController.swift file.

     
  9. 9.

    Release the Control key and the left mouse button. A popup window appears.

     
  10. 10.

    Make sure the Connection popup menu displays Action.

     
  11. 11.

    Click in the Name text field and type handleRotation.

     
  12. 12.

    Click in the Type popup menu and choose UIRotationGestureRecognizer.

     
  13. 13.
    Click the Connect button. Xcode creates an empty IBAction method like this:
        @IBAction func handleRotation(_ sender: UIRotationGestureRecognizer) {
        }
     
  14. 14.

    Click on the Pinch Gesture Recognizer in the Document Outline and Ctrl-drag to the bottom of the ViewController.swift file.

     
  15. 15.

    Release the Control key and the left mouse button. A popup window appears.

     
  16. 16.

    Make sure the Connection popup menu displays Action.

     
  17. 17.

    Click in the Name text field and type handlePinch.

     
  18. 18.

    Click in the Type popup menu and choose UIPinchGestureRecognizer.

     
  19. 19.
    Click the Connect button. Xcode creates an empty IBAction method like this:
        @IBAction func handlePinch(_ sender: UIPinchGestureRecognizer) {
        }
     
  20. 20.
    Edit both functions to include a print statement like this:
        @IBAction func handleRotation(_ sender: UIRotationGestureRecognizer) {
            print("Rotation detected")
        }
        @IBAction func handlePinch(_ sender: UIPinchGestureRecognizer) {
            print("Pinch detected")
        }
     
  21. 21.

    Connect an iOS device to your Macintosh through its USB cable.

     
  22. 22.

    Click the Run button or choose Product ➤ Run.

     
  23. 23.

    Press two fingers on the screen and rotate them in a circular motion. The Xcode debug area should display “Rotation detected”.

     
  24. 24.

    Press two fingers on the screen and pinch them closer together. The Xcode debug area should display “Pinch detected”.

     
  25. 25.

    Click the Stop button or choose Product ➤ Stop.

     

Notice that by dragging and dropping gesture recognizers on to the ARSCNView, we’ve saved ourselves writing code. You can use either method or a combination of both methods.
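Once the IBAction stubs fire, a natural next step is to make them actually transform a virtual object. The sketch below assumes the scene contains a node named "box" (as in the earlier examples in this chapter). A rotation recognizer reports its rotation in radians, and a pinch recognizer's scale starts at 1.0 each time the gesture begins:

```swift
@IBAction func handleRotation(_ sender: UIRotationGestureRecognizer) {
    // Assumes a node named "box" exists somewhere in the scene.
    guard let node = sceneView.scene.rootNode.childNode(withName: "box", recursively: true) else { return }
    // Spin the node around the y-axis by the gesture's rotation (in radians).
    node.eulerAngles.y = Float(-sender.rotation)
}

@IBAction func handlePinch(_ sender: UIPinchGestureRecognizer) {
    guard let node = sceneView.scene.rootNode.childNode(withName: "box", recursively: true) else { return }
    // Multiply the node's scale by the pinch factor, then reset the
    // recognizer's scale so each callback applies an incremental change.
    let pinch = Float(sender.scale)
    node.scale = SCNVector3(node.scale.x * pinch, node.scale.y * pinch, node.scale.z * pinch)
    sender.scale = 1.0
}
```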

Summary

The touch screen of an iOS device can recognize multiple types of touch gestures, from simple taps to rotations, swipes, and pans. To recognize gestures, you have two options. First, you can add a gesture recognizer to your app by writing code such as:
        let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePan))
        sceneView.addGestureRecognizer(panGesture)
Then you’ll need to write a function to handle the touch gesture such as:
    @objc func handlePan(sender: UIPanGestureRecognizer) {
    }
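Inside such a pan handler, you would typically read how far the finger has moved with the recognizer's translation(in:) method. Here is a minimal sketch, assuming the sceneView outlet used throughout this chapter:

```swift
@objc func handlePan(sender: UIPanGestureRecognizer) {
    // How far the finger has moved, in points, since the pan began.
    let translation = sender.translation(in: sceneView)
    print("Pan moved \(translation.x), \(translation.y)")
}
```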

As an alternative, we can simply drag and drop a gesture recognizer from the Object Library on to the ARSCNView. Then we can Ctrl-drag to create an IBOutlet for the gesture recognizer and an IBAction method to handle the gesture when it’s detected.