Version: Eyedid Beta

Quick Start (Swift)

Supported Environment

  • Minimum supported iOS version is 13.0

  • Minimum supported device is iPhone 6s

  • Swift and Objective-C are supported.

  • Camera permission and Network connected environment are required.

  • Running GazeTracker on simulator is not supported.

info

Eyedid SDK uses the front camera of the device.

Sample Project

If you want to try the SDK before implementing it yourself, see the sample project page.

Environment Set-ups

  1. Install CocoaPods

    If you don't have CocoaPods installed on your machine, follow the steps below:

    • Open Terminal and run the following command to install CocoaPods:

      sudo gem install cocoapods
    • After installation, navigate to your project directory and run:

      pod setup
  2. Create a New Xcode Project

    • Open Xcode and choose “Create a new Xcode project”.
    • Select “App” under the iOS platform, and name the project EyedidSample.
    • Choose Swift as the language and select either Storyboard or SwiftUI as the interface, depending on your preference.
    • Save the project to your desired location.
  3. Create a Podfile Manually

  • Open Terminal and navigate to the newly created project directory:

    cd path/to/your/EyedidSample
  • Create a Podfile manually using a text editor, for example, Nano:

    nano Podfile
  • Add the following content to the Podfile:

      platform :ios, '13.0'
      use_frameworks!

      target 'EyedidSample' do
        pod 'Eyedid', '~> 1.0.0-beta2'
      end
  • Save and exit the editor (in Nano, use CTRL + X, then press Y to confirm, and hit Enter to save).

  4. Install the Dependencies
  • After creating the Podfile, install the necessary dependencies by running:
    pod install
  5. Open the Project with the Workspace File
  • After running pod install, always open your project using the .xcworkspace file, not the .xcodeproj file.
  6. In [Frameworks, Libraries, and Embedded Content], change the Eyedid.framework option from 'Do Not Embed' to 'Embed & Sign'.

Camera permission

  1. Move to [General > Info].

    ios-quick-start-6

  2. Click the '+' button in the [Custom iOS Target Properties] table. The button appears when you hover over a row.

    ios-quick-start-7

  3. Add the [Privacy - Camera Usage Description] key and enter a usage description.
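For reference, [Privacy - Camera Usage Description] corresponds to the NSCameraUsageDescription key in the Info.plist source; the description string below is only an example and can be any message shown to the user:

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the front camera for gaze tracking.</string>
```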

Sample App Implementation

What are we making

This document shows how to create a sample Swift application that visualizes the user's gaze point coordinates with a simple UI (portrait orientation only).

Technical Overview

You will get sample code covering the following:

  • Object construction and destruction

  • startTracking, stopTracking, startCalibration, stopCalibration

  • Filter toggle

  • Implementation:

    1. GazeTracker Construction

      import UIKit
      import Eyedid
      import AVFoundation

      class ViewController: UIViewController {

        var tracker : GazeTracker? = nil

        override func viewDidLoad() {
          super.viewDidLoad()
          // Do any additional setup after loading the view.
        }
      }

      First, import Eyedid to construct the GazeTracker. Define a variable to hold the created GazeTracker object. Import AVFoundation to check the camera permission status.

      extension ViewController : InitializationDelegate {
        func onInitialized(tracker: GazeTracker?, error: InitializationError) {
          if (tracker != nil) {
            self.tracker = tracker
            print("initialized GazeTracker")
          } else {
            print("init failed : \(error.description)")
          }
        }
      }

      Define the InitializationDelegate implementation on ViewController. The constructed GazeTracker object is delivered through InitializationDelegate.onInitialized, so it should be defined as above. When the GazeTracker construction finishes, a log is printed to the console.

      // `true` when the camera permission exists
      if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
        // Run initGazeTracker when the permission exists.
      } else {
        // Request camera permission from the user.
        AVCaptureDevice.requestAccess(for: .video, completionHandler: { response in
          if response {
            // Permission allowed.
            // Run initGazeTracker here.
          }
        })
      }

      Because GazeTracker uses data from the camera, it needs camera permission. Check whether the permission exists, and request it from the user if it doesn't.

      override func viewDidLoad() {
        super.viewDidLoad()
        if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
          GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
        } else {
          AVCaptureDevice.requestAccess(for: .video, completionHandler: { response in
            if response {
              GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
            }
          })
        }
      }

      Implement the GazeTracker construction code. You MUST replace the license key with the key you generated from the Eyedid Console.

  2. Start Tracking

    extension ViewController : StatusDelegate {
      func onStarted() {
        print("tracker starts tracking.")
      }

      func onStopped(error: StatusError) {
        print("stop error : \(error.description)")
      }
    }

    Implement StatusDelegate to confirm that gaze tracking has started after GazeTracker.startTracking is called.

    extension ViewController : InitializationDelegate {
      func onInitialized(tracker: GazeTracker?, error: InitializationError) {
        if (tracker != nil) {
          self.tracker = tracker
          print("initialized GazeTracker")
          self.tracker?.statusDelegate = self
          self.tracker?.startTracking()
        } else {
          print("init failed : \(error.description)")
        }
      }
    }

    Connect the StatusDelegate implementation to the GazeTracker object, then call startTracking.

  3. Log the Gaze Data

    extension ViewController : TrackingDelegate {
      func onMetrics(timestamp: Int, gazeInfo : GazeInfo, faceInfo: FaceInfo, blinkInfo: BlinkInfo, userStatusInfo: UserStatusInfo) {
        print("timestamp : \(timestamp), (x , y) : (\(gazeInfo.x), \(gazeInfo.y)) , state : \(gazeInfo.trackingState.description)")
      }
    }

    Implement the TrackingDelegate; it provides the gaze data and eye movement status from the GazeTracker.

    extension ViewController : InitializationDelegate {
      func onInitialized(tracker: GazeTracker?, error: InitializationError) {
        if (tracker != nil) {
          ...
          self.tracker?.trackingDelegate = self

          self.tracker?.startTracking()
        } else {
          print("init failed : \(error.description)")
        }
      }
    }

    Connect the implemented TrackingDelegate.
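    The Technical Overview above also lists stopTracking. A minimal sketch of stopping the tracker when the screen goes away, assuming the no-argument stopTracking call named in the overview (the choice of viewWillDisappear as the call site is an example, not something the SDK prescribes):

```swift
// Hedged sketch: stop tracking when the view goes off screen.
// `stopTracking()` is the counterpart of `startTracking()` listed in the
// Technical Overview; calling it from viewWillDisappear is an assumption
// made for this sample.
override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    tracker?.stopTracking()
}
```

    Stop events surface through the StatusDelegate.onStopped callback implemented earlier.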

    4. Overall Code

      import Eyedid
      import UIKit
      import AVFoundation

      class ViewController: UIViewController {
        var tracker : GazeTracker? = nil

        override func viewDidLoad() {
          super.viewDidLoad()
          if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
            GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
          } else {
            AVCaptureDevice.requestAccess(for: .video, completionHandler: { response in
              if response {
                GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
              }
            })
          }
        }
      }

      extension ViewController : InitializationDelegate {
        func onInitialized(tracker: GazeTracker?, error: InitializationError) {
          if (tracker != nil) {
            self.tracker = tracker
            print("initialized GazeTracker")
            self.tracker?.statusDelegate = self
            self.tracker?.trackingDelegate = self
            self.tracker?.startTracking()
          } else {
            print("init failed : \(error.description)")
          }
        }
      }

      extension ViewController : StatusDelegate {
        func onStarted() {
          print("tracker starts tracking.")
        }

        func onStopped(error: StatusError) {
          print("stop error : \(error.description)")
        }
      }

      extension ViewController : TrackingDelegate {
        func onMetrics(timestamp: Int, gazeInfo : GazeInfo, faceInfo: FaceInfo, blinkInfo: BlinkInfo, userStatusInfo: UserStatusInfo) {
          print("timestamp : \(timestamp), (x , y) : (\(gazeInfo.x), \(gazeInfo.y)) , state : \(gazeInfo.trackingState.description)")
        }
      }

Run

  1. Build on device

    Click the Run button at the top left of Xcode.

    ios_quick_start-11

  2. Check the Result

    • Allow the camera permission when prompted.
    • When setup succeeds and the camera permission is granted, the data from the GazeTracker is shown in the console.

      ios-quick-start-12