Quick Start (Swift)
Supported Environment
- Minimum supported iOS version is 13.0.
- Minimum supported device is iPhone 6s.
- Swift and Objective-C are supported.
- Camera permission and a network-connected environment are required.
- Running GazeTracker on a simulator is not supported. Eyedid SDK uses the front camera of the device.
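Because the simulator provides no front camera, it can help to guard your tracker setup at compile time. A minimal sketch using Swift's standard compile-time condition (not an Eyedid API), placed wherever you set up the tracker:

```swift
#if targetEnvironment(simulator)
// The simulator has no front camera, so GazeTracker cannot run here.
print("Eyedid GazeTracker requires a physical device; skipping gaze tracker setup.")
#else
// Running on a real device: safe to set up GazeTracker (see the implementation below).
#endif
```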
Sample Project
If you just want to try the SDK rather than implement it yourself, go to the sample project page.
Environment Set-ups
- Install CocoaPods

  If you don't have CocoaPods installed on your machine, follow the steps below:

  - Open Terminal and run the following command to install CocoaPods:

    ```
    sudo gem install cocoapods
    ```

  - After installation, run:

    ```
    pod setup
    ```
- Create a New Xcode Project
  - Open Xcode and choose “Create a new Xcode project”.
  - Select “App” under the iOS platform, and name the project EyedidSample.
  - Choose Swift as the language and select either Storyboard or SwiftUI as the interface, depending on your preference.
  - Save the project to your desired location.
- Create a Podfile Manually
  - Open Terminal and navigate to the newly created project directory:

    ```
    cd path/to/your/EyedidSample
    ```

  - Create a Podfile manually using a text editor, for example, Nano:

    ```
    nano Podfile
    ```

  - Add the following content to the Podfile:

    ```ruby
    platform :ios, '13.0'
    use_frameworks!

    target 'EyedidSample' do
      pod 'Eyedid', '~> 1.0.0-beta2'
    end
    ```

  - Save and exit the editor (in Nano, use CTRL + X, then press Y to confirm, and hit Enter to save).
- Install the Dependencies
  - After creating the Podfile, install the necessary dependencies by running:

    ```
    pod install
    ```

- Open the Project with the Workspace File
  - After running pod install, always open your project using the .xcworkspace file.
  - Change the 'Do Not Embed' option of [Frameworks, Libraries, and Embedded Content > Eyedid.framework] to 'Embed & Sign'.
Camera permission
- Move to [General > Info].
- Click the '+' button in the [Custom iOS Target Properties] table. The button appears when you hover the mouse over a row.
- Add the [Privacy - Camera Usage Description] option.
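Xcode shows this entry as [Privacy - Camera Usage Description]; in the raw Info.plist it is stored under the key NSCameraUsageDescription. If you want a quick runtime sanity check that the key made it into the build, a minimal sketch (plain Foundation, not an Eyedid API):

```swift
// Optional sanity check (sketch only): verify the camera usage description exists in
// the built Info.plist. Without it, iOS terminates the app as soon as the camera
// permission prompt would be shown.
if Bundle.main.object(forInfoDictionaryKey: "NSCameraUsageDescription") == nil {
    print("Warning: NSCameraUsageDescription is missing from Info.plist.")
}
```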
Sample App Implementation
What are we making
This document shows how to create a sample Swift application that visualizes the user's gaze point coordinates with a simple UI (portrait orientation only).
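The sample assumes portrait only. If your project also allows other orientations, one way to pin the sample's screen to portrait is the standard UIKit override below (a minimal sketch, not an Eyedid API), added inside the view controller defined in the implementation section:

```swift
// Pin this view controller to portrait so the sample runs in the orientation
// it was written for (standard UIKit, unrelated to the Eyedid SDK).
override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
    return .portrait
}
```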
Technical Overview
You will get sample code covering the following:

- Object construction and destruction
- startTracking, stopTracking, startCalibration, stopCalibration
- Filter toggle
Implementation:
- GazeTracker Construction

```swift
import UIKit
import Eyedid
import AVFoundation

class ViewController: UIViewController {

    var tracker: GazeTracker? = nil

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }
}
```

First, import Eyedid for the `GazeTracker` construction and define a variable to hold the created `GazeTracker` object. Also import `AVFoundation` to check the camera permission status.

```swift
extension ViewController: InitializationDelegate {
    func onInitialized(tracker: GazeTracker?, error: InitializationError) {
        if tracker != nil {
            self.tracker = tracker
            print("initialized GazeTracker")
        } else {
            print("init failed : \(error.description)")
        }
    }
}
```

Define the `InitializationDelegate` implementation on `ViewController`. Because the `GazeTracker` object is delivered through `InitializationDelegate.onInitialized`, it should be defined as above. When the `GazeTracker` construction is finished, a log is printed to the console.

```swift
// `true` when the camera permission exists
if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
    // Run initGazeTracker when the permission exists.
} else {
    // Request camera permission from the user.
    AVCaptureDevice.requestAccess(for: .video, completionHandler: { response in
        if response {
            // Permission allowed.
            // Run initGazeTracker here.
        }
    })
}
```

Because `GazeTracker` uses data from the camera, it needs the camera permission. Check whether the camera permission exists, and request it if it doesn't.

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
        GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
    } else {
        AVCaptureDevice.requestAccess(for: .video, completionHandler: { response in
            if response {
                GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
            }
        })
    }
}
```

Implement the `GazeTracker` construction code. You MUST replace the license key with the key you generated from the Eyedid Console.
- Start Tracking
```swift
extension ViewController: StatusDelegate {
    func onStarted() {
        print("tracker starts tracking.")
    }

    func onStopped(error: StatusError) {
        print("stop error : \(error.description)")
    }
}
```

Implement `StatusDelegate` to check whether gaze tracking has started when the `GazeTracker.startTracking` function is called.

```swift
extension ViewController: InitializationDelegate {
    func onInitialized(tracker: GazeTracker?, error: InitializationError) {
        if tracker != nil {
            self.tracker = tracker
            print("initialized GazeTracker")
            self.tracker?.statusDelegate = self
            self.tracker?.startTracking()
        } else {
            print("init failed : \(error.description)")
        }
    }
}
```

Connect the `StatusDelegate` implementation to the `GazeTracker` object, then call `startTracking`.

- Log the gaze data
```swift
extension ViewController: TrackingDelegate {
    func onMetrics(timestamp: Int, gazeInfo: GazeInfo, faceInfo: FaceInfo, blinkInfo: BlinkInfo, userStatusInfo: UserStatusInfo) {
        print("timestamp : \(timestamp), (x , y) : (\(gazeInfo.x), \(gazeInfo.y)) , state : \(gazeInfo.trackingState.description)")
    }
}
```

Implement the `TrackingDelegate`; this delegate provides the gaze data and the eye movement status from the `GazeTracker`.

```swift
extension ViewController: InitializationDelegate {
    func onInitialized(tracker: GazeTracker?, error: InitializationError) {
        if tracker != nil {
            ...
            self.tracker?.trackingDelegate = self
            self.tracker?.startTracking()
        } else {
            print("init failed : \(error.description)")
        }
    }
}
```

Connect the implemented `TrackingDelegate`.
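The quick start only logs the coordinates. To actually draw the gaze point on screen, you could add a small dot view and move it whenever `onMetrics` fires. A minimal sketch, assuming `gazeInfo.x` / `gazeInfo.y` are screen-point coordinates and that `onMetrics` may be delivered off the main thread; the helper names below are hypothetical, not part of the Eyedid SDK:

```swift
import UIKit

extension ViewController {

    // Hypothetical helper: build a small red dot used to mark the gaze point.
    func makeGazePointView() -> UIView {
        let dot = UIView(frame: CGRect(x: 0, y: 0, width: 20, height: 20))
        dot.backgroundColor = .red
        dot.layer.cornerRadius = 10
        return dot
    }

    // Hypothetical helper: move the dot to the latest gaze coordinates.
    // UI updates must happen on the main queue.
    func moveGazePoint(_ dot: UIView, x: Double, y: Double) {
        DispatchQueue.main.async {
            dot.center = CGPoint(x: x, y: y)
        }
    }
}
```

Create the dot once (for example, add it to the view hierarchy in `viewDidLoad` and keep a reference to it), then call `moveGazePoint(_:x:y:)` from `onMetrics` with `gazeInfo.x` and `gazeInfo.y`.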
- Overall codes
```swift
import Eyedid
import UIKit
import AVFoundation

class ViewController: UIViewController {

    var tracker: GazeTracker? = nil

    override func viewDidLoad() {
        super.viewDidLoad()
        if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
            GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
        } else {
            AVCaptureDevice.requestAccess(for: .video, completionHandler: { response in
                if response {
                    GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
                }
            })
        }
    }
}

extension ViewController: InitializationDelegate {
    func onInitialized(tracker: GazeTracker?, error: InitializationError) {
        if tracker != nil {
            self.tracker = tracker
            print("initialized GazeTracker")
            self.tracker?.statusDelegate = self
            self.tracker?.trackingDelegate = self
            self.tracker?.startTracking()
        } else {
            print("init failed : \(error.description)")
        }
    }
}

extension ViewController: StatusDelegate {
    func onStarted() {
        print("tracker starts tracking.")
    }

    func onStopped(error: StatusError) {
        print("stop error : \(error.description)")
    }
}

extension ViewController: TrackingDelegate {
    func onMetrics(timestamp: Int, gazeInfo: GazeInfo, faceInfo: FaceInfo, blinkInfo: BlinkInfo, userStatusInfo: UserStatusInfo) {
        print("timestamp : \(timestamp), (x , y) : (\(gazeInfo.x), \(gazeInfo.y)) , state : \(gazeInfo.trackingState.description)")
    }
}
```
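The overall code never stops the tracker. When the screen goes away, you will usually want to stop tracking and drop the reference; a minimal sketch added to `ViewController` (`stopTracking` comes from the Technical Overview above; whether the SDK also requires an explicit destruction call is not covered here, so check the Eyedid API reference):

```swift
// Clean-up sketch: stop tracking when this screen disappears and release the
// reference. Consult the Eyedid API reference for the SDK's dedicated
// destruction call, if any.
override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    tracker?.stopTracking()
    tracker = nil
}
```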
Run

- Build on device
  - Click the build button at the top left of Xcode.
- Feature
  - You should allow the camera permission first.
  - When the setup is successful and the camera permission is allowed, the data from the `GazeTracker` will be shown in the console.