Version: Eyedid Beta

API Docs

Overview

| Name | Type | Description |
| --- | --- | --- |
| GazeTracker | class | The main class that captures the user's face through the device's camera, processes it, and provides gaze data. |
| GazeTrackerOptions | class | Configuration options for creating a GazeTracker. |
| GazeTrackerOptions.Builder | class | A builder for configuring and creating a GazeTrackerOptions instance. |
| GazeInfo | class | A class that contains information about gaze tracking. |
| FaceInfo | class | A class containing information about face tracking. |
| BlinkInfo | class | Represents information about eye blinking. |
| UserStatusInfo | class | Represents information about the user's status. |
| OneEuroFilterManager | class | Manages multiple One Euro Filters. |
| InitializationDelegate | protocol | A delegate protocol to receive the results of the GazeTracker initialization. |
| TrackingDelegate | protocol | A delegate protocol to receive real-time tracking data from the GazeTracker. |
| StatusDelegate | protocol | A delegate protocol to receive status updates from the GazeTracker. |
| CalibrationDelegate | protocol | A delegate protocol that provides status updates during the calibration process. |
| ImageDelegate | protocol | A delegate protocol for receiving images from the front camera. |
| CameraPreset | enum | Represents the available camera resolution presets. |
| InitializationError | enum | Enum representing the different types of initialization errors that can occur during setup of the gaze tracker. |
| CalibrationMode | enum | Enum representing the calibration mode types for starting calibration. The available modes are default, onePoint, and fivePoint; default maps to the recommended mode, which is currently five-point calibration. |
| AccuracyCriteria | enum | Enum representing the accuracy criteria for starting calibration. |
| TrackingState | enum | Enum representing the state of gaze tracking. |
| EyeMovementState | enum | Enum representing the state of eye movement. |
| ScreenState | enum | Enum representing whether the gaze is inside or outside of the screen. |
| StatusError | enum | Enum that describes the reason why gaze tracking stopped. |

GazeTracker

@objcMembers public class GazeTracker: NSObject

The main class that captures the user's face through the device's camera, processes it, and provides gaze data.

note

Requires the Camera Permission

Summary

| Function | Description | Return Type | Type |
| --- | --- | --- | --- |
| initGazeTracker | GazeTracker initialization requires a valid license key. | void | static |
| releaseGazeTracker | Terminates the specified GazeTracker instance. | void | static |
| getFrameworkVersion | Returns the current SDK version. | String | static |
| startTracking | Opens the camera and starts gaze tracking. | void | Member |
| stopTracking | Closes the camera and stops gaze tracking. | void | Member |
| isTracking | Checks whether the GazeTracker is currently tracking. | Bool | Member |
| setDelegates | Registers multiple delegates at once. The method checks the type of each callback and assigns it to the appropriate field. | void | Member |
| removeDelegates | Removes all currently set delegates. | void | Member |
| setTrackingFPS | Sets the FPS (frames per second) for gaze tracking. | Bool | Member |
| setCameraPreview | Sets the camera preview using the specified UIView. | void | Member |
| removeCameraPreview | Removes the camera preview set in the GazeTracker. | void | Member |
| setForcedOrientation | Sets the forced device orientation for the GazeTracker. | Bool | Member |
| resetForcedOrientation | Resets the forced device orientation for the GazeTracker. | Bool | Member |
| startCalibration | Starts the calibration process. | Bool | Member |
| stopCalibration | Stops the calibration process. | void | Member |
| isCalibrating | Indicates whether the calibration process is currently active. | Bool | Member |
| startCollectSamples | Notifies the GazeTracker that the UI has been updated with the coordinates provided by onCalibrationNextPoint and that the system is ready to start collecting user gaze data. | void | Member |
| setCalibrationData | Applies previously stored calibration data to the current GazeTracker. | void | Member |
| setAttentionRegion | Sets the attention region using the specified coordinates. | void | Member |
| getAttentionRegion | Retrieves the attention region. | CGRect? | Member |
| removeAttentionRegion | Removes the currently set attention region. | void | Member |

initGazeTracker

static public func initGazeTracker(license: String,
                                   delegate: InitializationDelegate,
                                   options: GazeTrackerOptions? = nil)

GazeTracker initialization requires a valid license key. The created GazeTracker instance is delivered through InitializationDelegate.

note

The function is asynchronous.

| Parameters | Type | Description |
| --- | --- | --- |
| license | String | The license key required to activate the GazeTracker. |
| delegate | InitializationDelegate | The delegate to receive initialization status. |
| options | GazeTrackerOptions | Optional GazeTracker configuration options. |

Example

var options: GazeTrackerOptions? = nil

if useBlink {
    options = GazeTrackerOptions.Builder()
        .setUseBlink(true)
        .build()
}

GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY", delegate: initializationDelegate, options: options)

...


releaseGazeTracker

static public func releaseGazeTracker(tracker: GazeTracker?)

Terminates the specified GazeTracker instance.

note

Once terminated, the GazeTracker instance cannot be used.

| Parameters | Type | Description |
| --- | --- | --- |
| tracker | GazeTracker | The GazeTracker instance to terminate. |

Example

func onInitialized(tracker: GazeTracker?, error: InitializationError) {
    if tracker != nil {
        // gazeTracker init success
        gazeTracker = tracker

        // you can release the gazeTracker instance
        GazeTracker.releaseGazeTracker(tracker: gazeTracker)
        gazeTracker = nil

    } else {
        // gazeTracker init fail
    }
}

getFrameworkVersion

public static func getFrameworkVersion() -> String

Returns the current SDK version.

| Return Type | Description |
| --- | --- |
| String | A String representing the SDK version, or an error message if unavailable. |

Example

let version = GazeTracker.getFrameworkVersion()
print("Eyedid SDK version : \(version)")

startTracking

public func startTracking()

Opens the camera and starts gaze tracking.

note

On successful start, StatusDelegate.onStarted() is called.

Example

gazeTracker.startTracking();

...


func onStarted() {
    // gazeTracker.startTracking() Success
}


stopTracking

public func stopTracking()

Closes the camera and stops gaze tracking.

note

On successful stop, StatusDelegate.onStopped() is called.

Example

gazeTracker.stopTracking()

...


func onStopped(error: StatusError) {
    // gazeTracker.stopTracking() Success
}


isTracking

public func isTracking() -> Bool

Checks whether the GazeTracker is currently tracking.

| Return Type | Description |
| --- | --- |
| Bool | true if gaze tracking is active. |

Example

let isTracking = gazeTracker.isTracking()
print("Tracking state : \(isTracking)")

setDelegates

public func setDelegates(
    statusDelegate: StatusDelegate?,
    trackingDelegate: TrackingDelegate?,
    calibrationDelegate: CalibrationDelegate?,
    imageDelegate: ImageDelegate?)

Registers multiple delegates at once. The method checks the type of each callback and assigns it to the appropriate field.

| Parameters | Type | Description |
| --- | --- | --- |
| statusDelegate | StatusDelegate | A delegate protocol to receive status updates from the GazeTracker. |
| trackingDelegate | TrackingDelegate | A delegate protocol to receive real-time tracking data from the GazeTracker. |
| calibrationDelegate | CalibrationDelegate | A delegate protocol that provides status updates during the calibration process. |
| imageDelegate | ImageDelegate | A delegate protocol for receiving images from the front camera. |

Example

gazeTracker.setDelegates(statusDelegate: self, trackingDelegate: self, calibrationDelegate: self, imageDelegate: nil)
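
Below is a minimal sketch (not part of the SDK reference) of registering a view controller as a delegate; the class name TrackingViewController is illustrative, only StatusDelegate is adopted here, and the remaining slots are left nil:

import UIKit

// A hypothetical view controller that only receives status updates.
class TrackingViewController: UIViewController, StatusDelegate {
    var gazeTracker: GazeTracker?

    func registerDelegates() {
        gazeTracker?.setDelegates(statusDelegate: self,
                                  trackingDelegate: nil,
                                  calibrationDelegate: nil,
                                  imageDelegate: nil)
    }

    func onStarted() {
        // Tracking started; calibration and preview are now available.
    }

    func onStopped(error: StatusError) {
        // Tracking stopped; inspect `error` for the reason.
    }
}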

removeDelegates

public func removeDelegates()

Removes all currently set delegates.

Example

gazeTracker.removeDelegates()

setTrackingFPS

@discardableResult
public func setTrackingFPS(fps: Int) -> Bool

Sets the FPS (frames per second) for gaze tracking. If the eyes cannot be detected, the FPS setting may not be applied (gaze tracking calculations are bypassed), and the actual FPS may be lower than the specified value depending on the device's performance.

warning

fps must be greater than 0.

| Parameters | Type | Description |
| --- | --- | --- |
| fps | Int | The desired frames per second for gaze tracking. |

| Return Type | Description |
| --- | --- |
| Bool | true if the FPS was successfully set, false otherwise. |

Example


// Set the tracking FPS to 15
let isSuccess = gazeTracker.setTrackingFPS(fps: 15)

if isSuccess {
    // Action to perform upon success
}

setCameraPreview

public func setCameraPreview(preview: UIView)

Sets the camera preview using the specified UIView.

| Parameters | Type | Description |
| --- | --- | --- |
| preview | UIView | The UIView to be used for the camera preview. |

Example

gazeTracker.setCameraPreview(preview: preview)
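
As additional context, a minimal sketch (assuming this code runs inside a view controller; the frame is illustrative) that adds a preview view to the view hierarchy before handing it to the tracker:

// Create a preview view and add it to the current view hierarchy.
let previewView = UIView(frame: CGRect(x: 0, y: 0, width: 160, height: 120))
view.addSubview(previewView)

// Hand the view to the tracker so the camera preview is rendered into it.
gazeTracker.setCameraPreview(preview: previewView)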

removeCameraPreview

public func removeCameraPreview()

Removes the camera preview set in the GazeTracker.

Example

gazeTracker.removeCameraPreview()

setForcedOrientation

@discardableResult
public func setForcedOrientation(orientation: UIInterfaceOrientation) -> Bool

Sets the forced device orientation for the GazeTracker.

This function sets the orientation of the device forcibly for certain situations. Specifically, when the device is in portrait lock mode and the developer wants to rotate it to landscape mode, the gaze coordinates also need to be rotated. However, the function only works when not in the calibration process, meaning that the device's orientation cannot be forcibly changed during calibration.

| Parameters | Type | Description |
| --- | --- | --- |
| orientation | UIInterfaceOrientation | The forced device orientation. |

| Return Type | Description |
| --- | --- |
| Bool | true if the forced orientation is set successfully; false if the function is called during the calibration process. |

Example

gazeTracker.setForcedOrientation(orientation: .landscapeLeft)

resetForcedOrientation

@discardableResult
public func resetForcedOrientation() -> Bool

Resets the forced device orientation for the GazeTracker.

This function resets the orientation of the device that was forcibly set by the setForcedOrientation function.

After this function is called, the device orientation will no longer be forcibly controlled, and the gaze coordinates will be determined based on the actual device orientation.

However, this function only works when not in the calibration process, meaning that the device's orientation cannot be forcibly changed during calibration.

| Return Type | Description |
| --- | --- |
| Bool | true if the forced orientation is reset successfully; false if the function is called during the calibration process. |

Example

let isReset = gazeTracker.resetForcedOrientation()

startCalibration

@MainActor @discardableResult
public func startCalibration(
    mode: CalibrationMode = .default,
    criteria: AccuracyCriteria = .default,
    region: CGRect) -> Bool

Starts the calibration process.

| Parameters | Type | Description |
| --- | --- | --- |
| mode | CalibrationMode | The number of points to be used for calibration. |
| criteria | AccuracyCriteria | The accuracy criteria to be used for calibration. |
| region | CGRect | The region that needs to be calibrated. The unit is points (pt). |

| Return Type | Description |
| --- | --- |
| Bool | true if the calibration process has started successfully, false otherwise. |

Example

let startedResult = tracker.startCalibration(mode: .onePoint, region: CGRect(x: 100, y: 100, width: 100, height: 100))

stopCalibration

public func stopCalibration()

Stops the calibration process.

Example

if gazeTracker.isCalibrating() {
    gazeTracker.stopCalibration()
}

isCalibrating

public func isCalibrating() -> Bool

Indicates whether the calibration process is currently active.

| Return Type | Description |
| --- | --- |
| Bool | true if the calibration process is ongoing. |

Example

if gazeTracker.isCalibrating() {
    print("current calibrating....")
}

startCollectSamples

public func startCollectSamples()

Notifies the GazeTracker that the UI has been updated with the coordinates provided by onCalibrationNextPoint and that the system is ready to start collecting user gaze data.

note

Ensure that this function is called after the UI has been updated and the user has had enough time to focus on the calibration point.

Example

func onCalibrationNextPoint(x: Double, y: Double) {
    DispatchQueue.main.async {
        self.targetView.center = CGPoint(x: x, y: y)
        self.gazeTracker.startCollectSamples()
    }
}

setCalibrationData

public func setCalibrationData(calibrationData: [Double])

Applies previously stored calibration data to the current GazeTracker.

| Parameters | Type | Description |
| --- | --- | --- |
| calibrationData | [Double] | An array of calibration values obtained after calibration is completed. |

Example

func onCalibrationFinished(calibrationData: [Double]) {
    saveData = calibrationData
}

...

gazeTracker.setCalibrationData(calibrationData: saveData)
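
A minimal sketch of persisting calibration data across app launches with UserDefaults; the key name and helper functions are illustrative, not part of the SDK:

let calibrationKey = "eyedid.calibrationData" // illustrative key name

// Store the data received in onCalibrationFinished.
func saveCalibrationData(_ data: [Double]) {
    UserDefaults.standard.set(data, forKey: calibrationKey)
}

// On a later launch, re-apply the stored data instead of calibrating again.
func restoreCalibrationData() {
    if let data = UserDefaults.standard.array(forKey: calibrationKey) as? [Double] {
        gazeTracker.setCalibrationData(calibrationData: data)
    }
}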


setAttentionRegion

public func setAttentionRegion(region: CGRect)

Sets the attention region using the specified coordinates.

note

The default is the UIScreen bounds.

| Parameters | Type | Description |
| --- | --- | --- |
| region | CGRect | The attention region area. |

Example

// Set an attention region with origin (100, 100) and size 100x100
gazeTracker.setAttentionRegion(region: CGRect(x: 100, y: 100, width: 100, height: 100))

getAttentionRegion

public func getAttentionRegion() -> CGRect?

Retrieves the attention region.

| Return Type | Description |
| --- | --- |
| CGRect? | A CGRect representing the attention region in screen coordinates if the ROI (Region of Interest) is set, or nil if the region is not set. |

Example


if let region = tracker.getAttentionRegion() {
    print("current attention region : \(region)")
}


removeAttentionRegion

public func removeAttentionRegion()

Removes the currently set attention region.

Example

if tracker.getAttentionRegion() != nil {
    tracker.removeAttentionRegion()
}

GazeTrackerOptions

@objc public class GazeTrackerOptions: NSObject {
    public let useBlink: Bool
    public let useUserStatus: Bool
    public let useGazeFilter: Bool
    public let maxConcurrency: Int
    public let cameraPreset: CameraPreset

    @objc public class Builder: NSObject {
        ...
    }
}

Configuration options for creating a GazeTracker. GazeTrackerOptions allows you to specify settings such as blink detection, user status tracking, gaze filtering, and more. It uses the GazeTrackerOptions.Builder pattern to create an instance.

| Properties | Type | Description |
| --- | --- | --- |
| useBlink | Bool | Enables blink detection. |
| useUserStatus | Bool | Enables user status tracking. |
| useGazeFilter | Bool | Applies a filter to smooth gaze data. |
| maxConcurrency | Int | Sets the maximum number of concurrent operations. |
| cameraPreset | CameraPreset | Specifies the camera settings using CameraPreset. |
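
A minimal sketch of building an options instance and passing it to initGazeTracker; the license string and delegate are placeholders:

let options = GazeTrackerOptions.Builder()
    .setUseBlink(true)
    .setUseUserStatus(true)
    .build()

GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY",
                            delegate: initializationDelegate,
                            options: options)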

GazeTrackerOptions.Builder

@objc public class Builder: NSObject

A builder for configuring and creating a GazeTrackerOptions instance. Use the Builder class to set various options such as blink detection, user status tracking, gaze filtering, and more. After configuring the desired options, call build() to generate a GazeTrackerOptions instance.

Summary

| Function | Description | Return Type |
| --- | --- | --- |
| build | Builds the GazeTrackerOptions instance with the specified settings. | GazeTrackerOptions |
| setCameraPreset | Sets the camera resolution preset to be used for gaze tracking. | Builder |
| setMaxConcurrency | Sets the maximum number of concurrent threads allowed for processing. | Builder |
| setUseBlink | Sets whether to enable blink detection. | Builder |
| setUseUserStatus | Sets whether to enable user status tracking (e.g., focus, drowsiness). | Builder |
| setUseGazeFilter | Sets whether to apply a gaze filter to stabilize gaze tracking data. | Builder |

build

public func build() -> GazeTrackerOptions

Builds the GazeTrackerOptions instance with the specified settings.

note

Default Properties

  • useBlink: false
  • useUserStatus: false
  • useGazeFilter: true
  • maxConcurrency: 3
  • cameraPreset: .vga (640x480)

Example

let options = GazeTrackerOptions.Builder().build()

setCameraPreset

public func setCameraPreset(_ preset: CameraPreset) -> Builder

Sets the camera resolution preset to be used for gaze tracking.

| Parameters | Type | Description |
| --- | --- | --- |
| preset | CameraPreset | The camera resolution preset to use. |

| Return Type | Description |
| --- | --- |
| GazeTrackerOptions.Builder | The Builder instance for method chaining. |

Example

let options = GazeTrackerOptions.Builder()
    .setCameraPreset(.hd)
    .build()

setMaxConcurrency

public func setMaxConcurrency(_ maxConcurrency: Int) -> Builder

Sets the maximum number of concurrent threads allowed for processing.

note

If maxConcurrency is 0, there is no limit on concurrency.

| Parameters | Type | Description |
| --- | --- | --- |
| maxConcurrency | Int | The maximum number of concurrent threads. |

| Return Type | Description |
| --- | --- |
| GazeTrackerOptions.Builder | The Builder instance for method chaining. |

Example

let options = GazeTrackerOptions.Builder()
    .setMaxConcurrency(0)
    .setCameraPreset(.hd)
    .build()

setUseBlink

public func setUseBlink(_ useBlink: Bool) -> Builder

Sets whether to enable blink detection.

| Parameters | Type | Description |
| --- | --- | --- |
| useBlink | Bool | true to enable blink detection, false to disable it. |

| Return Type | Description |
| --- | --- |
| GazeTrackerOptions.Builder | The Builder instance for method chaining. |

Example

let options = GazeTrackerOptions.Builder()
    .setUseBlink(true)
    .setMaxConcurrency(0)
    .setCameraPreset(.hd)
    .build()

setUseUserStatus

public func setUseUserStatus(_ useUserStatus: Bool) -> Builder

Sets whether to enable user status tracking (e.g., focus, drowsiness).

| Parameters | Type | Description |
| --- | --- | --- |
| useUserStatus | Bool | true to enable user status tracking, false to disable it. |

| Return Type | Description |
| --- | --- |
| GazeTrackerOptions.Builder | The Builder instance for method chaining. |

Example

let options = GazeTrackerOptions.Builder()
    .setUseUserStatus(true)
    .setUseBlink(true)
    .setMaxConcurrency(0)
    .setCameraPreset(.hd)
    .build()

setUseGazeFilter

public func setUseGazeFilter(_ useGazeFilter: Bool) -> Builder

Sets whether to apply a gaze filter to stabilize gaze tracking data.

| Parameters | Type | Description |
| --- | --- | --- |
| useGazeFilter | Bool | true to apply the gaze filter, false to disable it. |

| Return Type | Description |
| --- | --- |
| GazeTrackerOptions.Builder | The Builder instance for method chaining. |

Example

let options = GazeTrackerOptions.Builder()
    .setUseGazeFilter(false)
    .setUseUserStatus(true)
    .setUseBlink(true)
    .setMaxConcurrency(0)
    .setCameraPreset(.hd)
    .build()

GazeInfo

@objcMembers
public class GazeInfo: NSObject {
    public let x: Double
    public let y: Double
    public let fixationX: Double
    public let fixationY: Double
    public let trackingState: TrackingState
    public let eyeMovementState: EyeMovementState
    public let screenState: ScreenState
}

A class that contains information about gaze tracking.

| Properties | Type | Description |
| --- | --- | --- |
| x | Double | The x-coordinate value of the gaze point. The origin is the device screen, and the unit is points (pt). |
| y | Double | The y-coordinate value of the gaze point. The origin is the device screen, and the unit is points (pt). |
| fixationX | Double | The x-coordinate value of the last fixation point. The origin is the device screen, and the unit is points (pt). |
| fixationY | Double | The y-coordinate value of the last fixation point. The origin is the device screen, and the unit is points (pt). |
| trackingState | TrackingState | Enum indicating the tracking state of the camera image. |
| eyeMovementState | EyeMovementState | Enum indicating the type of eye movement. |
| screenState | ScreenState | Enum indicating whether the gaze has moved off the screen. |
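
As an illustration, a minimal sketch of consuming GazeInfo inside onMetrics, checking the tracking and screen state before using the coordinates:

func onMetrics(
    timestamp: Int,
    gazeInfo: GazeInfo,
    faceInfo: FaceInfo,
    blinkInfo: BlinkInfo,
    userStatusInfo: UserStatusInfo) {
    // Only use the gaze point when tracking succeeded and the gaze is on screen.
    if gazeInfo.trackingState == .success && gazeInfo.screenState == .insideOfScreen {
        print("gaze: (\(gazeInfo.x), \(gazeInfo.y)) pt, fixation: (\(gazeInfo.fixationX), \(gazeInfo.fixationY)) pt")
    }
}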

FaceInfo

@objcMembers
public class FaceInfo: NSObject {
    public let score: Double
    public let rect: CGRect
    public let pitch: Double
    public let yaw: Double
    public let roll: Double
    public let imageSize: CGSize
    public let centerX: Double
    public let centerY: Double
    public let centerZ: Double
}

A class containing information about face tracking.

note

Pitch, yaw, and roll are defined as 0 degrees when the camera is positioned at the top of the device.

| Properties | Type | Description |
| --- | --- | --- |
| score | Double | Confidence score for facial recognition, ranging from 0.0 (no confidence) to 1.0 (full confidence). |
| rect | CGRect | The rectangle representing the face within the image. |
| pitch | Double | Rotation of the face around the X-axis, also known as the attitude angle (the nose-up or nose-down angle in aviation). |
| yaw | Double | Rotation of the face around the Y-axis, also known as the heading angle (the bearing or compass angle in aviation). |
| roll | Double | Rotation of the face around the Z-axis, also known as the bank angle (the wing-up or wing-down angle in aviation). |
| imageSize | CGSize | The size of the image in which the face was detected. |
| centerX | Double | The X-coordinate distance of the center of the face from the camera, in millimeters (mm). |
| centerY | Double | The Y-coordinate distance of the center of the face from the camera, in millimeters (mm). |
| centerZ | Double | The Z-coordinate distance of the center of the face from the camera, in millimeters (mm). |
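
As an illustration, a minimal sketch that reads the head pose and distance from a FaceInfo instance (assuming faceInfo was received in onMetrics):

// Head rotation around each axis, 0 degrees when the camera is at the top.
print("head pose - pitch: \(faceInfo.pitch), yaw: \(faceInfo.yaw), roll: \(faceInfo.roll)")

// Distance of the face center from the camera along the Z-axis, in millimeters.
print("face distance (z): \(faceInfo.centerZ) mm")

// Face bounding box within the captured image.
print("face rect: \(faceInfo.rect) in image of size \(faceInfo.imageSize)")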

BlinkInfo

@objcMembers
public class BlinkInfo: NSObject {
    public let isBlink: Bool
    public let isBlinkLeft: Bool
    public let isBlinkRight: Bool
    public let leftOpenness: Double
    public let rightOpenness: Double
}

Represents information about eye blinking.

| Properties | Type | Description |
| --- | --- | --- |
| isBlink | Bool | Indicates if both eyes are blinking. |
| isBlinkLeft | Bool | Indicates if the left eye is blinking. |
| isBlinkRight | Bool | Indicates if the right eye is blinking. |
| leftOpenness | Double | Open ratio of the left eye (0.0~1.0). |
| rightOpenness | Double | Open ratio of the right eye (0.0~1.0). |
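
For context, a minimal sketch that reacts to blink data (assuming blinkInfo was received in onMetrics; the openness threshold is illustrative):

// Treat eyes as nearly closed below an illustrative openness threshold.
let opennessThreshold = 0.2

if blinkInfo.isBlink {
    print("both eyes are blinking")
} else if blinkInfo.leftOpenness < opennessThreshold && blinkInfo.rightOpenness < opennessThreshold {
    print("eyes are almost closed")
}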

UserStatusInfo

@objcMembers
public class UserStatusInfo: NSObject {
    public let isDrowsy: Bool
    public let drowsinessIntensity: Double
    public let attentionScore: Double
}

Represents information about the user's status.

| Properties | Type | Description |
| --- | --- | --- |
| isDrowsy | Bool | Indicates whether the user is drowsy. |
| drowsinessIntensity | Double | Intensity level of drowsiness, ranging from 0.0 (not drowsy) to 1.0 (very drowsy). |
| attentionScore | Double | Score indicating the user's attention rate, ranging from 0.0 to 1.0. |
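
For context, a minimal sketch that checks the user status metrics (assuming userStatusInfo was received in onMetrics and useUserStatus was enabled in GazeTrackerOptions):

if userStatusInfo.isDrowsy {
    print("user appears drowsy, intensity: \(userStatusInfo.drowsinessIntensity)")
}

print("attention score: \(userStatusInfo.attentionScore)")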

OneEuroFilterManager

@objc public class OneEuroFilterManager: NSObject {
    public static let DEFAULT_COUNT = 2
    public static let DEFAULT_FREQUENCY = 30.0
    public static let DEFAULT_MIN_CUT_OFF = 1.0
    public static let DEFAULT_BETA = 0.007
    public static let DEFAULT_D_CUT_OFF = 1.0
}

The OneEuroFilterManager class manages multiple One Euro Filters. This filter is used to reduce noise and smooth signals (e.g., gaze tracking data). Users can retrieve the filtered values and initialize filters with various settings through this class.

Overview

| Function | Description | Return Type | Type |
| --- | --- | --- | --- |
| init | Constructor | OneEuroFilterManager | Constructor |
| filterValues | Filters the given values. | Bool | Member |
| getFilteredValues | Returns the filtered values. | [Double] | Member |

init

public init(_ count: Int = DEFAULT_COUNT,
            _ freq: Double = DEFAULT_FREQUENCY,
            _ minCutOff: Double = DEFAULT_MIN_CUT_OFF,
            _ beta: Double = DEFAULT_BETA,
            _ dCutOff: Double = DEFAULT_D_CUT_OFF)

| Parameters | Type | Description |
| --- | --- | --- |
| count | Int | The number of filters to use. |
| freq | Double | The frequency to be used by the filter (Hz). |
| minCutOff | Double | The minimum cutoff frequency. |
| beta | Double | The beta value (adjusts the response speed). |
| dCutOff | Double | The derivative cutoff value. |
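
For context, a minimal sketch of creating a filter manager; the default two-filter configuration suits (x, y) gaze coordinates, and the custom values below are illustrative:

// Default configuration: two filters at 30 Hz.
let defaultFilter = OneEuroFilterManager()

// Custom configuration with a higher beta for a faster response to quick gaze
// movements (illustrative values only).
let responsiveFilter = OneEuroFilterManager(2, 30.0, 1.0, 0.05, 1.0)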

filterValues

public func filterValues(timestamp: Int, val: [Double]) -> Bool

Filters the given values.

| Parameters | Type | Description |
| --- | --- | --- |
| timestamp | Int | The current timestamp (ms). |
| val | [Double] | The values to be filtered. |

| Return Type | Description |
| --- | --- |
| Bool | true if filtering was successful, false otherwise. |

Example

// Attempt to filter the gaze coordinates using the OneEuroFilterManager
if oneEuroFilterManager.filterValues(timestamp: timestamp, val: [gazeInfo.x, gazeInfo.y]) {
    // If successful, notify that the values have been filtered
    print("Successfully filtered values: (\(gazeInfo.x), \(gazeInfo.y))")
} else {
    // If unsuccessful, notify that the filtering has failed
    print("Failed to filter values.")
}

getFilteredValues

public func getFilteredValues() -> [Double] 

Returns the filtered values.

| Return Type | Description |
| --- | --- |
| [Double] | An array of filtered values. |

Example

// Attempt to filter the gaze coordinates using the OneEuroFilterManager
if oneEuroFilterManager.filterValues(timestamp: timestamp, val: [gazeInfo.x, gazeInfo.y]) {
    // If successful, read back the filtered values
    let values = oneEuroFilterManager.getFilteredValues()
    print("Successfully filtered values: (\(values[0]), \(values[1]))")
} else {
    // If unsuccessful, notify that the filtering has failed
    print("Failed to filter values.")
}

InitializationDelegate

@objc public protocol InitializationDelegate

A delegate protocol to receive the results of the GazeTracker initialization. Implement this protocol to handle the outcomes of the GazeTracker initialization attempt. The delegate methods notify whether the initialization was successful or if an error occurred.

Summary

| Function |
| --- |
| onInitialized |

onInitialized

func onInitialized(tracker: GazeTracker?, error: InitializationError)

Called when the GazeTracker initialization is completed. If successful, a valid GazeTracker instance is provided. If it fails, the tracker will be nil.

See Authentication for more details.

| Parameters | Type | Description |
| --- | --- | --- |
| tracker | GazeTracker? | The GazeTracker instance if initialization was successful, otherwise nil. |
| error | InitializationError | The type of error if initialization failed, represented by InitializationError. |

Example

func onInitialized(tracker: GazeTracker?, error: InitializationError) {
    if tracker != nil {
        self.tracker = tracker
    } else {
        switch error {
        case .errorNone:
            // Handle no error
            break
        case .errorInit:
            // Handle initialization error
            break
        default:
            // Handle the remaining error cases
            break
        }
    }
}

TrackingDelegate

@objc public protocol TrackingDelegate

A delegate protocol to receive real-time tracking data from the GazeTracker. Implement this protocol to handle gaze, face, blink, and user status metrics during a tracking session.

Summary

| Function |
| --- |
| onMetrics |

onMetrics

func onMetrics(
    timestamp: Int,
    gazeInfo: GazeInfo,
    faceInfo: FaceInfo,
    blinkInfo: BlinkInfo,
    userStatusInfo: UserStatusInfo)

Called when new tracking metrics are available.

| Parameters | Type | Description |
| --- | --- | --- |
| timestamp | Int | The timestamp when the metrics were captured (in milliseconds). |
| gazeInfo | GazeInfo | Information about the gaze position and validity. |
| faceInfo | FaceInfo | Information about the face position and orientation. |
| blinkInfo | BlinkInfo | Information about the blink status. |
| userStatusInfo | UserStatusInfo | Information about the user's status (e.g., attention level, drowsiness). |

Example

func onMetrics(
    timestamp: Int,
    gazeInfo: GazeInfo,
    faceInfo: FaceInfo,
    blinkInfo: BlinkInfo,
    userStatusInfo: UserStatusInfo) {
    // do something with the metrics
}

StatusDelegate

@objc public protocol StatusDelegate

A delegate protocol to receive status updates from the GazeTracker.

Summary

| Function |
| --- |
| onStarted |
| onStopped |

onStarted

func onStarted()

Called automatically after GazeTracker.startTracking() succeeds. Once this method is called, actions like calibration, preview, etc., are available.

Example

func onStarted() {
    gazeTracker.startCalibration();
}

onStopped

func onStopped(error: StatusError)

Called when gaze tracking stops.

| Parameters | Type | Description |
| --- | --- | --- |
| error | StatusError | StatusError.none if gaze tracking stopped after GazeTracker.stopTracking() was called; other values indicate the reason tracking stopped. |

Example

func onStopped(error: StatusError) {
    if error != .none {
        // onStopped with error
    } else {
        // onStopped without error
    }
}

CalibrationDelegate

@objc public protocol CalibrationDelegate

A delegate protocol that provides status updates during the calibration process.

important

Methods of CalibrationDelegate are called on the main thread. Please use DispatchQueue.main.async() when updating the UI to ensure that all UI changes are made on the main thread.

Summary

| Function |
| --- |
| onCalibrationNextPoint |
| onCalibrationProgress |
| onCalibrationFinished |

onCalibrationNextPoint

func onCalibrationNextPoint(x: Double, y: Double)

Provides the coordinates for the next calibration point. After receiving the coordinates, display the calibration point on the screen and wait for a certain period to allow the user to focus on it. Then proceed by calling startCollectSamples().

| Parameters | Type | Description |
| --- | --- | --- |
| x | Double | The x-coordinate of the next calibration target. The origin is at the top-left of the device screen. Measured in points (pt). |
| y | Double | The y-coordinate of the next calibration target. The origin is at the top-left of the device screen. Measured in points (pt). |

Example

func onCalibrationNextPoint(x: Double, y: Double) {
    // Draw the calibration point (x, y) in the view.

    // Then call startCollectSamples() to proceed with calibration.
    gazeTracker.startCollectSamples()
}

onCalibrationProgress

func onCalibrationProgress(progress: Double)

Provides the calibration progress. The progress value increases when the startCollectSamples() function is called.

note

Visualizing the calibration progression with proper UI is recommended to improve UX.

| Parameters | Type | Description |
| --- | --- | --- |
| progress | Double | The calibration progress for each point, typically ranging from 0.0 to 1.0. |

Example

func onCalibrationProgress(progress: Double) {
    DispatchQueue.main.async {
        self.progressView.setProgress(progress: progress)
    }
}

onCalibrationFinished

func onCalibrationFinished(calibrationData: [Double])

Called when calibration is finished.

note

To avoid re-calibrating when restarting the app, pass the stored calibration data via GazeTracker.setCalibrationData().

| Parameters | Type | Description |
| --- | --- | --- |
| calibrationData | [Double] | The calibration data to be stored or used later. |

Example

func onCalibrationFinished(calibrationData: [Double]) {
    // remove calibration UI, save calibration data
    removeCalibrationUI()
    saveCalibrationDataForLaterUse(data: calibrationData)
}

ImageDelegate

@objc public protocol ImageDelegate

A delegate protocol for receiving images from the front camera.

Summary

| Function |
| --- |
| onImage |

onImage

func onImage(timestamp: Int, image: CMSampleBuffer)

Called when a new image frame is captured by the front camera.

note

The image is in kCVPixelFormatType_32BGRA format. The image orientation is rotated 90 degrees counterclockwise from the portrait orientation.

| Parameters | Type | Description |
| --- | --- | --- |
| timestamp | Int | The timestamp when the camera frame was captured, in milliseconds since the Unix epoch (UTC). |
| image | CMSampleBuffer | A CMSampleBuffer containing the image data from the camera. |

Example

func onImage(timestamp: Int, image: CMSampleBuffer) {
    // Convert CMSampleBuffer to UIImage
    if let uiImage = imageFromSampleBuffer(image) {
        DispatchQueue.main.async {
            // Assuming imageView is your UIImageView instance
            self.imageView.image = uiImage
        }
    }
}

func imageFromSampleBuffer(_ sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }

    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()

    if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
        return UIImage(cgImage: cgImage)
    }

    return nil
}

CameraPreset

@objc public enum CameraPreset: Int

Represents the available camera resolution presets. Allows the user to select a specific camera resolution for the GazeTracker.

| Name | Description |
| --- | --- |
| vga | VGA resolution (640x480), suitable for low-resolution requirements. |
| hd | HD resolution (1280x720), suitable for standard high-definition needs. |
| fhd | Full HD resolution (1920x1080), suitable for high-resolution requirements. |

resolution

The resolution size associated with the preset.

var resolution: CGSize {
    switch self {
    case .vga:
        return CGSize(width: 640, height: 480)
    case .hd:
        return CGSize(width: 1280, height: 720)
    case .fhd:
        return CGSize(width: 1920, height: 1080)
    }
}

InitializationError

@objc public enum InitializationError: Int

Enum representing the different types of initialization errors that can occur during setup of the gaze tracker. See Authentication for more details.

description

Returns the string value of the enum.

public var description: String {
    switch self {
    case .errorNone:
        return "ERROR_NONE"
    case .errorInit:
        return "ERROR_INIT"
    case .errorCameraPermission:
        return "ERROR_CAMERA_PERMISSION"
    case .authInvalidKey:
        return "AUTH_INVALID_KEY"
    case .authInvalidEnvUsedDevInProd:
        return "AUTH_INVALID_ENV_USED_DEV_IN_PROD"
    case .authInvalidEnvUsedProdInDev:
        return "AUTH_INVALID_ENV_USED_PROD_IN_DEV"
    case .authInvalidPackageName:
        return "AUTH_INVALID_PACKAGE_NAME"
    case .authInvalidAppSignature:
        return "AUTH_INVALID_APP_SIGNATURE"
    case .authExceededFreeTier:
        return "AUTH_EXCEEDED_FREE_TIER"
    case .authDeactivatedKey:
        return "AUTH_DEACTIVATED_KEY"
    case .authInvalidAccess:
        return "AUTH_INVALID_ACCESS"
    case .authUnknownError:
        return "AUTH_UNKNOWN_ERROR"
    case .authServerError:
        return "AUTH_SERVER_ERROR"
    case .authCannotFindHost:
        return "AUTH_CANNOT_FIND_HOST"
    case .authWrongLocalTime:
        return "AUTH_WRONG_LOCAL_TIME"
    case .authInvalidKeyFormat:
        return "AUTH_INVALID_KEY_FORMAT"
    case .authExpiredKey:
        return "AUTH_EXPIRED_KEY"
    case .errorNotAdvancedTier:
        return "ERROR_NOT_ADVANCED_TIER"
    }
}

CalibrationMode

@objc public enum CalibrationMode: Int

Enum representing the calibration mode types for starting calibration. Currently, available modes are default, onePoint, and fivePoint. default is set to the recommended mode, which is currently the five-point calibration mode.

| Name | Description |
| --- | --- |
| default | Default calibration mode, which is set to the recommended mode. In the current version, this is the five-point calibration mode. |
| onePoint | One-point calibration mode. |
| fivePoint | Five-point calibration mode. |

description

Returns the string value of the enum.

public var description: String {
    switch self {
    case .onePoint:
        return "ONE_POINT"
    case .fivePoint:
        return "FIVE_POINT"
    case .default:
        return "DEFAULT"
    }
}

AccuracyCriteria

@objc public enum AccuracyCriteria: Int, Codable

Enum representing the accuracy criteria for starting calibration.

| Name | Description |
| --- | --- |
| default | Default calibration accuracy criteria. |
| low | Low calibration accuracy criteria. |
| high | High calibration accuracy criteria. |

TrackingState

@objc public enum TrackingState: Int

Enum representing the state of gaze tracking.

| Name | Description |
| --- | --- |
| success | Gaze tracking is successful, and both gaze and face are detected. |
| faceMissing | The face is not detected, which means gaze tracking is not possible. |
| gazeNotFound | The gaze is not detected, but the face is still tracked. |

description

Returns the string value of the enum.

public var description: String {
    switch self {
    case .success:
        return "SUCCESS"
    case .gazeNotFound:
        return "GAZE_NOT_FOUND"
    case .faceMissing:
        return "FACE_MISSING"
    }
}

EyeMovementState

@objc public enum EyeMovementState: Int

Enum representing the state of eye movement.

| Name | Description |
| --- | --- |
| fixation | The eye is fixated on a specific point. |
| saccade | The eye is moving rapidly between points (saccadic movement). |
| unknown | The eye movement state is unknown or cannot be determined. |

description

Returns the string value of the enum.

public var description: String {
    switch self {
    case .fixation:
        return "FIXATION"
    case .saccade:
        return "SACCADE"
    case .unknown:
        return "UNKNOWN"
    }
}

ScreenState

@objc public enum ScreenState: Int

Enum representing the state of whether the gaze is inside or outside of the screen.

| Name | Description |
| --- | --- |
| insideOfScreen | The gaze is within the bounds of the screen. |
| outsideOfScreen | The gaze is outside the bounds of the screen. |
| unknown | The gaze state is unknown or cannot be determined. |

description

Returns the string value of the enum.

public var description: String {
    switch self {
    case .insideOfScreen:
        return "INSIDE_OF_SCREEN"
    case .outsideOfScreen:
        return "OUTSIDE_OF_SCREEN"
    case .unknown:
        return "UNKNOWN"
    }
}

StatusError

@objc public enum StatusError: Int

Enum that describes the reason why gaze tracking stopped.

| Name | Description |
| --- | --- |
| none | No error occurred; gaze tracking stopped due to a stopTracking call. |
| cameraStart | Gaze tracking stopped because an error occurred when trying to start the camera. |
| cameraInterrupt | Gaze tracking stopped because the camera was interrupted. |

description

Returns the string value of the enum.

public var description: String {
    switch self {
    case .none:
        return "ERROR_NONE"
    case .cameraStart:
        return "ERROR_CAMERA_START"
    case .cameraInterrupt:
        return "ERROR_CAMERA_INTERRUPT"
    }
}