API Docs
Overview
Name | Type | Description |
---|---|---|
GazeTracker | class | Class that generates gaze tracking data |
GazeInfo | class | A class composed of information about eye tracking. |
UserStatusOption | class | The class contains User Status options information for GazeTracker |
FaceInfo | class | A class composed of information about face state. |
InitializationDelegate | protocol | A delegate protocol that notifies the result of creating a gaze tracker. |
GazeTrackerDelegate | protocol | All Delegates used in GazeTracker instance inherit from GazeTrackerDelegate. |
StatusDelegate | protocol | Protocol that informs whether the Gaze Tracker is operating or not. |
GazeDelegate | protocol | Protocol to receive frame-by-frame GazeInfo. |
CalibrationDelegate | protocol | Protocol used in the calibration process. |
UserStatusDelegate | protocol | Protocol used for user status function. |
ImageDelegate | protocol | Protocol that receives the image buffer from the front camera. |
FaceDelegate | protocol | Protocol for checking user's face state on screen. |
InitializationError | enum | The Enum that contains error types of InitializationDelegate. |
StatusError | enum | The Enum that contains error types of StatusDelegate. |
TrackingState | enum | The Enum that contains tracking state types of GazeInfo. |
EyeMovementState | enum | The Enum that contains eye movement state types used in GazeInfo. |
ScreenState | enum | The Enum that contains screen state types used in GazeInfo. |
CalibrationMode | enum | The Enum that contains calibration mode types of startCalibration. |
AccuracyCriteria | enum | The Enum that contains accuracy criteria of startCalibration. |
GazeTracker
- Swift
- Objective-C
@objc public class GazeTracker: NSObject
@interface GazeTracker : NSObject
GazeTracker
is the class that generates gaze tracking data from the device's front camera video.
Summary
Constructor and Destructor |
---|
initGazeTracker |
deinitGazeTracker |
Variables | Type |
---|---|
statusDelegate | StatusDelegate? |
gazeDelegate | GazeDelegate? |
calibrationDelegate | CalibrationDelegate? |
imageDelegate | ImageDelegate? |
userStatusDelegate | UserStatusDelegate? |
faceDelegate | FaceDelegate? |
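A minimal sketch of the typical GazeTracker lifecycle, assuming a UIKit view controller that adopts the delegate protocols above. Class and property names here are illustrative, and the delegate method bodies assume the signatures documented in the protocol sections below:

```swift
import UIKit

// Sketch only: `GazeViewController` is an assumed name.
class GazeViewController: UIViewController, InitializationDelegate, StatusDelegate, GazeDelegate {
    var tracker: GazeTracker?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Asynchronous: the tracker arrives via onInitialized.
        GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY", delegate: self)
    }

    func onInitialized(tracker: GazeTracker?, error: InitializationError) {
        guard let tracker = tracker else { return } // inspect `error` on failure
        self.tracker = tracker
        tracker.setDelegates(statusDelegate: self, gazeDelegate: self,
                             calibrationDelegate: nil, imageDelegate: nil)
        tracker.startTracking()
    }

    func onStarted() { /* tracking is running; calibration is now possible */ }
    func onStopped(error: StatusError) { /* inspect error if not ERROR_NONE */ }
    func onGaze(gazeInfo: GazeInfo) { /* per-frame gaze data */ }
}
```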
getFrameworkVersion
- Swift
- Objective-C
@objc public static func getFrameworkVersion() -> String
+ (NSString * _Nonnull)getFrameworkVersion;
This will return the SeeSo framework version.
Returns | Type | Description |
---|---|---|
version | String , NSString * | SeeSo Version |
- Swift
- Objective-C
let version = GazeTracker.getFrameworkVersion()
print("SeeSo version: \(version)")
NSString * version = [GazeTracker getFrameworkVersion];
NSLog(@"SeeSo version: %@.", version);
initGazeTracker
- Swift
- Objective-C
@objc public static func initGazeTracker(license: String, delegate: InitializationDelegate, option: UserStatusOption? = nil)
+ (void)initGazeTrackerWithLicense:(NSString * _Nonnull)license delegate:(id <InitializationDelegate> _Nonnull)delegate option:(UserStatusOption * _Nullable)option
The GazeTracker constructing process includes license authentication.
The function is asynchronous; InitializationDelegate will relay the constructed object.
Parameters | Type | Description |
---|---|---|
license | String , NSString * | Key generated from manage.seeso.io |
delegate | InitializationDelegate , id <InitializationDelegate> | Delegate represents initialization status |
option | UserStatusOption | A class containing User status option information (optional) |
- Swift
- Objective-C
GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY", delegate: self)
let userStatusOption = UserStatusOption()
userStatusOption.useAll()
GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY", delegate: self, option: userStatusOption)
[GazeTracker initGazeTrackerWithLicense:@"key" delegate:self];
UserStatusOption * option = [[UserStatusOption alloc] init];
[option useAll];
[GazeTracker initGazeTrackerWithLicense:@"key" delegate:self option:option];
deinitGazeTracker
- Swift
- Objective-C
@objc public static func deinitGazeTracker(tracker: GazeTracker?)
+ (void)deinitGazeTrackerWithTracker:(GazeTracker * _Nullable)tracker;
After destruction, every API on the instance is disabled.
For memory optimization, assigning nil to the destroyed reference is recommended.
Parameters | Type | Description |
---|---|---|
tracker | GazeTracker? , GazeTracker * | Target GazeTracker instance of destruction. |
- Swift
- Objective-C
GazeTracker.deinitGazeTracker(tracker: tracker)
tracker = nil
[GazeTracker deinitGazeTrackerWithTracker:tracker];
tracker = nil;
startTracking
- Swift
- Objective-C
@objc public func startTracking()
- (void)startTracking;
This is the function that should be called before starting gaze tracking.
It calls the onStarted() function of the StatusDelegate object when it succeeds.
- Swift
- Objective-C
tracker.startTracking()
[tracker startTracking];
stopTracking
- Swift
- Objective-C
@objc public func stopTracking()
- (void)stopTracking;
This function is called to stop gaze tracking.
It calls StatusDelegate.onStopped() when it succeeds.
- Swift
- Objective-C
tracker.stopTracking()
[tracker stopTracking];
isTracking
This function represents the status of gaze tracking.
It returns true when gaze tracking is running and false when gaze tracking is stopped.
Return Type | Description |
---|---|
Bool , BOOL | true while gaze tracking is running; otherwise false. |
- Swift
- Objective-C
let isTracking = tracker.isTracking()
print("Tracking status: \(isTracking)")
BOOL isTracking = [tracker isTracking];
NSLog(@"Tracking status: %d", isTracking);
setTrackingFPS
- Swift
- Objective-C
@objc public func setTrackingFPS(fps: Int) -> Bool
- (BOOL)setTrackingFPSWithFps:(NSInteger)fps
Sets the FPS of the gaze tracking source. The value should be bigger than 0 and no more than 30.
FPS can drop due to device specs. The default value is 30.
If the state is TrackingState.FACE_MISSING, FPS will be set to 30 automatically.
Parameters | Type | Description |
---|---|---|
fps | Int , NSInteger | Custom FPS(Frame Per Second) for gaze tracking. |
Return Type | Description |
---|---|
Bool , BOOL | Whether the FPS was set. |
- Swift
- Objective-C
let isSet = tracker.setTrackingFPS(fps: 20)
BOOL isSet = [tracker setTrackingFPSWithFps:20];
setForcedOrientation
- Swift
- Objective-C
@objc public func setForcedOrientation(orientation: UIInterfaceOrientation) -> Bool
- (BOOL)setForcedOrientationWithOrientation:(UIInterfaceOrientation)orientation;
Set the forced device orientation for the GazeTracker. This function sets the orientation of the device forcibly for certain situations. Specifically, when the device is in portrait lock mode and the developer wants to rotate it to landscape mode, the gaze coordinates also need to be rotated. However, the function only works when not in the calibration process, meaning that the device's orientation cannot be forcibly changed during calibration.
Parameters | Type | Description |
---|---|---|
orientation | UIInterfaceOrientation | The forced device orientation. |
Return Type | Description |
---|---|
Bool , BOOL | True if the forced orientation is set successfully. False if the function is called during the calibration process. |
- Swift
- Objective-C
if tracker.setForcedOrientation(orientation: .landscapeRight) {
    print("forced orientation landscapeRight")
}
tracker.startTracking()
BOOL result = [self->tracker setForcedOrientationWithOrientation:UIInterfaceOrientationLandscapeRight];
if(result) {
NSLog(@"forced orientation landscapeRight");
}
[self->tracker startTracking];
resetForcedOrientation
- Swift
- Objective-C
@objc public func resetForcedOrientation() -> Bool
- (BOOL)resetForcedOrientation;
Reset the forced device orientation for the GazeTracker.
This function resets the orientation of the device that was forcibly set by the setForcedOrientation
function.
After this function is called, the device orientation will no longer be forcibly controlled, and the gaze coordinates will be determined based on the actual device orientation.
However, this function only works when not in the calibration process, meaning that the device's orientation cannot be forcibly changed during calibration.
Return Type | Description |
---|---|
Bool , BOOL | True if the forced orientation is reset successfully. False if the function is called during the calibration process. |
- Swift
- Objective-C
if tracker.resetForcedOrientation() {
    print("reset forced orientation")
}
BOOL result = [self->tracker resetForcedOrientation];
if(result) {
NSLog(@"reset forced orientation");
}
setCameraPreview
- Swift
- Objective-C
@objc public func setCameraPreview(cameraPreview: UIView)
- (void)setCameraPreviewWithPreview:(UIView * _Nonnull)preview;
This sets a preview of the camera feed that GazeTracker is using.
You should rotate the preview view yourself when the screen is in landscape mode.
Parameters | Type | Description |
---|---|---|
cameraPreview | UIView , UIView * | UIView for camera preview. |
- Swift
- Objective-C
tracker.setCameraPreview(cameraPreview: preview)
[tracker setCameraPreviewWithPreview: preview];
removeCameraPreview
- Swift
- Objective-C
@objc public func removeCameraPreview()
- (void)removeCameraPreview;
This will remove the camera preview that was set on GazeTracker.
- Swift
- Objective-C
tracker.removeCameraPreview()
[tracker removeCameraPreview];
startCalibration
- Swift
- Objective-C
@objc public func startCalibration(mode: CalibrationMode?, criteria: AccuracyCriteria?, region: CGRect?) -> Bool
- (BOOL)startCalibrationWithMode:(enum CalibrationMode)mode criteria:(enum AccuracyCriteria)criteria region:(CGRect)region
There are four cases for the function's return value. It returns true when the parameters are valid.
It returns false if startCalibration() is called while the tracker is not tracking, if the value of mode is not defined, or if the calibration region is set outside of the device screen.
A false return blocks the calibration process.
Parameters | Type | Description |
---|---|---|
mode | CalibrationMode? | Can select calibration option. Four options - DEFAULT(0), ONE_POINT(1), FIVE_POINT(5), SIX_POINT(6) - are available. Default is FIVE_POINT . |
region | CGRect? | Region that needs calibration. The unit is point(pt ). |
criteria | AccuracyCriteria? | Option that manages the calibration process and accuracy. Three options - DEFAULT(0), LOW(1), HIGH(2) - are available. Default is DEFAULT . |
Return Type | Description |
---|---|
Bool , BOOL | Whether it is set. |
- Swift
- Objective-C
let startedResult = tracker.startCalibration(mode: .ONE_POINT, region: CGRect(x: 100, y: 100, width: 100, height: 100))
let startedResult2 = tracker.startCalibration(mode: .ONE_POINT)
let startedResult3 = tracker.startCalibration(region: CGRect(x: 100, y: 200, width: 300, height: 400))
let startedResult4 = tracker.startCalibration()
BOOL startResult = [tracker startCalibrationWithMode:CalibrationModeONE_POINT criteria:AccuracyCriteriaDEFAULT region:[self.view frame]];
stopCalibration
- Swift
- Objective-C
@objc public func stopCalibration()
- (void)stopCalibration;
This function stops the calibration process.
- Swift
- Objective-C
tracker.stopCalibration()
[tracker stopCalibration];
isCalibrating
- Swift
- Objective-C
@objc public func isCalibrating() -> Bool
- (BOOL)isCalibrating;
It returns true when calibrating and false when not calibrating.
Return Type | Description |
---|---|
Bool , BOOL | It returns true when calibrating. |
- Swift
- Objective-C
let isCalibrating = tracker.isCalibrating()
print("Calibrating status: \(isCalibrating)")
BOOL isCalibrating = [tracker isCalibrating];
NSLog(@"Calibrating status: %d", isCalibrating);
startCollectSamples
- Swift
- Objective-C
@objc public func startCollectSamples() -> Bool
- (BOOL)startCollectSamples;
This starts collecting calibration samples for the point relayed by CalibrationDelegate.onCalibrationNextPoint; call it after displaying that point on screen.
If startCollectSamples returns false when using SwiftUI, it may be caused by a timing issue between SwiftUI and the core logic. Adding approximately 0.5 to 1 second of delay may fix the issue.
Return Type | Description |
---|---|
Bool , BOOL | Returns false if startCollectSamples is called at the wrong time. You should display the coordinates on the screen and then call startCollectSamples to proceed with calibration. |
- Swift
- Objective-C
tracker.startCollectSamples()
[tracker startCollectSamples];
setCalibrationData
- Swift
- Objective-C
@objc public func setCalibrationData(calibrationData : [Double]) -> Bool
- (BOOL)setCalibrationDataWithCalibrationData:(NSArray<NSNumber *> * _Nonnull)calibrationData;
Set existing calibration data to GazeTracker.
Parameters | Type | Description |
---|---|---|
calibrationData | [Double] , NSArray<NSNumber *> * | Calibration Data |
Return Type | Description |
---|---|
Bool , BOOL | If setCalibrationData() was called when the tracker is not tracking, it will return false. |
- Swift
- Objective-C
if (tracker.setCalibrationData(calibrationData: data)) {
    print("Loading of calibration data was successful.")
} else {
    print("Failed to load calibration data.")
}
if([tracker setCalibrationDataWithCalibrationData: data]){
NSLog(@"Loading of calibration data was successful.");
}else {
NSLog(@"Failed to load calibration data.");
}
setAttentionInterval
- Swift
- Objective-C
@objc public func setAttentionInterval(interval : Int)
- (void)setAttentionIntervalWithInterval:(NSInteger)interval;
Sets the time interval for the UserStatus Attention callback.
The UserStatus Attention score is calculated over the given time interval.
The beginning and ending timestamps are passed through the onAttention callback as timestampBegin and timestampEnd.
The interval range is 10 to 60 (seconds), and the default value is 30 seconds.
Parameters | Type | Description |
---|---|---|
interval | Int , NSInteger | Time interval for the UserStatus Attention score. |
- Swift
- Objective-C
tracker.setAttentionInterval(interval: 30)
[tracker setAttentionIntervalWithInterval:30];
getAttentionScore
- Swift
- Objective-C
@objc public func getAttentionScore() -> Double
- (double)getAttentionScore
Gets the current Attention score from the GazeTracker.
This API does not provide the timestamp of the Attention score data.
Use the onAttention callback unless the Attention score is required at a specific time or location.
Return Type | Description |
---|---|
Double , double | Attention score |
- Swift
- Objective-C
let score = tracker.getAttentionScore()
double score = [tracker getAttentionScore];
setAttentionRegion
- Swift
- Objective-C
@objc public func setAttentionRegion(region: CGRect)
- (void)setAttentionRegionWithRegion:(CGRect)region;
Sets the Attention region.
Parameters | Type | Description |
---|---|---|
region | CGRect | Region in which the Attention score is calculated. Default is the UIScreen bounds. |
- Swift
- Objective-C
...
// set position (100,100), size (100,100) Attention region.
tracker?.setAttentionRegion(region: CGRect(x:100,y:100,width:100,height:100))
...
// The 'Attention' score for the set region is output.
print("Get Attention Score : \(tracker?.getAttentionScore())")
...
// set position (100,100), size (100,100) Attention region.
[tracker setAttentionRegionWithRegion: CGRectMake(100,100,100,100)];
...
// The 'Attention' score for the set region is output.
NSLog(@"Get Attention Score : %f", [tracker getAttentionScore]);
getAttentionRegion
- Swift
- Objective-C
@objc public func getAttentionRegion() -> CGRect
- (CGRect)getAttentionRegion;
Gets the currently set Attention region.
Return Type | Description |
---|---|
CGRect | Currently set Attention region. Default is the UIScreen bounds. |
- Swift
- Objective-C
...
// set position (100,100), size (100,100) Attention region.
tracker?.setAttentionRegion(region: CGRect(x:100,y:100,width:100,height:100))
...
//The 'Attention' set region is output.
print("Get Attention Region : \(tracker?.getAttentionRegion())")
...
// set position (100,100), size (100,100) Attention region.
[tracker setAttentionRegionWithRegion: CGRectMake(100,100,100,100)];
CGRect current = [tracker getAttentionRegion];
//The 'Attention' set region is output.
NSLog(@"Get Attention region (%f, %f, %f, %f)", current.origin.x, current.origin.y, current.size.width, current.size.height);
removeAttentionRegion
- Swift
- Objective-C
@objc public func removeAttentionRegion()
- (void)removeAttentionRegion;
The currently set Attention region is removed.
- Swift
- Objective-C
tracker?.removeAttentionRegion()
if let rect = tracker?.getAttentionRegion() {
    // Weird values will be printed.
    print("Attention Region : \(rect.origin.x), \(rect.origin.y), \(rect.width), \(rect.height)")
}
[tracker removeAttentionRegion];
CGRect rect = [tracker getAttentionRegion];
// Weird values will be printed.
NSLog(@"x,y width, height : %f %f %f %f", rect.origin.x, rect.origin.y, rect.size.width, rect.size.height);
setDelegates
- Swift
- Objective-C
@objc public func setDelegates(statusDelegate: StatusDelegate? , gazeDelegate: GazeDelegate? , calibrationDelegate: CalibrationDelegate?, imageDelegate: ImageDelegate?, userStatusDelegate: UserStatusDelegate? = nil)
- (void)setDelegatesWithStatusDelegate:(id <StatusDelegate> _Nullable)statusDelegate gazeDelegate:(id <GazeDelegate> _Nullable)gazeDelegate calibrationDelegate:(id <CalibrationDelegate> _Nullable)calibrationDelegate imageDelegate:(id <ImageDelegate> _Nullable)imageDelegate userStatusDelegate:(id <UserStatusDelegate> _Nullable)userStatusDelegate;
Enrolls all delegates that inherit GazeTrackerDelegate into GazeTracker at once.
Parameter | Type | Description |
---|---|---|
statusDelegate | StatusDelegate? , id<StatusDelegate> | StatusDelegate |
gazeDelegate | GazeDelegate? , id<GazeDelegate> | GazeDelegate |
calibrationDelegate | CalibrationDelegate? , id<CalibrationDelegate> | CalibrationDelegate |
imageDelegate | ImageDelegate? , id<ImageDelegate> | ImageDelegate |
userStatusDelegate (optional) | UserStatusDelegate , id<UserStatusDelegate> | UserStatusDelegate |
- Swift
- Objective-C
// Parameters are nullable: delegates can be set selectively.
tracker.setDelegates(statusDelegate: statusDelegate, gazeDelegate: self)
// Parameters are nullable
[tracker setDelegatesWithStatusDelegate:statusDelegate gazeDelegate:gazeDelegate calibrationDelegate:nil imageDelegate:nil userStatusDelegate:userStatusDelegate];
GazeInfo
- Swift
- Objective-C
@objc public class GazeInfo : NSObject
@interface GazeInfo : NSObject
A class composed of information about eye tracking.
Summary
Variables | Type | Description |
---|---|---|
timestamp | Double | Timestamp of gaze point. The unit is millisecond. The time format is UTC. |
x | Double | x coordinate value of gaze point. Origin is device screen. The unit is point(pt ). |
y | Double | y coordinate value of gaze point. Origin is device screen. The unit is point(pt ). |
fixationX | Double | x coordinate value of last fixation point. |
fixationY | Double | y coordinate value of last fixation point. |
leftOpenness | Double | Openness degree of the left eye (0.0~1.0). The value is only returned when UserStatusOption is on. |
rightOpenness | Double | Openness degree of the right eye (0.0~1.0). The value is only returned when UserStatusOption is on. |
trackingState | TrackingState | SUCCESS , LOW_CONFIDENCE , UNSUPPORTED , FACE_MISSING |
eyeMovementState | EyeMovementState | FIXATION , SACCADE , UNKNOWN |
screenState | ScreenState | INSIDE_OF_SCREEN , OUTSIDE_OF_SCREEN , UNKNOWN |
timestamp
@objc public let timestamp : Double
Timestamp of gaze point. The unit is millisecond. The time format is UTC.
x
@objc public let x : Double
x coordinate value of gaze point. Origin is device screen. The unit is point(pt
).
y
@objc public let y : Double
y coordinate value of gaze point. Origin is device screen. The unit is point(pt
).
trackingState
@objc public let trackingState : TrackingState
See TrackingState.
eyemovementState
@objc public let eyemovementState : EyeMovementState
See EyeMovementState.
screenState
@objc public let screenState : ScreenState
See ScreenState.
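As a sketch of how GazeInfo is typically consumed inside a GazeDelegate.onGaze callback (gazePointView is a hypothetical view used to visualize the gaze point; always check trackingState before trusting the coordinates):

```swift
// Sketch: consuming GazeInfo. `gazePointView` is an assumed UIView property.
public func onGaze(gazeInfo: GazeInfo) {
    guard gazeInfo.trackingState == .SUCCESS else { return } // e.g. FACE_MISSING
    DispatchQueue.main.async {
        // x/y are in points, with the device screen as origin.
        gazePointView.center = CGPoint(x: gazeInfo.x, y: gazeInfo.y)
    }
    // fixationX/fixationY hold the last fixation point.
    print("last fixation: (\(gazeInfo.fixationX), \(gazeInfo.fixationY))")
}
```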
FaceInfo
- Swift
- Objective-C
@objc public class FaceInfo : NSObject
@interface FaceInfo : NSObject
A class composed of information about face state.
Summary
Variables | Type | Description |
---|---|---|
timestamp | Double | Timestamp of face state. The unit is millisecond. The time format is UTC. |
score | Double | Value of facial recognition confidence (0.0 ~ 1.0). |
rect | CGRect | Position and size of the face. |
pitch | Double | Pitch is the rotation around the X-axis of an object. It is also known as the attitude angle. In aviation, it is often referred to as the nose-up or nose-down angle. |
yaw | Double | Yaw is the rotation around the Y-axis of an object. It is also known as the heading angle. In aviation, it is often referred to as the bearing or compass angle. |
roll | Double | Roll is the rotation around the Z-axis of an object. It is also known as the bank angle. In aviation, it is often referred to as the wing-up or wing-down angle. |
imageSize | CGSize | Size of image for displaying face |
centerX | Double | x coordinate of face in screen |
centerY | Double | y coordinate of face in screen |
centerZ | Double | distance to face from camera |
timestamp
@objc public let timestamp : Double
Timestamp of face state. The unit is millisecond. The time format is UTC.
score
@objc public let score : Double
Value of facial recognition confidence (0.0 ~ 1.0).
rect
@objc public let rect : CGRect
It represents the location of the face with a rectangular box. The four sides of the box indicate the four positions of the face within the image.
pitch
@objc public let pitch : Double
Pitch is the rotation around the X-axis of an object. It is also known as the attitude angle. In aviation, it is often referred to as the nose-up or nose-down angle.
yaw
@objc public let yaw : Double
Yaw is the rotation around the Y-axis of an object. It is also known as the heading angle. In aviation, it is often referred to as the bearing or compass angle.
roll
@objc public let roll : Double
Roll is the rotation around the Z-axis of an object. It is also known as the bank angle. In aviation, it is often referred to as the wing-up or wing-down angle.
When the camera is positioned at the top, we defined it as 0 degrees.
centerX
@objc public let centerX : Double
x coordinate of face in screen
centerY
@objc public let centerY : Double
y coordinate of face in screen
centerZ
@objc public let centerZ : Double
distance to face from camera
imageSize
@objc public let imageSize : CGSize
Size of image for displaying face
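A small sketch of how the FaceInfo pose angles can be used, for example to decide whether the user is roughly facing the screen. The 0.8 confidence and 15 degree thresholds are arbitrary illustration values, not SDK recommendations:

```swift
// Sketch: frontal-face check from FaceInfo. Thresholds are assumptions.
func isFacingScreen(_ info: FaceInfo) -> Bool {
    guard info.score > 0.8 else { return false }   // low-confidence detection
    // pitch = X-axis rotation, yaw = Y-axis rotation
    // (0 degrees is defined with the camera at the top).
    return abs(info.pitch) < 15 && abs(info.yaw) < 15
}
```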
UserStatusOption
- Swift
- Objective-C
@objc public class UserStatusOption: NSObject
@interface UserStatusOption : NSObject
The class contains User Status
options information for GazeTracker
Summary
Functions |
---|
isUseAttention |
isUseBlink |
isUseDrowsiness |
useAttention |
useBlink |
useDrowsiness |
useAll |
isUseAttention
- Swift
- Objective-C
public func isUseAttention() -> Bool
- (BOOL) isUseAttention;
Return Type | Description |
---|---|
Bool , BOOL | Returns true if the Attention flag is on, otherwise false . |
isUseBlink
- Swift
- Objective-C
public func isUseBlink() -> Bool
- (BOOL) isUseBlink;
Return Type | Description |
---|---|
Bool , BOOL | Returns true if the Blink flag is on, otherwise false . |
isUseDrowsiness
- Swift
- Objective-C
public func isUseDrowsiness() -> Bool
- (BOOL) isUseDrowsiness;
Return Type | Description |
---|---|
Bool , BOOL | Returns true if the Drowsiness flag is on, otherwise false . |
useAttention
- Swift
- Objective-C
public func useAttention()
- (void) useAttention;
Set Attention flag.
useBlink
- Swift
- Objective-C
public func useBlink()
- (void) useBlink;
Set Blink flag.
useDrowsiness
- Swift
- Objective-C
public func useDrowsiness()
- (void) useDrowsiness;
Set Drowsiness flag.
useAll
- Swift
- Objective-C
public func useAll()
- (void) useAll;
Sets all User Status flags.
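A short sketch showing how the flag setters and getters combine, enabling only some statuses instead of useAll():

```swift
// Sketch: enable only the Attention and Blink statuses.
let option = UserStatusOption()
option.useAttention()
option.useBlink()

// The isUse* getters report which flags are on.
print(option.isUseAttention())  // true
print(option.isUseDrowsiness()) // false

// Pass the option when constructing the tracker.
GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY", delegate: self, option: option)
```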
InitializationDelegate
- Swift
- Objective-C
@objc public protocol InitializationDelegate: GazeTrackerDelegate
@protocol InitializationDelegate <GazeTrackerDelegate>
Summary
function |
---|
onInitialized |
onInitialized
- Swift
- Objective-C
@objc public func onInitialized(tracker: GazeTracker?, error: InitializationError)
- (void)onInitializedWithTracker:(GazeTracker * _Nullable)tracker error:(enum InitializationError)error;
The callback function that is invoked when the GazeTracker.initGazeTracker function is called.
It relays the constructed object when initialization succeeds, and nil when it fails.
The error will be set to the corresponding type when construction fails.
Parameter | Type | Description |
---|---|---|
tracker | GazeTracker? , GazeTracker * | Relays the GazeTracker when construction succeeds; nil if it failed. |
error | InitializationError | The enum that contains error types of initGazeTracker. |
- Swift
- Objective-C
public func onInitialized(tracker: GazeTracker?, error: InitializationError) {
    if (tracker != nil) {
        // Keep the object as a class property when initialization succeeds
        self.tracker = tracker
    } else {
        if (error == InitializationError.ERROR_INIT) {
            print("Initialization failed")
        } else if (error == InitializationError.ERROR_CAMERA_PERMISSION) {
            ...
        }
    }
}
- (void)onInitializedWithTracker:(GazeTracker *)tracker error:(enum InitializationError)error {
if (tracker != nil){
self->tracker = tracker;
}else {
if (error == InitializationErrorERROR_INIT) {
// When initialization is failed
NSLog(@"Initialization failed");
} else if (error == InitializationErrorERROR_CAMERA_PERMISSION){
// When camera permission does not exist
NSLog(@"Required permission not granted");
}
...
}
}
GazeTrackerDelegate
- Swift
- Objective-C
@objc public protocol GazeTrackerDelegate
@protocol GazeTrackerDelegate
The base protocol used by GazeTracker. All delegates in GazeTracker inherit from GazeTrackerDelegate.
StatusDelegate
- Swift
- Objective-C
@objc public protocol StatusDelegate: GazeTrackerDelegate
@protocol StatusDelegate <GazeTrackerDelegate>
onStarted
- Swift
- Objective-C
public func onStarted()
- (void)onStarted;
The function that is automatically called after GazeTracker.startTracking succeeds. Actions like calibration, preview, etc. are available after it.
- Swift
- Objective-C
public func onStarted(){
tracker.startCalibration()
}
- (void)onStarted{
[tracker startCalibrationWithMode:CalibrationModeDEFAULT region:self.view.frame];
}
onStopped
- Swift
- Objective-C
public func onStopped(error: StatusError)
- (void)onStoppedWithError:(enum StatusError)error;
The error value will be StatusError.ERROR_NONE when gaze tracking stopped because GazeTracker.stopTracking was explicitly called; other stop causes pass different error values.
Parameter | Type | Description |
---|---|---|
error | StatusError | StatusError.ERROR_NONE , StatusError.ERROR_CAMERA_START , StatusError.ERROR_CAMERA_INTERRUPT are available. |
- Swift
- Objective-C
public func onStopped(error: StatusError) {
    if (error != .ERROR_NONE) {
        // stopTracking() was not called; tracking stopped due to an error
    }
}
- (void)onStoppedWithError:(enum StatusError)error {
NSLog(@"stopped : %ld", (long)error);
}
GazeDelegate
- Swift
- Objective-C
@objc public protocol GazeDelegate: GazeTrackerDelegate
@protocol GazeDelegate <GazeTrackerDelegate>
onGaze
- Swift
- Objective-C
@objc public func onGaze(gazeInfo : GazeInfo)
- (void)onGazeWithGazeInfo:(GazeInfo * _Nonnull)gazeInfo;
Parameter | Type | Description |
---|---|---|
gazeInfo | GazeInfo | See GazeInfo. |
- Swift
- Objective-C
public func onGaze(gazeInfo : GazeInfo){
if ( gazeInfo.trackingState == TrackingState.SUCCESS){
// Gaze tracking succeed after calibration
}else {
// Face Missing
}
}
- (void)onGazeWithGazeInfo:(GazeInfo * _Nonnull)gazeInfo {
if([gazeInfo trackingState] == TrackingStateSUCCESS){
// Gaze tracking succeed after calibration
}else {
// Face Missing
}
}
CalibrationDelegate
- Swift
- Objective-C
@objc public protocol CalibrationDelegate: GazeTrackerDelegate
@protocol CalibrationDelegate <GazeTrackerDelegate>
CalibrationDelegate is never called on the main queue. Use DispatchQueue.main.async when updating the UI.
onCalibrationNextPoint
- Swift
- Objective-C
@objc public func onCalibrationNextPoint(x: Double, y: Double)
- (void)onCalibrationNextPointWithX:(double)x y:(double)y;
The x, y coordinate value of the gaze point that should be focused during the calibration process.
You should call startCollectSamples to continue the calibration process.
Parameter | Type | Description |
---|---|---|
x | Double , double | The x coordinate value of the gaze point that should be focused during the calibration process. Origin is the device screen. The unit is point(pt ) |
y | Double , double | The y coordinate value of the gaze point that should be focused during the calibration process. Origin is the device screen. The unit is point(pt ) |
- Swift
- Objective-C
public func onCalibrationNextPoint(x: Double, y: Double){
// Display UI at given point
// Call startCollectSamples() like below to proceed with calibration
tracker.startCollectSamples()
}
- (void)onCalibrationNextPointWithX:(double)x y:(double)y {
// Display UI at given point
// Call startCollectSamples() like below to proceed with calibration
[tracker startCollectSamples];
}
onCalibrationProgress
- Swift
- Objective-C
@objc public func onCalibrationProgress(progress: Double)
- (void)onCalibrationProgressWithProgress:(double)progress;
Progress will be between 0.0~1.0.
The next point will be guided when the value reaches 1.0.
Parameter | Type | Description |
---|---|---|
progress | Double , double | Calibration progression for each point. |
- Swift
- Objective-C
public func onCalibrationProgress(progress: Double) {
    progressView.setProgress(Float(progress), animated: true)
}
- (void)onCalibrationProgressWithProgress:(double)progress {
[progressView setProgress:progress];
}
onCalibrationFinished
- Swift
- Objective-C
@objc public func onCalibrationFinished(calibrationData: [Double])
- (void)onCalibrationFinishedWithCalibrationData:(NSArray<NSNumber *> * _Nonnull)calibrationData;
Callback notifying that calibration has ended. When this function is called, the calibration UI should be removed. After this callback, data from GazeDelegate will be calibrated gaze data.
The calibrationData passed as a parameter has already been applied to GazeTracker; you can save it and load it directly into GazeTracker without a new calibration by calling setCalibrationData(calibrationData) when restarting the app, etc.
parameter | Type | Description |
---|---|---|
calibrationData | [Double] , NSArray<NSNumber *> * | Calibration Data |
- Swift
- Objective-C
public func onCalibrationFinished(calibrationData: [Double]) {
//Remove calibration UI
removeCalibrationUI()
//save calibration Data
self.calibrationData = calibrationData
}
- (void)onCalibrationFinishedWithCalibrationData:(NSArray<NSNumber *> * _Nonnull)calibrationData {
    // Remove calibration UI
    [self removeCalibrationUI];
    // Save calibration data
    self.calibrationData = calibrationData;
}
UserStatusDelegate
- Swift
- Objective-C
@objc public protocol UserStatusDelegate: GazeTrackerDelegate
@protocol UserStatusDelegate <GazeTrackerDelegate>
- Attention: How much the user's attention is focused on the screen content during the interval time (0.0 ~ 1.0)
- Drowsiness: Whether the user feels drowsy (true/false)
- Blink: Whether the user blinks (left eye, right eye, general (both eyes))
Summary
Functions |
---|
onAttention |
onBlink |
onDrowsiness |
onAttention
- Swift
- Objective-C
@objc public func onAttention(timestampBegin: Int, timestampEnd: Int, score: Double)
@optional
- (void)onAttentionWithTimestampBegin:(NSInteger)timestampBegin timestampEnd:(NSInteger)timestampEnd score:(double)score;
- The timestamp range of the data is passed as timestampBegin and timestampEnd in the onAttention callback.
- The default time interval is 30 seconds.
- If the user attention level is low, score in the onAttention callback will be close to 0.0.
- If the user attention level is high, score in the onAttention callback will be close to 1.0.
Parameters | Type | Description |
---|---|---|
timestampBegin | Int , NSInteger | Beginning Timestamp of the data. |
timestampEnd | Int , NSInteger | Ending Timestamp of the data. |
score | Double , double | User Attention rate score between the timestamps. |
- Swift
- Objective-C
func onAttention(timestampBegin: Int, timestampEnd: Int, score: Double) {
attentionView.text = "Attention: " + String(round(score * 10000) / 10000)
}
- (void)onAttentionWithTimestampBegin:(NSInteger)timestampBegin timestampEnd:(NSInteger)timestampEnd score:(double)score {
[attentionView setText: [NSString stringWithFormat: @"Attention : %f", ((double)((NSInteger)(score * 1000))/1000.0)]];
}
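Because the window arrives as raw integer timestamps, it can be useful to convert it for logging. A small sketch (the SDK provides no such utility; the millisecond/UTC unit is assumed to match the timestamp description in `onImage`):

```swift
import Foundation

// Sketch: converting an onAttention window into Dates and a duration.
// Assumes the timestamps are UTC milliseconds, as with onImage.
func attentionWindow(timestampBegin: Int, timestampEnd: Int)
    -> (start: Date, end: Date, seconds: Double) {
    let start = Date(timeIntervalSince1970: Double(timestampBegin) / 1000.0)
    let end = Date(timeIntervalSince1970: Double(timestampEnd) / 1000.0)
    return (start, end, Double(timestampEnd - timestampBegin) / 1000.0)
}
```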
onBlink
- Swift
- Objective-C
@objc public func onBlink(timestamp: Int, isBlinkLeft: Bool, isBlinkRight: Bool, isBlink: Bool, leftOpenness: Double, rightOpenness: Double)
@optional
- (void)onBlinkWithTimestamp:(NSInteger)timestamp isBlinkLeft:(BOOL)isBlinkLeft isBlinkRight:(BOOL)isBlinkRight isBlink:(BOOL)isBlink leftOpenness:(double)leftOpenness rightOpenness:(double)rightOpenness;
- The timestamp of the data is passed as `timestamp` in the `onBlink` callback.
- If the user blinks the left eye, `isBlinkLeft` in the `onBlink` callback will be `true`.
- If the user blinks the right eye, `isBlinkRight` in the `onBlink` callback will be `true`.
- If the user blinks both eyes, `isBlink` in the `onBlink` callback will be `true` (this is the general blink condition).
- If the user's eyes are wide open, `leftOpenness`/`rightOpenness` in the `onBlink` callback will be close to 1.0.
- If the user's eyes are narrow, `leftOpenness`/`rightOpenness` in the `onBlink` callback will be close to 0.0.

`eyeOpenness` has been separated into `leftOpenness`/`rightOpenness`.
Parameters | Type | Description |
---|---|---|
timestamp | Int , NSInteger | Timestamp of the data. |
isBlinkLeft | Bool , BOOL | User Left Blink flag. |
isBlinkRight | Bool , BOOL | User Right Blink flag. |
isBlink | Bool , BOOL | User Blink flag. |
leftOpenness | Double , double | User Left EyeOpenness rate. |
rightOpenness | Double , double | User Right EyeOpenness rate. |
- Swift
- Objective-C
func onBlink(timestamp: Int, isBlinkLeft: Bool, isBlinkRight: Bool, isBlink: Bool, leftOpenness: Double, rightOpenness: Double) {
blinkView.text = "Blink: " + String(isBlink)
blinkLeftView.text = "Blink Left: " + String(isBlinkLeft)
blinkRightView.text = "Blink Right: " + String(isBlinkRight)
}
- (void)onBlinkWithTimestamp:(NSInteger)timestamp isBlinkLeft:(BOOL)isBlinkLeft isBlinkRight:(BOOL)isBlinkRight isBlink:(BOOL)isBlink leftOpenness:(double)leftOpenness rightOpenness:(double)rightOpenness {
[blinkView setText:[NSString stringWithFormat:@"Blink: %d", isBlink]];
[blinkLeftView setText:[NSString stringWithFormat:@"Blink Left: %d", isBlinkLeft]];
[blinkRightView setText:[NSString stringWithFormat:@"Blink Right: %d", isBlinkRight]];
}
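The relationship between the openness values and the blink flags can be sketched with a pure helper. This is illustrative only: the 0.3 threshold is made up, and the SDK computes its own `isBlink*` flags.

```swift
// Sketch: how openness values (0.0 = closed, 1.0 = wide open)
// relate to blink flags. The 0.3 threshold is hypothetical;
// the SDK reports its own isBlinkLeft/isBlinkRight/isBlink.
struct BlinkFlags {
    let left: Bool
    let right: Bool
    let general: Bool  // both eyes closed
}

func blinkFlags(leftOpenness: Double,
                rightOpenness: Double,
                threshold: Double = 0.3) -> BlinkFlags {
    let left = leftOpenness < threshold
    let right = rightOpenness < threshold
    return BlinkFlags(left: left, right: right, general: left && right)
}
```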
onDrowsiness
- Swift
- Objective-C
@objc public func onDrowsiness(timestamp: Int, isDrowsiness: Bool, intensity: Double)
@optional
- (void)onDrowsinessWithTimestamp:(NSInteger)timestamp isDrowsiness:(BOOL)isDrowsiness intensity:(double)intensity;
- The timestamp of the data is passed as `timestamp` in the `onDrowsiness` callback.
- If the user feels drowsy, `isDrowsiness` in the `onDrowsiness` callback will be `true`; otherwise, `isDrowsiness` will be `false`.
Parameters | Type | Description |
---|---|---|
timestamp | Int , NSInteger | Timestamp of the data. |
isDrowsiness | Bool , BOOL | User Drowsiness flag. |
intensity | Double , double | Drowsiness intensity. |
- Swift
- Objective-C
func onDrowsiness(timestamp: Int, isDrowsiness: Bool, intensity: Double) {
drowsinessView.text = "Drowsiness: " + String(isDrowsiness)
}
- (void)onDrowsinessWithTimestamp:(NSInteger)timestamp isDrowsiness:(BOOL)isDrowsiness intensity:(double)intensity{
[drowsinessView setText:[NSString stringWithFormat:@"Drowsiness : %d", isDrowsiness]];
}
ImageDelegate
- Swift
- Objective-C
@objc public protocol ImageDelegate: GazeTrackerDelegate
@protocol ImageDelegate <GazeTrackerDelegate>
onImage
- Swift
- Objective-C
@objc public func onImage(timestamp: Double, image: CMSampleBuffer)
- (void)onImageWithTimestamp:(double)timestamp image:(CMSampleBufferRef _Nonnull)image;
The function that provides the image in CMSampleBuffer form.
Parameter | Type | Description |
---|---|---|
timestamp | Double , double | The timestamp of camera image creation. The unit is milliseconds. The time format is UTC. |
image | CMSampleBuffer , CMSampleBufferRef | CMSampleBuffer type image from the camera (kCVPixelFormatType_32BGRA , AVCaptureSession.Preset.vga640x480 ). The image is rotated 90° counterclockwise from portrait orientation. |
- Swift
- Objective-C
public func onImage(timestamp: Double, image: CMSampleBuffer){
// SAMPLE: save image as jpeg with proper function
if imageSave {
writeJPEGFile(image: image);
}
}
- (void)onImageWithTimestamp:(double)timestamp image:(CMSampleBufferRef)image {
// SAMPLE: save image as jpeg with proper function
if (imageSave){
[self writeJPEGFileImage:image];
}
}
FaceDelegate
- Swift
- Objective-C
@objc public protocol FaceDelegate: GazeTrackerDelegate
@protocol FaceDelegate <GazeTrackerDelegate>
onFace
- Swift
- Objective-C
@objc public func onFace(faceInfo: FaceInfo)
- (void)onFaceWithFaceInfo:(FaceInfo * _Nonnull)info
Parameter | Type | Description |
---|---|---|
faceInfo | FaceInfo | See FaceInfo. |
- Swift
- Objective-C
public func onFace(faceInfo: FaceInfo) {
    faceScoreLabel.text = "score = \(faceInfo.score)"
}
- (void)onFaceWithFaceInfo:(FaceInfo * _Nonnull)info {
    [faceScoreLabel setText:[NSString stringWithFormat:@"score = %f", info.score]];
}
InitializationError
- Swift
- Objective-C
@objc public enum InitializationError: Int
enum InitializationError : NSInteger;
The Enum that contains error types of InitializationDelegate.
Please read Authentication for more details.
StatusError
- Swift
- Objective-C
@objc public enum StatusError: Int
enum StatusError : NSInteger;
The enum that contains error types of StatusDelegate
Constant | Value | Description |
---|---|---|
ERROR_NONE | 0 | GazeTracker.stopTracking call succeeded without error. |
ERROR_CAMERA_START | 1 | Error code that occurs when GazeTracker.startTracking is called but the device's front camera is not available. |
ERROR_CAMERA_INTERRUPT | 2 | Error code that occurs when the camera is unavailable. |
TrackingState
- Swift
- Objective-C
@objc public enum TrackingState: Int
enum TrackingState : NSInteger;
The enum that contains state types used in GazeDelegate.
Constant | Value | Description |
---|---|---|
SUCCESS | 0 | Face alignment is in the best position (gaze tracking succeeded, with valid x and y). |
LOW_CONFIDENCE | 1 | Face alignment is not in the best position; should not be used for precise gaze tracking (gaze tracking succeeded, with less accurate x and y). |
UNSUPPORTED | 2 | Face alignment is not suitable for tracking (gaze tracking failed, with invalid x and y). |
FACE_MISSING | 3 | Face is missing (gaze tracking failed). |
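In practice, only SUCCESS and LOW_CONFIDENCE carry usable coordinates. A local Swift mirror of the raw values above makes this explicit (illustrative only; the real enum ships with the SDK):

```swift
// Illustrative mirror of TrackingState's raw values from the table above.
// The real TrackingState enum is provided by the SDK.
enum TrackingState: Int {
    case success = 0
    case lowConfidence = 1
    case unsupported = 2
    case faceMissing = 3

    /// Only SUCCESS and LOW_CONFIDENCE deliver valid x/y coordinates.
    var hasValidGazePoint: Bool {
        self == .success || self == .lowConfidence
    }
}
```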
EyeMovementState
- Swift
- Objective-C
@objc public enum EyeMovementState: Int
enum EyeMovementState : NSInteger;
The enum that contains state types used in GazeDelegate.
Constants | Value | Description |
---|---|---|
FIXATION | 0 | The gazes from the past up to the present form a fixation. |
SACCADE | 2 | The gazes from the past up to the present form a saccade. |
UNKNOWN | 3 | Neither fixation nor saccade. |
ScreenState
- Swift
- Objective-C
@objc public enum ScreenState: Int
enum ScreenState : NSInteger;
The enum that contains state types used in GazeDelegate.
Constants | Value | Description |
---|---|---|
INSIDE_OF_SCREEN | 0 | Gaze tracking succeeded and the gaze point is inside the device screen. |
OUTSIDE_OF_SCREEN | 1 | Gaze tracking succeeded and the gaze point is outside the device screen. |
UNKNOWN | 2 | Gaze tracking failed. |
CalibrationMode
- Swift
- Objective-C
@objc public enum CalibrationMode: Int
enum CalibrationMode : NSInteger;
The enum that contains mode types of GazeTracker.startCalibration .
Constants | Value | Description |
---|---|---|
DEFAULT | 0 | Default. Represents FIVE_POINT . |
ONE_POINT | 1 | One-point calibration mode. |
FIVE_POINT | 5 | Five-point calibration mode. |
SIX_POINT | 6 | Six-point calibration mode. |
AccuracyCriteria
- Swift
- Objective-C
@objc public enum AccuracyCriteria: Int
enum AccuracyCriteria : NSInteger;
The enum that contains accuracy criteria of GazeTracker.startCalibration.
Constants | Value | Description |
---|---|---|
DEFAULT | 0 | Default calibration accuracy criteria. |
LOW | 1 | Low calibration accuracy criteria. |
HIGH | 2 | High calibration accuracy criteria. |
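The two enums are typically passed together when starting calibration. The sketch below mirrors the raw values from the two tables above purely for illustration (the real enums ship with the SDK), and the commented call signature is assumed, not verified:

```swift
// Illustrative mirrors of the raw values from the two tables above;
// the real CalibrationMode and AccuracyCriteria enums ship with the SDK.
enum CalibrationMode: Int {
    case `default` = 0, onePoint = 1, fivePoint = 5, sixPoint = 6

    /// Per the table above, DEFAULT represents FIVE_POINT.
    var effectivePointCount: Int {
        self == .default ? CalibrationMode.fivePoint.rawValue : rawValue
    }
}

enum AccuracyCriteria: Int {
    case `default` = 0, low = 1, high = 2
}

// A call would then look something like (signature assumed):
// tracker.startCalibration(mode: .fivePoint, criteria: .high)
```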