Version: Eyedid Beta

API Docs

Overview

Name | Type | Description
EyedidFlutter | class | The main class for utilizing gaze tracking functionality within the eyedid_flutter package.
GazeTrackerOptions | class | Provides options for configuring the Gaze Tracker.
GazeTrackerOptionsBuilder | class | A builder class for constructing instances of GazeTrackerOptions.
CameraPosition | class | Represents the camera position for Android devices.
InitializedResult | class | Represents the result of the Eyedid GazeTracker initialization.
MetricsInfo | class | Designed to collect and structure data from the tracking event stream.
GazeInfo | class | Represents the data collected from gaze tracking events, providing detailed information about the user's gaze and related states.
FaceInfo | class | Represents the data collected from face tracking events, providing information about the detected face and its orientation in space.
BlinkInfo | class | Represents the data related to eye blink detection, providing information about whether a blink occurred and the openness level of each eye.
UserStatusInfo | class | Provides information about the user's status, such as drowsiness and attention level, based on gaze and eye tracking data.
StatusInfo | class | A class representing status information obtained from a status event.
CalibrationInfo | class | A class that represents calibration status information for the Eyedid SDK.
OneEuroFilterManager | class | Manages a collection of OneEuroFilters for filtering multiple values.
CalibrationMode | enum | An enumeration representing the calibration modes for calibration processes.
CalibrationCriteria | enum | An enum representing different calibration criteria or levels.
TrackingState | enum | Enum representing different states of gaze tracking.
EyemovementState | enum | An enumeration representing different types of eye movement states.
ScreenState | enum | An enumeration representing different states of gaze position in relation to the device screen.
StatusType | enum | An enumeration representing different types of status events for gaze tracking.
StatusErrorType | enum | An enumeration representing different types of error scenarios when gaze tracking stops.
CalibrationType | enum | An enumeration representing the different stages within a calibration process.
EyedidDeviceOrientation | enum | An enumeration representing device orientations used to adjust gaze coordinates (iOS only).

EyedidFlutter

class EyedidFlutter 

The EyedidFlutter class is the main interface for integrating gaze tracking and calibration features using the Eyedid SDK in a Flutter application.

This class provides methods for initializing the SDK, starting and stopping gaze tracking, and accessing various events related to gaze tracking and calibration.
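
For orientation, the snippet below sketches a typical lifecycle (permission check, initialization, event subscription, tracking). It assumes an EyedidFlutter instance named _eyedidFlutterPlugin and a valid license key in _licenseKey, as in the examples further down; the wrapper function name is illustrative.

// Minimal lifecycle sketch: permission -> init -> listen -> track.
Future<void> runGazeTracking() async {
  var granted = await _eyedidFlutterPlugin.checkCameraPermission();
  if (!granted) {
    granted = await _eyedidFlutterPlugin.requestCameraPermission();
    if (!granted) return; // tracking is impossible without camera access
  }

  final result = await _eyedidFlutterPlugin.initGazeTracker(licenseKey: _licenseKey);
  if (result == null || !result.result) return;

  _eyedidFlutterPlugin.getTrackingEvent().listen((event) {
    final info = MetricsInfo(event);
    if (info.gazeInfo.trackingState == TrackingState.success) {
      print("gaze: ${info.gazeInfo.gaze}");
    }
  });

  await _eyedidFlutterPlugin.startTracking();
}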

Summary

Function | Camera Permission Required | Description
checkCameraPermission | No | Checks if the camera permission is granted for the Eyedid SDK.
requestCameraPermission | No | Requests camera permission and returns whether the permission is granted.
initGazeTracker | Yes | Initializes the GazeTracker with the given license key and options.
releaseGazeTracker | Yes | Releases the GazeTracker resources.
getPlatformVersion | No | Retrieves the version of the Eyedid SDK for the current platform.
startTracking | Yes | Starts the gaze tracking process.
stopTracking | Yes | Stops the ongoing gaze tracking process.
isTracking | No | Checks if gaze tracking is currently active.
hasCameraPositions | No | Checks if the required device information is available for the Eyedid SDK (Android only).
addCameraPosition | No | Adds camera position information to the Eyedid SDK (Android only).
getCameraPosition | No | Retrieves the current camera position used by the Eyedid SDK (Android only).
getCameraPositionList | No | Retrieves the list of current camera positions used by the Eyedid SDK (Android only).
selectCameraPosition | No | Selects a specific camera position in the Eyedid SDK by index (Android only).
setTrackingFPS | No | Sets the frames per second (FPS) for gaze tracking in the Eyedid SDK.
startCalibration | No | Initiates the calibration process in the Eyedid SDK.
stopCalibration | No | Stops the current calibration process in the Eyedid SDK.
isCalibrating | No | Checks if the Eyedid SDK is currently performing calibration.
startCollectSamples | No | Informs the Eyedid SDK to start collecting samples for the current calibration target.
setCalibrationData | No | Sets calibration data in the Eyedid SDK for refinement.
setAttentionRegion | No | Sets the region of interest (ROI) for the Attention feature in the Eyedid SDK.
getAttentionRegion | No | Retrieves the currently set region of interest (ROI) for the Attention feature.
removeAttentionRegion | No | Removes the region of interest (ROI) for the Attention feature.
setForcedOrientation | No | Manually changes the device orientation in the Eyedid SDK (iOS only).
resetForcedOrientation | No | Resets the forced orientation in the Eyedid SDK (iOS only).
getTrackingEvent | No | Returns an event stream that provides camera tracking data.
getCalibrationEvent | No | Returns a stream of calibration events from the Eyedid SDK.
getStatusEvent | No | Returns an event stream that provides status updates for the Eyedid GazeTracker.

checkCameraPermission

Future<bool> checkCameraPermission() async

Checks if the camera permission is granted for the Eyedid SDK. The check is performed asynchronously.

Return Type | Description
Future<bool> | true if the camera permission has been granted. false if the camera permission has not been granted or is denied.

Example

_hasCameraPermission = await _eyedidFlutterPlugin.checkCameraPermission();

if (!_hasCameraPermission) {
  print("Camera permission is not granted. Please enable camera access in the device settings.");
}

requestCameraPermission

Future<bool> requestCameraPermission() async

Requests camera permission and returns whether the permission is granted. This function asynchronously requests camera permission from the user.

Return Type | Description
Future<bool> | true if the user grants the camera permission. false if the user denies the camera permission or if the permission was previously denied.

Example

_hasCameraPermission = await _eyedidFlutterPlugin.checkCameraPermission();

if (!_hasCameraPermission) {
  _hasCameraPermission =
      await _eyedidFlutterPlugin.requestCameraPermission();
}

initGazeTracker

FutureOr<InitializedResult?> initGazeTracker(
    {required String licenseKey, GazeTrackerOptions? options})

Initializes the GazeTracker with the given license key and options.

This function asynchronously initializes the GazeTracker using the provided licenseKey and optional options. If no options are provided, default options are created using GazeTrackerOptionsBuilder.

Parameters | Type | Description
licenseKey | String | licenseKey is required and is used to authorize the GazeTracker.
options | GazeTrackerOptions? | options are the optional configuration settings for the GazeTracker. If null, default options are used.

Return Type | Description
FutureOr<InitializedResult?> | Returns a FutureOr containing an InitializedResult, which indicates whether the initialization was successful. If the result is null, the initialization failed or was interrupted.

Example

final options = GazeTrackerOptionsBuilder()
    .setPreset(CameraPreset.vga640x480)
    .setUseGazeFilter(false)
    .setUseBlink(false)
    .setUseUserStatus(false)
    .build();
final result = await _eyedidFlutterPlugin.initGazeTracker(
    licenseKey: _licenseKey, options: options);

releaseGazeTracker

Future<void> releaseGazeTracker() async

Releases the GazeTracker resources. This function asynchronously releases all resources associated with the GazeTracker. It should be called when the GazeTracker is no longer needed to free up memory and system resources.

Example

final isTracking = await _eyedidFlutterPlugin.isTracking() ?? false;

if (!isTracking) {
  await _eyedidFlutterPlugin.releaseGazeTracker();
  print("Released Eyedid.");
}

getPlatformVersion

Future<String?> getPlatformVersion()

Retrieves the version of the Eyedid SDK for the current platform. This method returns the version of the Eyedid eye-tracking module for the specific platform (iOS/Android). The version string provides insight into the currently installed release of the Eyedid SDK.

Return Type | Description
Future<String?> | Returns a Future that resolves with the version string, or null if the version cannot be retrieved.

Example

try {
  platformVersion = await _eyedidFlutterPlugin.getPlatformVersion() ?? 'Unknown platform version';
} on PlatformException catch (error) {
  print(error);
  platformVersion = 'Failed to get platform version.';
}

startTracking

Future<void> startTracking() async

Starts the gaze tracking process. This function initiates tracking of the user's gaze. Before calling this function, ensure that the gaze tracker has been initialized using initGazeTracker. If the camera permission has not been granted or if the gaze tracker has not been initialized correctly, this function may throw a PlatformException.

Example

final initialized = await _eyedidFlutterPlugin.initGazeTracker(
    licenseKey: _licenseKey, options: options);
if (initialized != null && initialized.result == true) {
  _eyedidFlutterPlugin.startTracking();
}

stopTracking

Future<void> stopTracking() async

Stops the ongoing gaze tracking process. This function halts the current gaze tracking operation. Before calling this function, ensure that the gaze tracker has been initialized using initGazeTracker.

Throws a PlatformException if the camera permission is not granted or if the gaze tracker has not been properly initialized.

Example

try {
  _eyedidFlutterPlugin.stopTracking(); // Call the function to stop tracking
} on PlatformException catch (e) {
  setState(() {
    _stateString = "PlatformException occurred (${e.message})";
  });
}

isTracking

Future<bool?> isTracking() async

Checks if gaze tracking is currently active.

Return Type | Description
Future<bool?> | true if tracking is active. false if tracking is inactive or not properly initialized.

Example

final isTracking = await _eyedidFlutterPlugin.isTracking() ?? false;
setState(() {
  _showGaze = isTracking;
});

hasCameraPositions

Future<bool?> hasCameraPositions() async
note

This function is Android-specific.

Checks if the required device information is available for the Eyedid SDK. Determines if the necessary device information is present for optimal gaze tracking.

Return Type | Description
Future<bool?> | true if the device information is available. false if it is not available.

Example

final hasCameraPositions = await _eyedidFlutterPlugin.hasCameraPositions() ?? false;

if (!hasCameraPositions) {
  print("Camera position information is missing, which may affect the accuracy of gaze tracking.");
}

addCameraPosition

Future<void> addCameraPosition(CameraPosition cameraPosition) async
note

This function is Android-specific.

Adds camera position information to the Eyedid SDK. Provides the necessary camera position data to improve gaze tracking accuracy. The cameraPosition parameter contains the relevant data.

Parameters | Type | Description
cameraPosition | CameraPosition | The camera position data.

Example

final custom = CameraPosition(
    modelName: "CustomDevice",
    screenWidth: 1280,
    screenHeight: 1920,
    screenOriginX: -35,
    screenOriginY: 5,
    cameraOnLongerAxis: false);

await _eyedidFlutterPlugin.addCameraPosition(custom);
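
A plausible guard flow (not taken from the SDK samples) is to call addCameraPosition only when hasCameraPositions reports that no device information is available; the fallback values below are illustrative placeholders, not measured device data.

// Register a fallback camera position only when the SDK lacks
// built-in data for this device (Android only). Values are placeholders.
final hasPositions = await _eyedidFlutterPlugin.hasCameraPositions() ?? false;
if (!hasPositions) {
  final fallback = CameraPosition(
      modelName: "UnknownDevice",
      screenWidth: 1080,
      screenHeight: 2400,
      screenOriginX: -30,
      screenOriginY: 3);
  await _eyedidFlutterPlugin.addCameraPosition(fallback);
}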

getCameraPosition

Future<CameraPosition?> getCameraPosition() async
note

This function is Android-specific.

Retrieves the current camera position used by the Eyedid SDK.

Return Type | Description
Future<CameraPosition?> | Returns the CameraPosition currently in use, or null if unavailable.

Example

final current = await _eyedidFlutterPlugin.getCameraPosition();

if (current != null) {
  print("current camera position [${current.toString()}]");
}

getCameraPositionList

Future<List<CameraPosition>?> getCameraPositionList() async
note

This function is Android-specific.

Retrieves the list of current camera positions used by the Eyedid SDK.

Return Type | Description
Future<List<CameraPosition>?> | Returns a list of CameraPosition objects, or null if unavailable.

Example

final currentList = await _eyedidFlutterPlugin.getCameraPositionList();

if (currentList != null) {
  for (var cp in currentList) {
    print("current camera position [${cp.toString()}]");
  }
}

selectCameraPosition

Future<void> selectCameraPosition(int idx) async
note

This function is Android-specific.

Selects a specific camera position in the Eyedid SDK by index. Chooses a camera position from the available list using the provided idx.

Parameters | Type | Description
idx | int | The index of the camera position to select.

Example

final screenSize = MediaQuery.of(context).size;
final screenWidth = screenSize.width;
final screenHeight = screenSize.height;

// Example: Galaxy Fold's default screen resolution conditions
if (screenWidth == 1768 && screenHeight == 2208) {
  // Using the internal main screen
  await _eyedidFlutterPlugin.selectCameraPosition(0); // Main camera position
  print("Main camera position selected for Galaxy Fold (Main Screen).");
} else if (screenWidth == 832 && screenHeight == 2260) {
  // Using the external cover screen
  await _eyedidFlutterPlugin.selectCameraPosition(1); // Cover camera position
  print("Cover camera position selected for Galaxy Fold (Cover Screen).");
}

setTrackingFPS

Future<bool?> setTrackingFPS(int fps) async

Sets the frames per second (FPS) for gaze tracking in the Eyedid SDK. Specifies the desired FPS for gaze tracking. By default, the FPS is set to 30 for the front camera, but higher values can be set if supported by the device.

Parameters | Type | Description
fps | int | The desired frames per second for gaze tracking.

Return Type | Description
Future<bool?> | true if the FPS is successfully updated. false if the FPS is out of range (below 1) or if an error occurs.

Example

final result = await _eyedidFlutterPlugin.setTrackingFPS(15) ?? false;

if (result) {
  print("Tracking FPS set to 15 successfully.");
}

startCalibration

Future<void> startCalibration(CalibrationMode calibrationMode,
    {CalibrationCriteria calibrationCriteria = CalibrationCriteria.standard,
    Rect? region}) async

Initiates the calibration process in the Eyedid SDK.

Parameters | Type | Description
calibrationMode | CalibrationMode | The calibration mode to use (one or five target points).
calibrationCriteria | CalibrationCriteria | The criteria for calibration (default is CalibrationCriteria.standard).
region | Rect? | The screen region where targets will appear (default is the full screen).

Example

// 5 point target calibration
_eyedidFlutterPlugin.startCalibration(CalibrationMode.five);
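
For the optional parameters, a hedged sketch is shown below; the criteria and region values are arbitrary and only illustrate how the named parameters are passed (the region is assumed to be a regular Flutter Rect in logical pixels).

// Five-point calibration with high accuracy, restricted to the upper half
// of the screen. The Rect values are illustrative only.
final screenSize = MediaQuery.of(context).size;
final upperHalf = Rect.fromLTWH(0, 0, screenSize.width, screenSize.height / 2);
await _eyedidFlutterPlugin.startCalibration(CalibrationMode.five,
    calibrationCriteria: CalibrationCriteria.high, region: upperHalf);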

stopCalibration

Future<void> stopCalibration() async

Stops the current calibration process in the Eyedid SDK.

Any progress made before stopping will not be saved.

Example

await _eyedidFlutterPlugin.stopCalibration();

setState(() {
  _calibrationShowing = false;
});

isCalibrating

Future<bool?> isCalibrating() async

Checks if the Eyedid SDK is currently performing calibration.

Return Type | Description
Future<bool?> | true if calibration is in progress. false if calibration is not active or if an error occurs.

Example

final isCalibrating = await _eyedidFlutterPlugin.isCalibrating() ?? false;

if (isCalibrating) {
  await _eyedidFlutterPlugin.stopCalibration();

  setState(() {
    _calibrationShowing = false;
  });
}

startCollectSamples

Future<void> startCollectSamples() async

Informs the Eyedid SDK to start collecting samples for the current calibration target. Call this function when the calibration target is displayed on the UI and you are ready to collect calibration data.

Example

_eyedidFlutterPlugin.getCalibrationEvent().listen((event) {
  final info = CalibrationInfo(event);
  if (info.type == CalibrationType.nextPoint) {
    setState(() {
      _nextX = info.next!.x;
      _nextY = info.next!.y;
      _calibrationProgress = 0.0;
    });
    Future.delayed(const Duration(milliseconds: 500), () {
      _eyedidFlutterPlugin.startCollectSamples();
    });
  }
});

setCalibrationData

Future<void> setCalibrationData(List<double> data) async

Sets calibration data in the Eyedid SDK for refinement.

Provides the Eyedid SDK with calibration data for adjustment or refinement.

Parameters | Type | Description
data | List<double> | A list of doubles containing the calibration data.

Example

final data = await loadCalibrationData();

await _eyedidFlutterPlugin.setCalibrationData(data);
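
One plausible pattern, sketched below rather than prescribed by the SDK, is to cache the data delivered with CalibrationType.finished and feed it back through setCalibrationData in a later session; the field and function names are illustrative, and persistence to disk is omitted.

// Cache calibration data when calibration finishes, then restore it later
// instead of recalibrating.
List<double>? _savedCalibrationData;

void listenForCalibrationData() {
  _eyedidFlutterPlugin.getCalibrationEvent().listen((event) {
    final info = CalibrationInfo(event);
    if (info.type == CalibrationType.finished && info.data != null) {
      _savedCalibrationData = info.data; // keep for the next session
    }
  });
}

Future<void> restoreCalibration() async {
  final data = _savedCalibrationData;
  if (data != null) {
    await _eyedidFlutterPlugin.setCalibrationData(data);
  }
}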

setAttentionRegion

Future<void> setAttentionRegion(Rect region) async

Sets the region of interest (ROI) for the Attention feature in the Eyedid SDK.

Parameters | Type | Description
region | Rect | Specifies the area of interest as a Rect on the device screen.

Example

// roi is x : 10, y : 10, width : 100, height : 100
Rect roi = Rect.fromLTWH(10, 10, 100, 100);
await _eyedidFlutterPlugin.setAttentionRegion(roi);

getAttentionRegion

Future<Rect?> getAttentionRegion() async

Retrieves the currently set region of interest (ROI) for the Attention feature.

Return Type | Description
Future<Rect?> | The current Rect representing the ROI, or null if not set.

Example


final roi = await _eyedidFlutterPlugin.getAttentionRegion();

if (roi != null) {
  print("roi is ${roi.toString()}");
}

removeAttentionRegion

Future<void> removeAttentionRegion() async

Removes the region of interest (ROI) for the Attention feature.

After removal, attention-related information will no longer be provided.

Example

final roi = await _eyedidFlutterPlugin.getAttentionRegion();

if (roi != null) {
  await _eyedidFlutterPlugin.removeAttentionRegion();
}

setForcedOrientation

Future<bool?> setForcedOrientation(EyedidDeviceOrientation orientation) async
note

This function is iOS-specific.

Manually changes the device orientation in the Eyedid SDK.

Parameters | Type | Description
orientation | EyedidDeviceOrientation | Changes the device orientation using EyedidDeviceOrientation.

Return Type | Description
Future<bool?> | true if successful. false if failed or unsupported.

Example

final result = await _eyedidFlutterPlugin.setForcedOrientation(EyedidDeviceOrientation.portrait) ?? false;
if (result) {
  print("set portrait orientation.");
}

resetForcedOrientation

Future<bool?> resetForcedOrientation() async
note

This function is iOS-specific.

Resets the forced orientation in the Eyedid SDK.

After calling, the device's orientation will revert to automatic.

Return Type | Description
Future<bool?> | true if the reset was successful. false if failed or unsupported.

Example

await _eyedidFlutterPlugin.resetForcedOrientation();

getTrackingEvent

Stream<dynamic> getTrackingEvent()

Returns an event stream that provides camera tracking data.

This stream delivers real-time tracking matrix information from the camera. Subscribe to the stream to process tracking data.

Example

_eyedidFlutterPlugin.getTrackingEvent().listen((event) {
  final info = MetricsInfo(event);
  if (info.gazeInfo.trackingState == TrackingState.success) {
    setState(() {
      _x = info.gazeInfo.gaze.x;
      _y = info.gazeInfo.gaze.y;
      _gazeColor = Colors.green;
    });
  } else {
    setState(() {
      _gazeColor = Colors.red;
    });
  }
});

getCalibrationEvent

Stream<dynamic> getCalibrationEvent()

Returns a stream of calibration events from the Eyedid SDK.

The stream provides updates about the calibration process through CalibrationEvent objects, allowing subscribers to receive real-time calibration data.

Consumers can listen to this stream to track calibration progress or changes.

Example

_eyedidFlutterPlugin.getCalibrationEvent().listen((event) {
  final info = CalibrationInfo(event);
  if (info.type == CalibrationType.nextPoint) {
    setState(() {
      _nextX = info.next!.x;
      _nextY = info.next!.y;
      _calibrationProgress = 0.0;
    });
    Future.delayed(const Duration(milliseconds: 500), () {
      _eyedidFlutterPlugin.startCollectSamples();
    });
  } else if (info.type == CalibrationType.progress) {
    setState(() {
      _calibrationProgress = info.progress!;
    });
  } else if (info.type == CalibrationType.finished) {
    setState(() {
      _isCaliMode = false;
    });
  }
});

getStatusEvent

Stream<dynamic> getStatusEvent()

Returns an event stream that provides status updates for the Eyedid GazeTracker.

This stream notifies when the tracking starts or stops, providing real-time updates on the GazeTracker's status.

Example

_eyedidFlutterPlugin.getStatusEvent().listen((event) {
  final info = StatusInfo(event);
  if (info.type == StatusType.start) {
    setState(() {
      _stateString = "start Tracking";
      _showingGaze = true;
    });
  } else {
    setState(() {
      _stateString = "stop Tracking : ${info.errorType?.name}";
      _showingGaze = false;
    });
  }
});

GazeTrackerOptions

class GazeTrackerOptions {
  final bool useBlink;
  final bool useUserStatus;
  final bool useGazeFilter;
  final int maxConcurrency;
  final CameraPreset preset;

  // Returns a new [GazeTrackerOptionsBuilder] to configure the options.
  static GazeTrackerOptionsBuilder get builder => GazeTrackerOptionsBuilder();
}

Provides options for configuring the Gaze Tracker. Use GazeTrackerOptions.builder to customize options such as blink detection, user status tracking, gaze filtering, concurrency control, and camera presets.

Example

var options = GazeTrackerOptions.builder
    .setUseBlink(true)
    .setMaxConcurrency(2)
    .setPreset(CameraPreset.fhd1920x1080)
    .build();

GazeTrackerOptionsBuilder

class GazeTrackerOptionsBuilder

A builder class for constructing instances of GazeTrackerOptions. The GazeTrackerOptionsBuilder provides a fluent interface for configuring the options used by the gaze tracker. This includes settings such as blink detection, user status tracking, gaze filtering, concurrency limits, and camera presets.

Default parameters:

note
  • _useBlink = false;
  • _useUserStatus = false;
  • _useGazeFilter = true;
  • _maxConcurrency = 4;
  • CameraPreset _preset = CameraPreset.vga640x480;

Summary

Function | Description | Return Type
build | Builds the GazeTrackerOptions instance with the specified settings. | GazeTrackerOptions
setPreset | Sets the camera resolution preset to be used for gaze tracking. | GazeTrackerOptionsBuilder
setMaxConcurrency | Sets the maximum number of concurrent threads allowed for processing. | GazeTrackerOptionsBuilder
setUseBlink | Sets whether to enable blink detection. | GazeTrackerOptionsBuilder
setUseUserStatus | Sets whether to enable user status tracking (e.g., focus, drowsiness). | GazeTrackerOptionsBuilder
setUseGazeFilter | Sets whether to apply a gaze filter to stabilize gaze tracking data. | GazeTrackerOptionsBuilder

build

GazeTrackerOptions build()

Builds and returns a GazeTrackerOptions instance with the configured settings. This method finalizes the configuration and returns an instance of GazeTrackerOptions.

Example

var options = GazeTrackerOptionsBuilder().build();

setPreset

GazeTrackerOptionsBuilder setPreset(CameraPreset preset)

Sets the camera preset for the gaze tracker.

Parameters | Type | Description
preset | CameraPreset | The camera resolution preset to use.

Return Type | Description
GazeTrackerOptionsBuilder | The GazeTrackerOptionsBuilder instance for method chaining.

Example

var options = GazeTrackerOptionsBuilder()
    .setPreset(CameraPreset.fhd1920x1080)
    .build();

setMaxConcurrency

GazeTrackerOptionsBuilder setMaxConcurrency(int maxConcurrency)

Sets the maximum number of concurrent gaze tracking processes.

Parameters | Type | Description
maxConcurrency | int | Specifies the maximum number of threads to be used for gaze tracking.
tip

If maxConcurrency is 0, as many threads as possible are used without limitation; any other value less than 1 defaults to 1.

Return Type | Description
GazeTrackerOptionsBuilder | The GazeTrackerOptionsBuilder instance for method chaining.

Example

var options = GazeTrackerOptionsBuilder()
    .setMaxConcurrency(4)
    .setPreset(CameraPreset.fhd1920x1080)
    .build();

setUseBlink

GazeTrackerOptionsBuilder setUseBlink(bool useBlink)

Sets whether to use the blink detection feature.

Parameters | Type | Description
useBlink | bool | useBlink determines if the blink detection feature should be enabled.

Return Type | Description
GazeTrackerOptionsBuilder | The GazeTrackerOptionsBuilder instance for method chaining.

Example

var options = GazeTrackerOptionsBuilder()
    .setUseBlink(true)
    .setMaxConcurrency(4)
    .setPreset(CameraPreset.fhd1920x1080)
    .build();

setUseUserStatus

GazeTrackerOptionsBuilder setUseUserStatus(bool useUserStatus)

Sets whether to use user status tracking.

Parameters | Type | Description
useUserStatus | bool | useUserStatus determines if user status tracking should be enabled.

Return Type | Description
GazeTrackerOptionsBuilder | The GazeTrackerOptionsBuilder instance for method chaining.

Example

var options = GazeTrackerOptionsBuilder()
    .setUseUserStatus(true)
    .setUseBlink(true)
    .setMaxConcurrency(4)
    .setPreset(CameraPreset.fhd1920x1080)
    .build();

setUseGazeFilter

GazeTrackerOptionsBuilder setUseGazeFilter(bool useGazeFilter)

Sets whether to use gaze filtering for stabilization.

Parameters | Type | Description
useGazeFilter | bool | useGazeFilter determines if gaze filtering should be applied to the tracking data.

Return Type | Description
GazeTrackerOptionsBuilder | The GazeTrackerOptionsBuilder instance for method chaining.

Example

var options = GazeTrackerOptionsBuilder()
    .setUseGazeFilter(false)
    .setUseUserStatus(true)
    .setUseBlink(true)
    .setMaxConcurrency(4)
    .setPreset(CameraPreset.fhd1920x1080)
    .build();

CameraPosition

class CameraPosition {
  final String modelName;
  final double screenWidth;
  final double screenHeight;
  final double screenOriginX;
  final double screenOriginY;
  final bool cameraOnLongerAxis;
}

Represents the camera position for Android devices. This class provides the necessary device information for accurate gaze tracking on Android.

Variables | Type | Description
modelName | String | The name of the device model.
screenWidth | double | The width of the screen.
screenHeight | double | The height of the screen.
screenOriginX | double | The distance between the center of the camera and the top-left corner of the screen along the x-axis (in mm). Positive values represent the right direction.
screenOriginY | double | The distance between the center of the camera and the top-left corner of the screen along the y-axis (in mm). Positive values represent the upward direction.
cameraOnLongerAxis | bool | A bool value indicating whether the camera is located along the longer axis of the device.

Constructor

CameraPosition({
  required this.modelName,
  required this.screenWidth,
  required this.screenHeight,
  required this.screenOriginX,
  required this.screenOriginY,
  this.cameraOnLongerAxis = false,
});

Constructor to initialize the camera position

Parameters | Type | Description
modelName | String | The name of the device model.
screenWidth | double | The width of the screen (in pixels).
screenHeight | double | The height of the screen (in pixels).
screenOriginX | double | The x-axis distance from the camera to the top-left corner of the screen (in mm).
screenOriginY | double | The y-axis distance from the camera to the top-left corner of the screen (in mm).
cameraOnLongerAxis | bool | A bool indicating if the camera is on the longer axis of the device.

Example

final cameraPosition = CameraPosition(
    modelName: "SM-T720",
    screenWidth: 1600,
    screenHeight: 2560,
    screenOriginX: -72,
    screenOriginY: -4); // galaxy tab

InitializedResult

class InitializedResult {
  late bool result;
  late String message;

  static const String isAlreadyAttempting = "Already attempting";
  static const String gazeTrackerAlreadyInitialized =
      "Gaze tracker is already initialized.";
}

Represents the result of the Eyedid GazeTracker initialization. Use this class to obtain the result of the initialization process after calling initGazeTracker.

Variables | Type | Description
result | bool | Whether the initialization was successful.
message | String | Additional information in case of failure. (If initialization is successful, this will be "ERROR_NONE".)
isAlreadyAttempting | String | Initialization is already in progress.
gazeTrackerAlreadyInitialized | String | Gaze tracker is already initialized.

Example

final result = await _eyedidFlutterPlugin.initGazeTracker(
    licenseKey: _licenseKey, options: options);
if (result != null && result.result) {
  _eyedidFlutterPlugin.startTracking();
} else if (result != null &&
    (result.message == InitializedResult.isAlreadyAttempting ||
        result.message == InitializedResult.gazeTrackerAlreadyInitialized)) {
  final isTracking = await _eyedidFlutterPlugin.isTracking() ?? false;
  if (isTracking) {
    showGaze = true;
  }
}

MetricsInfo

class MetricsInfo {
  late final int timestamp;
  late final GazeInfo gazeInfo;
  late final FaceInfo faceInfo;
  late final BlinkInfo blinkInfo;
  late final UserStatusInfo userStatusInfo;
}

MetricsInfo is a data structure designed to collect and organize various information from the tracking event stream.

Variables | Type | Description
timestamp | int | The timestamp of the event.
gazeInfo | GazeInfo | Information about the user's gaze, such as coordinates and tracking state.
faceInfo | FaceInfo | Details regarding the user's face position and orientation.
blinkInfo | BlinkInfo | Blink detection data.
userStatusInfo | UserStatusInfo | Additional status information about the user, such as attention level.
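
As a quick illustration (assuming a tracking stream subscription as in getTrackingEvent, with blink and user status enabled via GazeTrackerOptions), each sub-structure can be unpacked from a single event:

// Unpack one tracking event into its sub-structures.
_eyedidFlutterPlugin.getTrackingEvent().listen((event) {
  final metrics = MetricsInfo(event);
  print("timestamp: ${metrics.timestamp}");
  print("gaze: ${metrics.gazeInfo.gaze}");
  print("face score: ${metrics.faceInfo.score}");
  print("blink: ${metrics.blinkInfo.isBlink}");
  print("attention: ${metrics.userStatusInfo.attentionScore}");
});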

GazeInfo

class GazeInfo {
  late final Point<double> gaze;
  late final Point<double> fixation;
  late final TrackingState trackingState;
  late final EyemovementState eyemovementState;
  late final ScreenState screenState;
}

GazeInfo represents the data collected from gaze tracking events, providing detailed information about the user's gaze and related states. GazeInfo is essential for interpreting the user's gaze behavior and determining their interaction with the screen in real-time.

Variables | Type | Description
gaze | Point<double> | The current gaze point on the screen, represented as coordinates (x, y).
fixation | Point<double> | The most recently detected fixation point, providing stable coordinates where the user's gaze was fixated.
trackingState | TrackingState | The state of the gaze tracking, indicating whether tracking succeeded or failed.
eyemovementState | EyemovementState | Information about the user's eye movement, such as fixation or saccade.
screenState | ScreenState | Indicates whether the user's gaze is within the device screen boundaries or outside of it.
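
For example, a small handler (the name is illustrative, not part of the API) might skip samples that are off-screen or not tracked, and prefer the fixation point while the eye is fixating:

// React to gaze state: use the stable fixation point during fixations and
// skip samples that fall outside the screen or were not tracked.
void handleGaze(GazeInfo gazeInfo) {
  if (gazeInfo.trackingState != TrackingState.success) return;
  if (gazeInfo.screenState == ScreenState.outsideOfScreen) return;

  if (gazeInfo.eyemovementState == EyemovementState.fixation) {
    print("fixating at ${gazeInfo.fixation}");
  } else {
    print("gaze at ${gazeInfo.gaze}");
  }
}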

FaceInfo

class FaceInfo {
  late final double score;
  late final Rect rect;
  late final Size imageSize;
  late final double pitch;
  late final double yaw;
  late final double roll;
  late final double centerX;
  late final double centerY;
  late final double centerZ;
}

FaceInfo represents the data collected from face tracking events, providing information about the detected face and its orientation in space. FaceInfo is useful for understanding the face orientation and its position relative to the camera, aiding in gaze tracking and other facial interactions.

Variables | Type | Description
score | double | A confidence score (0.0 to 1.0) indicating the accuracy of the face detection.
rect | Rect | The bounding rectangle of the detected face, represented as a Rect object with coordinates relative to imageSize.
imageSize | Size | The size of the image used by the camera during face detection, represented as a Size object.
pitch | double | The pitch angle of the face, representing the up and down tilt.
yaw | double | The yaw angle of the face, representing the left and right rotation.
roll | double | The roll angle of the face, representing the tilt to the left or right side.
centerX | double | The distance of the face's center from the camera along the X-axis (in mm).
centerY | double | The distance of the face's center from the camera along the Y-axis (in mm).
centerZ | double | The distance of the face's center from the camera along the Z-axis (in mm).
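
A brief sketch of reading head pose and camera distance from these fields (the handler name and the 0.8 confidence threshold are illustrative):

// Log head pose and approximate distance from the camera.
void handleFace(FaceInfo faceInfo) {
  if (faceInfo.score < 0.8) return; // skip low-confidence detections
  print("pitch: ${faceInfo.pitch}, yaw: ${faceInfo.yaw}, roll: ${faceInfo.roll}");
  print("distance from camera: ${faceInfo.centerZ} mm");
}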

BlinkInfo

class BlinkInfo {
  late final bool isBlink;
  late final bool isBlinkLeft;
  late final bool isBlinkRight;
  late final double leftOpenness;
  late final double rightOpenness;
}

BlinkInfo represents the data related to eye blink detection, providing information about whether a blink occurred and the openness level of each eye. BlinkInfo is used to monitor and interpret the blinking state of the user's eyes, aiding in detecting attention levels and gaze-based interactions.

Variables | Type | Description
isBlink | bool | A boolean value indicating whether a blink (both eyes) is detected.
isBlinkLeft | bool | A boolean value indicating whether the left eye is detected as blinking.
isBlinkRight | bool | A boolean value indicating whether the right eye is detected as blinking.
leftOpenness | double | A double value (0.0 to 1.0) representing the openness level of the left eye, where 0.0 means fully closed and 1.0 means fully open.
rightOpenness | double | A double value (0.0 to 1.0) representing the openness level of the right eye, where 0.0 means fully closed and 1.0 means fully open.
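
A possible blink counter built on these fields is sketched below (blink detection is enabled via setUseBlink(true) when building GazeTrackerOptions; the handler and state variables are illustrative):

// Count blink onsets by detecting the transition into the blinking state.
bool _wasBlinking = false;
int _blinkCount = 0;

void handleBlink(BlinkInfo blinkInfo) {
  if (blinkInfo.isBlink && !_wasBlinking) {
    _blinkCount++;
    print("blink #$_blinkCount "
        "(left: ${blinkInfo.leftOpenness}, right: ${blinkInfo.rightOpenness})");
  }
  _wasBlinking = blinkInfo.isBlink;
}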

UserStatusInfo

class UserStatusInfo {
  late final bool isDrowsy;
  late final double drowsinessIntensity;
  late final double attentionScore;
}

UserStatusInfo provides information about the user's status, such as drowsiness and attention level, based on gaze and eye tracking data. UserStatusInfo is useful for monitoring the user's state in real-time, helping to detect signs of drowsiness and measure attention levels during interactions.

Variables | Type | Description
isDrowsy | bool | A boolean value indicating whether the user is detected as drowsy.
drowsinessIntensity | double | A double value (0.0 to 1.0) representing the intensity of drowsiness, where 0.0 indicates no drowsiness and 1.0 indicates extreme drowsiness.
attentionScore | double | A double value (0.0 to 1.0) representing the user's attention level, where 0.0 means no attention and 1.0 means full attention.
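
For illustration, a simple handler that surfaces drowsiness and low attention (user status tracking is enabled via setUseUserStatus(true); the thresholds below are arbitrary examples, not SDK recommendations):

// Warn on drowsiness and low attention using illustrative thresholds.
void handleUserStatus(UserStatusInfo status) {
  if (status.isDrowsy && status.drowsinessIntensity > 0.5) {
    print("User appears drowsy (intensity: ${status.drowsinessIntensity})");
  }
  if (status.attentionScore < 0.3) {
    print("Low attention: ${status.attentionScore}");
  }
}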

StatusInfo

class StatusInfo {
  late final StatusType type;
  late final StatusErrorType? errorType;
}

A class that represents the status information for Eyedid SDK's GazeTracker status event. It contains the status type (e.g., start, stop, unknown) and an optional error type if any error occurs when stopping.

Variables | Type | Description
type | StatusType | The type of status event, such as 'start', 'stop', or 'unknown'.
errorType | StatusErrorType? | The type of error that occurred when the status is 'stop'. Can be null if no error occurs or if the status is not 'stop'.

Example

_eyedidFlutterPlugin.getStatusEvent().listen((event) {
  final info = StatusInfo(event);
  if (info.type == StatusType.start) {
    setState(() {
      _stateString = "start Tracking";
      _showingGaze = true;
    });
  } else {
    setState(() {
      _stateString = "stop Tracking : ${info.errorType?.name}";
      _showingGaze = false;
    });
  }
});

CalibrationInfo

class CalibrationInfo {
  late final CalibrationType type;
  late final double? progress;
  late final Point<double>? next;
  late final List<double>? data;
}

A class that represents calibration status information for the Eyedid SDK. It contains the type of calibration event, progress percentage, the next calibration point, and calibration data if available.

Variables | Type | Description
type | CalibrationType | The type of calibration event: progress, nextPoint, finished, or unknown.
progress | double? | The progress of the calibration process (0.0 to 1.0). Can be null if not applicable.
next | Point<double>? | The next point for calibration. This is used when the event indicates the next calibration point. Can be null if not applicable.
data | List<double>? | Calibration data collected when the calibration is finished. Can be null if not applicable.

Example

_eyedidFlutterPlugin.getCalibrationEvent().listen((event) {
  final info = CalibrationInfo(event);
  if (info.type == CalibrationType.nextPoint) {
    setState(() {
      _nextX = info.next!.x;
      _nextY = info.next!.y;
      _calibrationProgress = 0.0;
    });
    Future.delayed(const Duration(milliseconds: 500), () {
      _eyedidFlutterPlugin.startCollectSamples();
    });
  } else if (info.type == CalibrationType.progress) {
    setState(() {
      _calibrationProgress = info.progress!;
    });
  } else if (info.type == CalibrationType.finished) {
    setState(() {
      _isCaliMode = false;
    });
  }
});

OneEuroFilterManager

class OneEuroFilterManager {
  static const double _defaultFrequency = 30.0;
  static const double _defaultMinCutOFF = 1.0;
  static const double _defaultBeta = 0.007;
  static const double _defaultDCutOFF = 1.0;
}

Manages a collection of OneEuroFilters for filtering multiple values.

Summary

Function | Description
Constructor | OneEuroFilterManager constructor.
filterValues | Performs filtering for the given timestamp and a list of values.
getFilteredValues | Retrieves the filtered values obtained after applying the filter.

Constructor

OneEuroFilterManager(
    {required int count,
    freq = _defaultFrequency,
    minCutOff = _defaultMinCutOFF,
    beta = _defaultBeta,
    dCutOff = _defaultDCutOFF})

Use this constructor to initialize the filter and specify the number of values to filter and the filter's settings.

Parameters | Type | Description
count | int | The number of values to filter. For example, if set to 2, it filters two input values separately.
freq | double | Represents the sampling frequency of the data. The default value is 30.0.
minCutOff | double | Minimum cutoff frequency used to filter noise in the data. The default value is 1.0.
beta | double | Indicates the strength of applying the derivative filter. The default value is 0.007.
dCutOff | double | Represents the cutoff frequency for the derivative data. The default value is 1.0.

Example

// Uses default values for freq, minCutOff, beta, and dCutOff.
OneEuroFilterManager _filterManager = OneEuroFilterManager(count: 2);

filterValues

bool filterValues(int timestamp, List<double> val)

Performs filtering for the given timestamp and a list of values.

Use this method to perform filtering on the given data and obtain the filtered results. If filtering fails or if there are invalid values, the filter is reset, and false is returned.

Parameters | Type | Description
timestamp | int | Represents the timestamp of the data and is used for filtering.
val | List<double> | A list of values to be filtered, representing the values for which you want to obtain filtered results.

Return Type | Description
bool | If a value is invalid or anomalous, the filter is reset and false is returned. If the filtering is successful, it returns true.

Example

final metricsInfo = MetricsInfo(event);
this.gazeInfo = metricsInfo.gazeInfo;
if (_useFilter) {
  bool isFiltered = _filterManager
      .filterValues(metricsInfo.timestamp, [gazeInfo.gaze.x, gazeInfo.gaze.y]);
  if (isFiltered) {
    print("filtered success!");
  }
}

getFilteredValues

List<double> getFilteredValues()

Retrieves the filtered values obtained after applying the filter.

This function returns a list of filtered double values that have been processed using the filter. It allows you to access the filtered results for further use.

Return Type | Description
List<double> | A list of filtered double values.

Example

bool isFiltered = _filterManager
    .filterValues(metricsInfo.timestamp, [gazeInfo.gaze.x, gazeInfo.gaze.y]);
if (isFiltered) {
  List<double> filtered = _filterManager.getFilteredValues();
  print("filtered (x, y) : (${filtered[0]}, ${filtered[1]})");
}
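
Putting the two calls together, a hedged end-to-end sketch that smooths raw gaze coordinates from the tracking stream (the wrapper function name is illustrative; _eyedidFlutterPlugin is an EyedidFlutter instance as in the earlier examples):

// Smooth raw gaze coordinates with a OneEuroFilterManager before use.
void listenToSmoothedGaze() {
  final filterManager = OneEuroFilterManager(count: 2);
  _eyedidFlutterPlugin.getTrackingEvent().listen((event) {
    final metrics = MetricsInfo(event);
    final gaze = metrics.gazeInfo.gaze;
    if (filterManager.filterValues(metrics.timestamp, [gaze.x, gaze.y])) {
      final filtered = filterManager.getFilteredValues();
      print("smoothed gaze: (${filtered[0]}, ${filtered[1]})");
    }
  });
}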

CalibrationMode

enum CalibrationMode { one, five }

Defines the calibration modes for the Eyedid SDK.

  1. one: Calibration with a single target point.
  2. five: Calibration with five target points for greater accuracy.

CalibrationCriteria

enum CalibrationCriteria { low, standard, high }

Represents different calibration criteria levels.

  1. low: Low accuracy, faster calibration.
  2. standard: Default level for general use.
  3. high: High accuracy for precise calibration.

TrackingState

enum TrackingState { success, gazeNotFound, faceMissing }

TrackingState represents the different states that the tracking system can be in during gaze and face tracking.

  1. success: The tracking is successful, and the gaze is detected correctly.
  2. gazeNotFound: The tracking system is active, but the gaze could not be detected.
  3. faceMissing: The tracking system cannot detect the face, making gaze tracking impossible.

EyemovementState

enum EyemovementState { fixation, saccade, unknown }

EyemovementState represents the different states of eye movement detected by the tracking system.

  1. fixation: The eye is fixated on a specific point, indicating stable gaze.
  2. saccade: A rapid movement of the eye between fixation points, indicating quick gaze shifts.
  3. unknown: The state of the eye movement is undetermined or not recognized.

ScreenState

enum ScreenState { insideOfScreen, outsideOfScreen, unknown }

ScreenState represents the different states of the user's gaze in relation to the screen boundaries.

  1. insideOfScreen: The user's gaze is detected within the screen boundaries.
  2. outsideOfScreen: The user's gaze is detected outside of the screen boundaries.
  3. unknown: The state of the user's gaze in relation to the screen cannot be determined.

StatusType

enum StatusType { start, stop, unknown }

Enum representing the different types of status events.

  1. start: Gaze tracking has started.
  2. stop: Gaze tracking has stopped; an error type may accompany this event.
  3. unknown: An unknown status event.

StatusErrorType

enum StatusErrorType {
  none,
  cameraStart,
  cameraInterrupt,
  unknown,
}

Enum representing the different types of errors that can occur when the status is 'stop'.

  1. none: No error occurred.
  2. cameraStart: Error when starting the camera.
  3. cameraInterrupt: Error due to camera interruption.
  4. unknown: An unknown error occurred.

CalibrationType

enum CalibrationType {
  progress,
  nextPoint,
  finished,
  unknown
}

Enum representing the different types of calibration events.

  1. progress: Calibration is in progress.
  2. nextPoint: Next calibration point is available.
  3. finished: Calibration has finished and data is available.
  4. unknown: Unknown calibration type.

EyedidDeviceOrientation

enum EyedidDeviceOrientation {
  portrait,
  portraitUpsideDown,
  landscapeLeft,
  landscapeRight,
}

Represents different gaze event orientations for tracking in the Eyedid SDK on iOS. Used to adjust the direction of gaze coordinates based on the device orientation.

  1. portrait: Device is in portrait mode, Home button at the bottom.
  2. portraitUpsideDown: Device is in portrait mode, Home button at the top.
  3. landscapeLeft: Device is in landscape mode, Home button on the left.
  4. landscapeRight: Device is in landscape mode, Home button on the right.
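
As a sketch of how these values are typically used with setForcedOrientation and resetForcedOrientation (iOS only; the function names and the choice of landscapeRight are illustrative):

// Force landscape gaze coordinates while a landscape-only screen is shown,
// then return to automatic orientation handling.
Future<void> enterLandscapeScreen() async {
  await _eyedidFlutterPlugin.setForcedOrientation(EyedidDeviceOrientation.landscapeRight);
}

Future<void> leaveLandscapeScreen() async {
  await _eyedidFlutterPlugin.resetForcedOrientation();
}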