This repository contains tests of different methods of interacting with a computer via head/gaze tracking. It interfaces with the xLabs API to provide webcam-based head/eye tracking.
Methods implemented are:

- Head-based dwell-time selection
- Head-based gestures
- Head-based smooth pursuit selection
- The xLabs ants game reimplemented with head tracking
- Gaze-based dwell-time selection
- Gaze-based gestures
- Gaze-based smooth pursuit selection
- And any other ideas!
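Dwell-time selection fires a click when the tracked point lingers near one spot for long enough. The sketch below is a minimal, tracker-agnostic illustration of that idea (it is not the xLabs API; the radius and dwell-duration values are assumptions, and the incoming `(x, y)` samples would come from whatever head or gaze tracker is in use):

```python
import math
import time

class DwellSelector:
    """Trigger a selection when the tracked point stays within a small
    radius for a fixed duration. Works the same for head- or gaze-derived
    screen coordinates; thresholds here are illustrative guesses."""

    def __init__(self, radius=40.0, dwell_time=1.0):
        self.radius = radius          # pixels of drift allowed while dwelling
        self.dwell_time = dwell_time  # seconds required to trigger
        self.anchor = None            # (x, y) where the current dwell started
        self.start = None             # timestamp of the dwell start

    def update(self, x, y, now=None):
        """Feed one tracked sample; returns the (x, y) where a completed
        dwell started, or None if no selection fired on this sample."""
        now = time.monotonic() if now is None else now
        if self.anchor is None or math.dist((x, y), self.anchor) > self.radius:
            # Moved too far from the anchor: restart the dwell timer here.
            self.anchor = (x, y)
            self.start = now
            return None
        if now - self.start >= self.dwell_time:
            fired = self.anchor
            self.anchor = None  # reset so one dwell fires only once
            return fired
        return None
```

In practice the selector would be fed every frame from the tracker, and a fired dwell would be translated into a click at the anchor position.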
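Smooth pursuit selection instead shows several moving targets and picks the one whose trajectory best matches the tracked point over a short window. One common way to score the match is trajectory correlation; a minimal sketch of that approach (again generic, not the xLabs API, with the target layout and scoring as assumptions) could look like:

```python
def pearson(a, b):
    """Pearson correlation of two equal-length sequences (0.0 if either
    sequence is constant, to avoid division by zero)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb) if sa and sb else 0.0

def pursuit_match(gaze, targets):
    """gaze: list of (x, y) tracker samples; targets: dict mapping a target
    name to its (x, y) positions over the same frames. Returns the name of
    the target whose motion correlates best with the gaze, averaging the
    x- and y-axis correlations."""
    gx, gy = zip(*gaze)
    def score(path):
        tx, ty = zip(*path)
        return (pearson(gx, tx) + pearson(gy, ty)) / 2
    return max(targets, key=lambda name: score(targets[name]))
```

Because only relative motion matters, this scheme tolerates the calibration drift that absolute dwell-based pointing suffers from, which is part of why smooth pursuit is attractive for webcam tracking.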
Requires the xLabs API!
The code was originally written in late 2018, so it may be out of date with current APIs.