To show you what we can do, here are some videos of the xLabs eye/gaze tracking system in action. In the future, we’ll use this page to showcase third-party systems that make use of our technology. Of course, you can also download and test the system for yourself!
- Hands-free web video: Control playback with head gestures. Look away to pause; resume when you look back. Use gestures to rewind or fast-forward. Ideal for cooking from a recipe, doing some woodwork, or learning other practical skills. (YouTube video)
- Web-based 3D racing game: Steer the craft with head motion capture via our SDK. (YouTube video)
- Apple-Shooter: Use your head to shoot apples as they grow. (YouTube video)
- xLabs promotional video: About us, the technology, and potential applications.
- xLabs calibration guide: A video showing how to use our calibration interface.
- xLabs pose-invariant calibration tips: An experiment showing how our machine-learning calibration engine can be trained to generalize to varying head poses.
- Spotlight demo: Watch a user guide a spotlight around the screen with his eyes. Note that head movement is unrestricted. Near the end, he is able to look away and then resume tracking. These capabilities have not previously been possible using only a single webcam.
Feature & Functionality Demonstrations
The videos below demonstrate some of the features and capabilities of our eye/gaze tracking system.
The second video is a screencast of typical gaze tracking results: the user looks around the corners of the screen, then reads some lines of text.
Third-Party Showcase Apps
Contact us to include a video of your gaze-enabled website here!