Eye tracking, or gaze tracking, is a technology that calculates the eye gaze point of a user as he or she looks around.

A device equipped with an eye tracker enables users to employ their eye gaze as an input modality that can be combined with other input devices such as mouse, keyboard, touch and gestures; these are referred to as active applications. Furthermore, eye gaze data collected with an eye tracker can be employed to improve the design of, for example, a website or a magazine cover; these are referred to as passive applications and are described more thoroughly later on. Applications that can benefit from eye tracking include games, OS navigation, e-books, market research studies, and usability testing.

The Eye Tribe Tracker is an eye tracking system that can calculate the location where a person is looking by means of information extracted from the person’s face and eyes. The eye gaze coordinates are calculated with respect to the screen the person is looking at, and are represented by a pair of (x, y) coordinates given in the screen coordinate system. A typical scenario is represented in Figure 1.

Figure 1. User in front of an eye tracker.

In order to track the user’s eye movements and calculate the on-screen gaze coordinates, the Tracker must be placed below the screen, pointing toward the user. Please check the Getting Started guide for more information about setting up the hardware.

The user needs to be located within the Tracker’s trackbox. The trackbox is defined as the volume in space where the user can theoretically be tracked by the system. The size of the trackbox depends on the frame rate, with a higher frame rate offering a smaller trackbox. The Eye Tribe SDK includes a Trackbox sample that illustrates how to show users their location relative to the Tracker so they can adjust their position accordingly.
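The kind of positioning feedback the Trackbox sample gives can be sketched as a simple range check on the user's eye position, normalized to the trackbox volume. The 0.2–0.8 comfort band, the axis orientations and the hint wording below are illustrative assumptions, not values from the SDK:

```python
def trackbox_feedback(x: float, y: float, z: float) -> list[str]:
    """Given the user's eye position normalized to the trackbox volume
    (0.0-1.0 on each axis, 0.5 = center), suggest how the user should
    move to return toward the center. Thresholds and axis directions
    are illustrative, not taken from the Eye Tribe SDK."""
    hints = []
    if x < 0.2:
        hints.append("move right")
    elif x > 0.8:
        hints.append("move left")
    if y < 0.2:
        hints.append("move down")
    elif y > 0.8:
        hints.append("move up")
    if z < 0.2:
        hints.append("move back")
    elif z > 0.8:
        hints.append("move closer")
    return hints or ["position OK"]
```

A real sample would draw this feedback graphically, but the underlying logic is the same: keep the user near the center of the trackable volume.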

When the system is calibrated (see Calibration below), the eye tracking software calculates the user's eye gaze coordinates with an average accuracy of around 0.5° to 1° of visual angle. Assuming the user sits approximately 60 cm away from the screen/tracker, this accuracy corresponds to an on-screen average error of 0.5 to 1 cm.

Prior to using an eye tracker, the user needs to undergo a personal calibration process. This is because each person has different eye characteristics, and the eye tracking software needs to model these in order to estimate gaze accurately.

A typical user calibration process takes approximately 20 seconds to complete and consists of a circular target displayed at different locations on the screen, on a blank background, for around 2 seconds each. The user needs to look at the target while it is displayed. Once all the calibration targets have been shown, the calibration process is complete and the system starts providing (x, y) coordinates of the user's gaze point through the API.

Once the calibration process is completed successfully, the Tracker should not be moved. If the Tracker is placed in a different location, the user will need to re-calibrate in order for the system to update the calibration parameters to match the new location of the Tracker.

Figure 2 shows the recommended calibration pattern used to calibrate The Eye Tribe Tracker. A minimum of 9 calibration locations covering most of the screen is recommended. Using more locations (e.g. 12 or 16) will improve the accuracy of the gaze coordinates computed by the system.

Figure 2. The recommended 9-point calibration pattern.
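A calibration pattern like the one in Figure 2 can be generated as an evenly spaced grid covering most of the screen. The sketch below is an assumption about how such a pattern might be built (the 10% margin and the choice to randomize presentation order, which helps avoid anticipatory eye movements, are illustrative, not requirements of the SDK):

```python
import random

def calibration_points(cols: int, rows: int, width: int, height: int,
                       margin: float = 0.1) -> list[tuple[int, int]]:
    """Generate a cols x rows grid of calibration targets covering most
    of the screen, shuffled into a random presentation order. A 3x3
    grid yields the recommended 9-point pattern; 4x3 or 4x4 give the
    12- or 16-point patterns that improve accuracy further."""
    xs = [int(width * (margin + (1 - 2 * margin) * c / (cols - 1)))
          for c in range(cols)]
    ys = [int(height * (margin + (1 - 2 * margin) * r / (rows - 1)))
          for r in range(rows)]
    points = [(x, y) for y in ys for x in xs]
    random.shuffle(points)
    return points

# calibration_points(3, 3, 1920, 1080) -> 9 targets for a full-HD screen
```

Each target would then be shown for around 2 seconds, as described above, while the system collects eye data for that location.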

Eye tracking applications are divided into two categories: active and passive.


An eye tracker enables users to use their eye movements as an input modality to control a device, an application, a game, etc. The user’s eye gaze point can be combined with other input modalities like buttons, keyboards, mouse or touch, in order to create a more natural and engaging interaction.

Some examples of eye-controlled applications are provided here:

  • A web browser or PDF reader that scrolls automatically as the user reads toward the bottom of the page.
  • A maps application that pans when the user looks at the edges of the map. The map also zooms in and out centered on where the user is looking.
  • A user interface on which icons can be activated by looking at them.
  • When multiple windows are open, the window the user is looking at keeps the focus.
  • A first person shooter game where the user aims with the eyes and shoots with the mouse button.
  • An adventure game where characters react to the player looking at them. For instance, if the player looks at a given character, this character will start talking to the player.
  • An on-screen keyboard designed to enable people with severe motor disabilities to write text, send emails, participate in online chats, etc.
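Several of the examples above, such as activating an icon by looking at it, are commonly built on a dwell-time filter: the action fires only after the gaze has rested on the target for long enough. A minimal sketch, in which the 500 ms threshold and the region geometry are illustrative values rather than anything prescribed by the SDK:

```python
class DwellButton:
    """Activate a rectangular screen region when the gaze dwells on it
    long enough. The 500 ms default threshold is illustrative; real
    applications tune it per task."""

    def __init__(self, x: int, y: int, w: int, h: int, dwell_ms: float = 500):
        self.rect = (x, y, x + w, y + h)
        self.dwell_ms = dwell_ms
        self.accumulated = 0.0

    def contains(self, gx: float, gy: float) -> bool:
        x0, y0, x1, y1 = self.rect
        return x0 <= gx < x1 and y0 <= gy < y1

    def update(self, gx: float, gy: float, frame_ms: float) -> bool:
        """Feed one gaze sample; returns True when the button fires."""
        if self.contains(gx, gy):
            self.accumulated += frame_ms
            if self.accumulated >= self.dwell_ms:
                self.accumulated = 0.0  # fire once, then re-arm
                return True
        else:
            self.accumulated = 0.0  # gaze left the region: reset
        return False
```

The dwell threshold is the key design trade-off in gaze-only interfaces: too short and users trigger actions unintentionally (the "Midas touch" problem), too long and the interface feels sluggish.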


Eye tracking makes it possible to observe and evaluate human attention objectively and non-intrusively, enabling you to increase the impact of your visual designs and communication.

The Eye Tribe Tracker can be employed to collect eye gaze data while the user is presented with different stimuli, e.g. a website, a user interface, a commercial or a magazine cover. The data collected can then be analyzed to improve the design and hence elicit a better response from customers.

Eye movements can be classified into fixations and saccades: fixations occur when the gaze rests on a given point, while saccades are the rapid eye movements between fixations. By combining fixation and saccade information from different users, it is possible to create a heatmap of the regions of the stimulus that attracted the most interest from participants. Below is an example of a heatmap of a printed ad.
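The split into fixations and saccades can be sketched with a simple dispersion-based filter: consecutive gaze samples whose spread stays under a threshold form one fixation, and the jumps between them are the saccades. The pixel and duration thresholds below are illustrative, not values used by the Eye Tribe software:

```python
def classify_fixations(samples, max_dispersion=25.0, min_samples=6):
    """Naive dispersion-based fixation detection: group consecutive
    (x, y) gaze samples into a fixation while their combined x + y
    spread stays under max_dispersion pixels. Returns a list of
    (centroid_x, centroid_y, sample_count) fixations; runs shorter
    than min_samples are discarded as noise or saccades."""
    fixations, window = [], []

    def centroid(pts):
        cx = sum(p[0] for p in pts) / len(pts)
        cy = sum(p[1] for p in pts) / len(pts)
        return cx, cy, len(pts)

    for x, y in samples:
        window.append((x, y))
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            # Dispersion exceeded: close the previous fixation (if long
            # enough) and start a new window at the current sample.
            if len(window) - 1 >= min_samples:
                fixations.append(centroid(window[:-1]))
            window = [window[-1]]
    if len(window) >= min_samples:
        fixations.append(centroid(window))
    return fixations
```

Summing fixation durations per screen region across users is essentially how attention heatmaps like the one shown here are built.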