GazeAppraise team nominated for Federal Laboratory Consortium Notable Technology Development Award


Experienced professionals, whether in security or medicine, are skilled at recognizing patterns and spotting irregularities as they scan images, but they can be overwhelmed by the volume of data, especially in rapidly changing and high-stress environments. New products have been developed to assist these workers, but it is unclear whether those products actually help them scan images more effectively.

Eye tracking data can help researchers and practitioners design new image products in ways that demonstrably support effective signature detection, while making visual scanning workflows more comfortable and efficient for the people doing them.

GazeAppraise software is an eye tracking analysis solution that offers a new way to understand human performance as it relates to analysis of dynamic soft-copy images on computer screens. Current eye tracking analysis tools support only limited comparisons of gaze patterns across a group of individuals viewing soft-copy images. Existing tools allow comparison of either spatial or temporal (time sequence) characteristics of gaze patterns independently, and because of this unidimensional approach, rich comparisons of whole scan paths are not possible. For example, existing analysis packages cannot automatically identify distinct scanning strategies such as spiral gaze patterns or rastering the eyes across an image from left to right and top to bottom.

GazeAppraise takes into account both the spatial and the temporal information contained in scanpaths and automatically identifies common scanning strategies across a group of individuals. With GazeAppraise, researchers, marketers, and anyone else who designs human information interaction systems will be able to understand how people move their attention across information displays and how to optimize those systems to increase human performance.
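To give a flavor of what automatic scanning-strategy identification can look like, here is a minimal, hypothetical sketch (not Sandia's algorithm) that separates a spiral scanpath from a left-to-right raster by the sign-consistency of its turn angles. The function names and the 0.1-radian and 90% thresholds are illustrative assumptions:

```python
import math

def turn_angles(path):
    """Signed turn angle (radians) between successive gaze-path segments."""
    angles = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(path, path[1:], path[2:]):
        a = math.atan2(y1 - y0, x1 - x0)
        b = math.atan2(y2 - y1, x2 - x1)
        # wrap the difference into [-pi, pi)
        angles.append((b - a + math.pi) % (2 * math.pi) - math.pi)
    return angles

def classify_strategy(path):
    """Crude heuristic: a spiral turns consistently in one direction,
    while a raster alternates straight runs with reversing corners."""
    turns = [d for d in turn_angles(path) if abs(d) > 0.1]  # ignore straight runs
    if not turns:
        return "raster"
    positive = sum(1 for d in turns if d > 0) / len(turns)
    return "spiral" if positive > 0.9 or positive < 0.1 else "raster"
```

The point of the toy heuristic is that spatial layout and temporal order must be considered together: the same set of fixation locations could belong to either strategy, and only the ordered path distinguishes them.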

Insights discovered through the use of GazeAppraise will be incorporated into next-generation analysis hardware and software systems. These systems will be used to improve human performance in dynamic image analysis applied to medical diagnostics, airport security, nuclear nonproliferation, and other areas where people work with soft-copy images.

Description of technology

Using a unique algorithm, GazeAppraise offers a new way to understand human performance as it relates to analysis of dynamic soft-copy images. Researchers have excellent tools for analyzing human gaze patterns as people interact with static stimuli, such as a line of text or a natural scene. However, the visual cognition research community has lacked algorithms and software for modeling how the eyes dynamically recalibrate their trajectory when tracking a moving target across a scene.

The GazeAppraise algorithm excels at characterizing these "smooth pursuit" eye movements, and in doing so it has opened up entirely new ways of evaluating search strategies in a wide range of problem areas and across a wide range of gaze datasets. Most eye tracking datasets are analyzed using standard statistical models that reduce the raw data to a small number of points, called fixations, where a person's eyes came to rest on a specific part of an image. In contrast, GazeAppraise starts with the scanpath that represents the movement of human gaze over an entire image or target. It interprets collections of eye movements as whole shapes, with geometric features (such as length, angularity, and aspect ratio) that can be used to differentiate and classify variations in eye movement patterns among individuals and across different image sets. GazeAppraise can categorize scanpaths from raw eye tracking data, even when those data include samples collected with variations in calibration precision, tracking consistency, and viewer performance. Evaluating collections of eye movements as a holistic pattern, rather than as a set of discrete spatiotemporal events, provides insight into the strategies of experts as they gain information from dynamic imagery display systems, not only with moving targets but in still scenes as well.
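The whole-shape features named above can be sketched in a few lines. The formulas below are an illustrative reading of the description (path length, bounding-box aspect ratio, and mean turn angle as "angularity"), not GazeAppraise's actual definitions:

```python
import math

def scanpath_features(path):
    """Whole-shape descriptors for a scanpath given as (x, y) gaze points.
    The feature names come from the article; these formulas are
    illustrative guesses, not GazeAppraise's implementation."""
    xs = [p[0] for p in path]
    ys = [p[1] for p in path]
    # total distance the gaze travels along the path
    length = sum(math.dist(p, q) for p, q in zip(path, path[1:]))
    # aspect ratio of the path's bounding box (width / height)
    width, height = max(xs) - min(xs), max(ys) - min(ys)
    aspect_ratio = width / height if height else float("inf")
    # angularity: mean absolute turn angle between successive segments
    turns = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(path, path[1:], path[2:]):
        a = math.atan2(y1 - y0, x1 - x0)
        b = math.atan2(y2 - y1, x2 - x1)
        turns.append(abs((b - a + math.pi) % (2 * math.pi) - math.pi))
    angularity = sum(turns) / len(turns) if turns else 0.0
    return {"length": length, "aspect_ratio": aspect_ratio, "angularity": angularity}
```

Feature vectors of this kind can then be compared or clustered across viewers, which is what makes whole-scanpath classification robust to the calibration and tracking variations mentioned above: a shape descriptor changes little when individual gaze samples are slightly displaced.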

Partnerships formed

The partnerships the Sandia team has developed create interaction between behavioral scientists, computer scientists, and specialists in other fields, enriching eye tracking research and applications.

Partnerships formed under Sandia's Academic Alliance program with two academic partners, Georgia Tech (GT) and the University of Illinois at Urbana-Champaign (UIUC), help Sandia fine-tune the quality of the technology and algorithms in GazeAppraise. The work Sandia and the university partners are doing will be an important part of eye movement research. GT is helping develop a value framework for evaluating information displays. UIUC, with an interdisciplinary team, is contributing work on image-driven visual saliency and goal-directed visual processing.

A cooperative research and development agreement (CRADA) with EyeTracking, Inc., a company specializing in this field, gives the Sandia team access to a wide array of eye tracking systems and a pathway to commercial applications through planned technology transfer.

The GazeAppraise team is Mike Haass, Laura McNamara, Mark Danny Rintoul, Andy Wilson, and Laura Matzen.


Michael Joseph Haass

September 1, 2016

News story url: