PANTHER research rethinks finding patterns in motion, makes sensor images searchable
After a disaster or national tragedy, investigators often discover, buried among vast amounts of available data, bits of information that might have mitigated or even prevented what happened, had they been recognized ahead of time.
In this information age, national security analysts often find themselves searching for a needle in a haystack. The available data is growing much faster than analysts’ ability to observe and process it. Sometimes they are unable to make key connections, and often they are overwhelmed as they struggle to use data for predictions and forensics.
A three-year project at Sandia called Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) has made progress on this problem. It’s developing ways to help analysts work smarter, faster, and more effectively when sifting through huge, complex data sets in real-time, stressful environments where the consequences could be life or death.
PANTHER’s 26-member team has accomplished a number of breakthroughs in rethinking how to compare motion and trajectories; developing software to represent remote sensor images, couple them with additional information and present them in a searchable form; and conducting fundamental research on visual cognition, says Kristina Czuchlewski, PANTHER’s principal investigator and manager of Sandia’s Intelligence Surveillance and Reconnaissance Systems Engineering and Decision Support Dept. 5346.
The PANTHER team looked at raw data and ways to pre-process and analyze it to make it searchable and more meaningful. The project also conducted fundamental research in cognitive science to inform the design of software and tools to help those viewing the data and make information of interest or trends easier to uncover.
PANTHER led to a strong partnership between the Labs’ Science & Technology and Defense Systems & Assessments organizations, “creating seminal technical achievements that are having direct impacts on Sandia’s ability to provide solutions for some of our nation’s most challenging national security intelligence problems,” says senior manager Steve Castillo (5340).
PANTHER, which is funded by Sandia’s Laboratory Directed Research & Development program, is providing deeper insights from complex data sets in minutes, as compared to months, and covering hundreds, as opposed to dozens, of miles.
“PANTHER developed the foundation for transforming how massive, complex data sets can be quickly analyzed to provide the nation’s decision-makers with new perspectives on situations and circumstances,” says Anthony Medina, director of Radio Frequency & Electronic Systems Center 5300. “If analysts are collecting information on a specific location over time and learn that something of interest might be occurring there, they probably don’t have the tools they need to quickly gather and analyze information from all relevant data sets that might corroborate the forecast. But PANTHER is probably the nation’s best bet right now to get to that point quickly.”
Tracktable code automates observation of motion, trajectories
Danny Rintoul (1462) developed the Tracktable code along with Andy Wilson (1461) and others to automate the observation of motion and trajectories. The code could be applied to any problem that examines movement, such as airliners, ships, or people.
Current approaches to getting meaningful information from trajectories focus on comparing one trajectory to another. If you have millions of trajectories to consider, that could mean trillions of comparisons, which takes a lot of time and computer power, Danny says.
“We’ve developed a way to store and represent trajectories so that computers can compare them all at once in a very fast and effective manner,” he says. Instead of trillions of comparisons, the software does the same job in millions of comparisons, which is manageable.
An analyst concerned about the number of airliners stuck in holding patterns could ask Tracktable about aircraft trajectories that made a certain pattern of turns. Tracktable then calculates geometric features, such as the number of 90-degree turns an aircraft flew or the length of a straight line. By associating a type of motion with these features and assigning a number to each feature, the computer can quickly group flights that behave in similar ways and show them to the viewer for interpretation.
“If you have millions and you’re not interested in precise comparisons, but general groupings of them, this is very effective,” Danny says.
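The grouping idea Danny describes can be sketched in a few lines: reduce each trajectory to a small feature vector, then bucket trajectories by that vector so similar motion is found with a lookup instead of pairwise comparison. The two features below (sharp-turn count and path length) and the turn threshold are illustrative choices, not Tracktable's actual ones.

```python
import math
from collections import defaultdict

def heading(p, q):
    """Bearing in degrees from point p to point q (points as simple x/y pairs)."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

def features(trajectory, sharp_turn_deg=80.0):
    """Reduce a trajectory (list of (x, y) points) to a feature vector:
    (number of sharp turns, total path length)."""
    sharp_turns = 0
    for a, b, c in zip(trajectory, trajectory[1:], trajectory[2:]):
        # Turn angle at b, wrapped into [0, 180] degrees.
        turn = abs((heading(b, c) - heading(a, b) + 180) % 360 - 180)
        if turn >= sharp_turn_deg:
            sharp_turns += 1
    length = sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))
    return (sharp_turns, round(length, 1))

def group_by_features(trajectories):
    """Bucket trajectories by feature vector: similar motion lands in the
    same bucket without any trajectory-to-trajectory comparison."""
    groups = defaultdict(list)
    for name, traj in trajectories.items():
        groups[features(traj)].append(name)
    return groups
```

Because each trajectory is touched once, a million trajectories cost a million feature computations rather than the roughly trillion pairwise comparisons the naive approach would require.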
PANTHER also examined the predictive capability of the information buried in data. If an analyst looks at the first half of a flight, considers historical data about similar flight paths, and then looks at the second half of the flight, any deviation from the pattern might cue an analyst to take a closer look. Finding that outlier from millions of flights that have flown before takes about a second with Tracktable, Danny says. The analyst is alerted because PANTHER team members are using the advances in cognitive science to design visual results that will highlight the odd behavior of the single aircraft. By studying how analysts use visual data, Sandia researchers are figuring out ways to make an outlier pop out of a screen full of detail to demand an analyst’s attention.
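One simple way to flag such a deviation, sketched here as a hedged stand-in for PANTHER's actual method, is to score a single-number summary of a flight's second half (say, total turn angle) against the historical distribution of flights whose first halves looked similar, and alert the analyst past a threshold:

```python
import statistics

def deviation_score(observed, historical):
    """How many standard deviations the observed second-half summary sits
    from the historical distribution. The summary statistic and names
    here are illustrative."""
    mu = statistics.mean(historical)
    sigma = statistics.pstdev(historical)
    if sigma == 0:
        return 0.0 if observed == mu else float("inf")
    return abs(observed - mu) / sigma

def flag_for_analyst(observed, historical, threshold=3.0):
    """True when the flight deviates enough from precedent to surface it."""
    return deviation_score(observed, historical) >= threshold
```

A flight matching the historical pattern scores near zero and stays quiet; the one outlier among millions scores high and is the single aircraft the visual design then makes pop out of the screen.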
The team is now looking at integrating motion and trajectories into a system called GeoGraphy.
GeoGraphy helps analysts search for items of interest, shows changes over time
GeoGraphy, initially funded by NNSA, is a software system that converts remote sensing images expressed in pixels into nodes and edges in a graph to show changes over time and make the data searchable, says Randy Brost (1462), a computer scientist who led the team that developed the software.
GeoGraphy breaks the images into categories, such as buildings, trees, or rivers. This pre-processing creates something like a very complex paint-by-number that shows the categories of everything visible in the image. The program uses nodes and edges — nodes are analogous to the beige hubs in Tinkertoys, while edges are the colored connecting rods — to describe relationships between objects, such as distance or time, Randy and Kristina say.
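A toy stand-in for that pixel-to-graph conversion, with made-up region names and a distance threshold chosen purely for illustration (this is not GeoGraphy's real data model): categorized image regions become nodes, and an edge records the distance relationship between any two regions that are near each other.

```python
import math

def build_scene_graph(regions, max_distance=100.0):
    """regions: {name: (category, (x, y) centroid)}. Returns an edge list
    connecting every pair of regions closer than max_distance, so spatial
    relationships become searchable graph structure instead of pixels."""
    edges = []
    names = sorted(regions)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            d = math.dist(regions[a][1], regions[b][1])
            if d <= max_distance:
                edges.append((a, b, round(d, 1)))
    return edges
```

With the scene in this form, a question like "which buildings sit near a parking lot?" is a graph query rather than an image-processing job.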
In addition to the imagery, the software package could include such information as phone books or county records to provide a single searchable database of all the information that shows what’s changed over time.
For example, to find a high school, the analyst tells the program to search for large buildings near regions that look like parking lots, football fields, and tennis courts and defines those items. The analyst then can choose from among the results the computer provides.
The system is hierarchical, so once analysts identify high schools, they can ask the program to find high schools the next time without describing them. And should they doubt that something is a high school, the software makes the raw data available so they can verify the results, Randy says.
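A minimal sketch of such a query, assuming the imagery has already been converted to a graph of categorized nodes; the function, category names, and saved-pattern dictionary are hypothetical illustrations, not GeoGraphy's real API:

```python
def match_pattern(graph, categories, anchor_category, required_nearby):
    """Find nodes of anchor_category whose graph neighbors include at least
    one node of each category in required_nearby.
    graph: {node: set of neighbor nodes}; categories: {node: category}."""
    hits = []
    for node, cat in categories.items():
        if cat != anchor_category:
            continue
        nearby = {categories[n] for n in graph.get(node, ())}
        if all(req in nearby for req in required_nearby):
            hits.append(node)
    return hits

# Once a pattern matches, it can be saved under a name and reused without
# re-describing it -- the hierarchy described above.
PATTERNS = {
    "high_school": ("large_building",
                    {"parking_lot", "football_field", "tennis_court"}),
}
```

The raw nodes behind each hit remain available, so an analyst doubting a result can drill back down to the underlying imagery to verify it.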
“The purpose of these codes — GeoGraphy and Tracktable — is to assist humans, not to replace them or to automatically do their jobs. It’s to enhance their ability to do their jobs well and to allow them to be more effective in dealing with large sets of evidence,” Randy says. “In the end, basically they are suggestion systems that say, ‘Hey, based on what you told me you’re interested in, you ought to look here, here, and here.’”
The PANTHER team also included researchers focused on enhancing the viewer experience. Researcher Laura Matzen and others are conducting cognitive science experiments to learn how analysts’ expertise affects their visual cognition and to create a model of how top-down visual attention — when a user approaches an image with a goal in mind — works. The researchers hope to use the answers they find to such fundamental cognitive science questions to inform the design of new tools to improve interactions between humans and computers, Laura says.
At the start of the three-year program, PANTHER’s diverse team of computer scientists and experts in sensors, human factors, digital signal processing, and machine learning had to take time to understand how each discipline approached the problem, Kristina says.
The team created a small shared workspace that “has done wonders to get everyone to be comfortable with each other,” she says.
PANTHER’s next steps
The prototype products and ideas developed under PANTHER are now ready for the next step in their development: to be tested in real-world environments, Kristina says.
The researchers have proposed follow-on work on new problems illuminated by PANTHER, while other agencies are building on the foundation it has laid. Other projects will apply PANTHER’s ideas to real-world problems, the researchers say.
“We went into PANTHER thinking we were going to do one thing, we’re going to improve the lives of image analysts,” Kristina says. “And, in the research process, we did a whole lot more.”