I am part of the Scalable Analysis and Visualization group at Sandia. I've been here since 2003. Before that I worked with Dinesh Manocha at the University of North Carolina at Chapel Hill's Department of Computer Science.
The most concise way to explain what I do is to say that I orbit the issue of large data. Analysis, visualization, interaction, acquisition and representation are all fair game. I will focus on one area or another according to the demands of the project at hand.
It all began innocuously enough. As a new graduate student in the GAMMA group at UNC, at my very first research group meeting, my advisor said to me, "Andy, we have this rendering system for a big power plant model. It takes a long time to load because it uses all the memory on our biggest machine. Would you see if you can do something about that?" That was in 1997. I'm still chasing the repercussions of that assignment and I have plenty to keep me busy for a long time to come.
Current Research Projects
Each of these will have its own project page in time.
Trajectory Analysis: Suppose you have a database of a few million trajectories with tens of thousands more coming in each day. How can you decide what's normal and what's abnormal?
Nested Narratives: From the Iliad to Star Wars to King Lear to the World Trade Center, grand stories take place on multiple scales. How can we build tools that will capture those stories at all scales?
Here are a few of the publications I've had the most fun with over the years. The full list is on its own page.
Exploring 2D Tensor Fields Using Stress Nets
IEEE Visualization 2005
In this article we describe stress nets, a technique for exploring 2D tensor fields. Our method allows a user to examine simultaneously the tensors' eigenvectors (both major and minor) as well as scalar-valued tensor invariants. By avoiding noise-advection techniques, we are able to display both principal directions of the tensor field as well as the derived scalars without cluttering the display. We present a GPU-only implementation of stress nets as well as a hybrid CPU/GPU approach and discuss the relative strengths and weaknesses of each.
Stress nets have been used as part of an investigation into crack propagation. They were used to display the directions of maximum shear in a slab of material under tension as well as the magnitude of the shear forces acting on each point. Our methods allowed users to find new features in the data that were not visible on standard plots of tensor invariants. These features disagree with commonly accepted analytical crack propagation solutions and have sparked renewed investigation. Though developed for a materials mechanics problem, our method applies equally well to any 2D tensor field having unique characteristic directions.
A Flexible Approach for the Statistical Visualization of Ensemble Data
IEEE Workshop on Knowledge Discovery from Climate Data: Prediction, Extremes
Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.
Simplifying Complex Environments Using Incremental Textured Depth Meshes
ACM Transactions on Graphics (TOG) - Proceedings of ACM SIGGRAPH 2003
We present an incremental algorithm to compute image-based simplifications of a large environment. We use an optimization-based approach to generate samples based on scene visibility, and from each viewpoint create textured depth meshes (TDMs) using sampled range panoramas of the environment. The optimization function minimizes artifacts such as skins and cracks in the reconstruction. We also present an encoding scheme for multiple TDMs that exploits spatial coherence among different viewpoints. The resulting simplifications, incremental textured depth meshes (ITDMs), reduce preprocessing, storage, rendering costs and visible artifacts. Our algorithm has been applied to large, complex synthetic environments comprising millions of primitives. It is able to render them at 20--40 frames per second on a PC with little loss in visual fidelity.
Whenever feasible, I release the algorithms I develop as open source. Some of these go into VTK or the mostly-defunct Titan informatics toolkit. Then there are the others...
PocketKML: Minimalist Python library for writing KML (the Google Earth file format). Similar in spirit to simplekml, but with the intent of requiring absolutely no external dependencies beyond the standard packages that ship with Python.
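PocketKML's own API isn't shown here, so as a rough illustration of what stdlib-only KML writing looks like, here is a hedged sketch that builds a single-placemark KML document using nothing but xml.etree.ElementTree from the Python standard library (the function name and arguments are my own invention, not PocketKML's interface):

```python
# Hypothetical sketch, NOT the PocketKML API: builds a minimal KML
# document with one placemark using only the Python standard library.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def make_placemark_kml(name, lon, lat, alt=0.0):
    """Return a KML document string containing a single placemark."""
    # Register the KML namespace as the default so elements serialize
    # without a prefix (e.g. <Placemark> rather than <ns0:Placemark>).
    ET.register_namespace("", KML_NS)
    kml = ET.Element("{%s}kml" % KML_NS)
    doc = ET.SubElement(kml, "{%s}Document" % KML_NS)
    pm = ET.SubElement(doc, "{%s}Placemark" % KML_NS)
    ET.SubElement(pm, "{%s}name" % KML_NS).text = name
    point = ET.SubElement(pm, "{%s}Point" % KML_NS)
    # KML coordinate order is longitude,latitude,altitude.
    ET.SubElement(point, "{%s}coordinates" % KML_NS).text = (
        "%f,%f,%f" % (lon, lat, alt)
    )
    return ET.tostring(kml, encoding="unicode")

print(make_placemark_kml("Albuquerque", -106.65, 35.08))
```

The output opens directly in Google Earth when saved with a .kml extension; a real library would add styles, folders, and geometry types on top of this skeleton.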
Project page coming soon.
I indulge in origami and occasionally turn exploratory data analysis into art projects. Pages coming soon.