Reduced SWAP Hyperspectral Imaging Through Event-based Sensing
Abstract not provided.
Proceedings of SPIE - The International Society for Optical Engineering
In this paper, we develop a nested chi-squared likelihood ratio test for selecting among shrinkage-regularized covariance estimators for background modeling in hyperspectral imagery. Critical to many target and anomaly detection algorithms is the modeling and estimation of the underlying background signal present in the data. This is especially important in hyperspectral imagery, wherein the signals of interest often represent only a small fraction of the observed variance, for example when targets of interest are subpixel. This background is often modeled by a local or global multivariate Gaussian distribution, which necessitates estimating a covariance matrix. Maximum likelihood estimation of this matrix often overfits the available data, particularly in high-dimensional settings such as hyperspectral imagery, yielding subpar detection results. Instead, shrinkage estimators are often used to regularize the estimate. Shrinkage estimators linearly combine the overfit covariance with an underfit shrinkage target, thereby producing a well-fit estimator. These estimators introduce a shrinkage parameter, which controls the relative weighting between the covariance and the shrinkage target. Many methods have been proposed for setting this parameter, but comparing these methods and shrinkage values typically requires a cross-validation procedure, which can be computationally expensive and highly sample inefficient. Drawing from Bayesian regression methods, we compute the degrees of freedom of a covariance estimate using eigenvalue thresholding and employ a nested chi-squared likelihood ratio test for comparing estimators. This likelihood ratio test requires no cross-validation procedure and enables direct, computationally efficient comparison of different shrinkage estimates.
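To make the shrinkage construction concrete, here is a minimal sketch (not the paper's implementation): it forms the linear shrinkage estimate (1 − α)S + αT with a scaled-identity target, counts degrees of freedom by a simple eigenvalue-threshold rule, and compares two estimates with a nested chi-squared likelihood ratio test. The identity target, the fixed threshold, and all function names are illustrative assumptions.

```python
# Hedged sketch: nested chi-squared LRT between two shrinkage covariance
# estimates. Names, target, and threshold are assumptions for illustration.
import numpy as np
from scipy import stats

def shrinkage_covariance(X, alpha):
    """Linear shrinkage: (1 - alpha) * sample covariance + alpha * target."""
    S = np.cov(X, rowvar=False)
    target = (np.trace(S) / S.shape[0]) * np.eye(S.shape[0])  # assumed scaled-identity target
    return (1.0 - alpha) * S + alpha * target

def gaussian_loglik(X, Sigma):
    """Total log-likelihood of the (centered) data under N(mu, Sigma)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    _, logdet = np.linalg.slogdet(Sigma)
    quad = np.einsum('ij,jk,ik->', Xc, np.linalg.inv(Sigma), Xc)  # sum_i x_i^T Sigma^-1 x_i
    return -0.5 * (n * p * np.log(2.0 * np.pi) + n * logdet + quad)

def effective_dof(Sigma, threshold=1e-6):
    """Degrees of freedom counted by eigenvalue thresholding (illustrative rule)."""
    return int(np.sum(np.linalg.eigvalsh(Sigma) > threshold))

def nested_lrt(X, alpha_small, alpha_large):
    """Compare a less-shrunk (fuller) estimate against a more-shrunk (nested) one."""
    sig_full = shrinkage_covariance(X, alpha_small)
    sig_reduced = shrinkage_covariance(X, alpha_large)
    lam = 2.0 * (gaussian_loglik(X, sig_full) - gaussian_loglik(X, sig_reduced))
    ddof = max(effective_dof(sig_full) - effective_dof(sig_reduced), 1)
    return lam, stats.chi2.sf(lam, df=ddof)  # test statistic, p-value
```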
Applied Optics
We present the characterization of several atmospheric aerosol analogs in a tabletop chamber and an analysis of how the concentration of NaCl present in these aerosols influences their bulk optical properties. Atmospheric aerosols (e.g., fog and haze) degrade optical signals via light–aerosol interactions that cause scattering and absorption, which can be described by Mie theory. This attenuation is a function of the size distribution and number concentration of droplets in the light path. These properties are influenced by ambient conditions and the droplets' composition, as described by Köhler theory. It is therefore possible to tune the wavelength-dependent bulk optical properties of an aerosol by controlling droplet composition. We present experimentation wherein we generated multiple microphysically and optically distinct atmospheric aerosol analogs using salt water solutions with varying concentrations of NaCl. The results demonstrate that changing the NaCl concentration has a clear and predictable impact on the microphysical and optical properties of the aerosol.
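As a rough illustration of how droplet size distribution and number concentration set bulk attenuation, the following sketch applies Beer–Lambert extinction with an assumed extinction efficiency Q_ext ≈ 2 (the large-droplet, geometric-optics limit) in place of a full Mie calculation; the binned distribution, concentrations, and function names are hypothetical.

```python
# Illustrative Beer-Lambert transmission estimate for a binned droplet
# size distribution; Q_ext ~ 2 stands in for a full Mie computation.
import numpy as np

def extinction_coefficient(radii_um, counts_per_cm3, q_ext=2.0):
    """Bulk extinction coefficient (1/m) from binned droplet radii.

    radii_um       : bin-center droplet radii (micrometers)
    counts_per_cm3 : number concentration per bin (cm^-3)
    q_ext          : extinction efficiency; ~2 for droplets much larger
                     than the wavelength (assumed, not a Mie result)
    """
    sigma_m2 = q_ext * np.pi * (radii_um * 1e-6) ** 2  # cross section per droplet (m^2)
    n_m3 = counts_per_cm3 * 1e6                        # cm^-3 -> m^-3
    return float(np.sum(n_m3 * sigma_m2))

def transmission(beta_ext, path_m):
    """Beer-Lambert transmission over a path length in meters."""
    return np.exp(-beta_ext * path_m)

# Example: a narrow, fog-like distribution around 5 um radius (assumed values)
radii = np.array([3.0, 5.0, 8.0])      # um
counts = np.array([20.0, 60.0, 10.0])  # cm^-3
beta = extinction_coefficient(radii, counts)
print(f"beta_ext = {beta:.4f} 1/m, T(50 m) = {transmission(beta, 50.0):.3e}")
```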
Applied Optics
Fogs, low-lying clouds, and other highly scattering environments pose a challenge for many commercial and national security sensing systems. Current autonomous systems rely on optical sensors for navigation, and their performance is degraded by highly scattering environments. In our previous simulation work, we showed that polarized light can penetrate a scattering environment such as fog. We demonstrated that circularly polarized light maintains its initial polarization state better than linearly polarized light, even through large numbers of scattering events and thus over long ranges. This has recently been experimentally verified by other researchers. In this work, we present the design, construction, and testing of active polarization imagers at short-wave infrared and visible wavelengths. We explore multiple polarimetric configurations for the imagers, focusing on linear and circular polarization states. The polarized imagers were tested at the Sandia National Laboratories Fog Chamber under realistic fog conditions. We show that active circular polarization imagers increase range and contrast in fog more than linear polarization imagers do. We show that when imaging typical road sign and safety retro-reflective films, circularly polarized imaging has enhanced contrast throughout most fog densities/ranges compared to linearly polarized imaging and can penetrate over 15 to 25 m into the fog beyond the range limit of linearly polarized imaging, with a strong dependence on the interaction of the polarization state with the target materials.
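A minimal sketch of the kind of per-pixel metric such a comparison could use, assuming co- and cross-polarized frames from an active imager; this is not the paper's processing chain, and the function names are illustrative.

```python
# Hedged sketch: polarization contrast from co-/cross-polarized frames,
# applicable to either linear or circular analyzer configurations.
import numpy as np

def polarization_contrast(co, cross, eps=1e-9):
    """Per-pixel degree-of-polarization-style contrast in [-1, 1].

    For circular states this is analogous to S3/S0; for linear states,
    S1/S0 with the analyzer aligned/crossed to the illumination.
    """
    co = co.astype(np.float64)
    cross = cross.astype(np.float64)
    return (co - cross) / (co + cross + eps)

def target_background_contrast(dop_image, target_mask, background_mask):
    """Michelson-style contrast between mean target and background responses."""
    t = dop_image[target_mask].mean()
    b = dop_image[background_mask].mean()
    return abs(t - b) / (abs(t) + abs(b) + 1e-9)
```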
AIAA SCITECH 2023 Forum
As the path towards Urban Air Mobility (UAM) continues to take shape, there are outstanding technical challenges to achieving safe and effective air transportation operations under this new paradigm. To inform and guide technology development for UAM, NASA is investigating the current state of the art in key technology areas, including traffic management, detect-and-avoid, and autonomy. In support of this effort, a new perception testbed was developed at NASA Ames Research Center to collect data from an array of sensing systems representative of those that could be found on a future UAM vehicle. This testbed, featuring a Light-Detection-and-Ranging (LIDAR) instrument, a long-wave infrared sensor, and a visible spectrum camera, was deployed for a multiday test campaign in the Fog Chamber at Sandia National Laboratories (SNL) in Albuquerque, New Mexico. During the test campaign, fog conditions were created for tests with targets including a human, a resolution chart, and a small unmanned aerial vehicle (sUAV). This paper describes in detail the developed perception testbed, the experimental setup in the fog chamber, and the resulting data, and presents an initial result from analysis of the data evaluating methods to increase contrast through filtering techniques.
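As one hedged example of a contrast-through-filtering step (the abstract does not specify which filters were evaluated), the sketch below applies contrast-limited adaptive histogram equalization (CLAHE) via OpenCV and scores a region of interest with Michelson contrast; parameter values and function names are assumptions.

```python
# Hedged sketch: CLAHE contrast enhancement for a fog-degraded frame,
# plus a Michelson contrast score for a region of interest.
import cv2
import numpy as np

def enhance_contrast(frame_gray_u8, clip_limit=2.0, tile=(8, 8)):
    """Apply CLAHE to an 8-bit grayscale frame (parameters assumed)."""
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile)
    return clahe.apply(frame_gray_u8)

def michelson_contrast(img, roi):
    """Michelson contrast inside a (row0, row1, col0, col1) region."""
    r0, r1, c0, c1 = roi
    patch = img[r0:r1, c0:c1].astype(np.float64)
    return (patch.max() - patch.min()) / (patch.max() + patch.min() + 1e-9)
```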
Proceedings of SPIE - The International Society for Optical Engineering
Event-based sensors are a novel sensing technology that captures the dynamics of a scene via pixel-level change detection. This technology operates with high speed (>10 kHz), low latency (10 µs), low power consumption (<1 W), and high dynamic range (120 dB). Compared to conventional, frame-based architectures that report data for every pixel at a fixed frame rate, event-based sensor pixels report data only when a change in pixel intensity occurs. This affords the possibility of dramatically reducing the data reported in bandwidth-limited environments (e.g., remote sensing), and thus the data that must be processed, while still recovering significant events. Degraded visual environments, such as those generated by fog, often hinder situational awareness by decreasing optical resolution and transmission range via random scattering of light. To respond to this challenge, we present the deployment of an event-based sensor in a controlled, experimentally generated, well-characterized degraded visual environment (a fog analogue) to detect a modulated signal and to compare data collected from the event-based sensor with data from a traditional framing sensor.
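A minimal sketch of one way to detect a temporally modulated source in an event stream: bin event timestamps into a rate signal and locate the modulation frequency with an FFT. The bin width, event format, and function names are assumptions, not the paper's method.

```python
# Hedged sketch: modulation detection from event timestamps via a
# rate histogram and FFT peak search.
import numpy as np

def event_rate(timestamps_s, bin_width_s=1e-3):
    """Histogram event timestamps into a uniform event-rate time series."""
    t = np.asarray(timestamps_s)
    n_bins = max(int(np.ceil((t.max() - t.min()) / bin_width_s)), 1)
    counts, _ = np.histogram(t, bins=n_bins)
    return counts, 1.0 / bin_width_s  # counts per bin, sample rate (Hz)

def dominant_frequency(rate_counts, sample_rate_hz):
    """Strongest nonzero frequency in the event-rate spectrum (Hz)."""
    spectrum = np.abs(np.fft.rfft(rate_counts - rate_counts.mean()))
    freqs = np.fft.rfftfreq(len(rate_counts), d=1.0 / sample_rate_hz)
    return freqs[1:][np.argmax(spectrum[1:])]  # skip the DC bin
```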