Publications

Influence of NaCl concentration on the optical scattering properties of water-based aerosols

Applied Optics

Pattyn, Christian A.; Wright, Jeremy B.; Laros, James H.; Redman, Brian J.; Vander Laan, John D.; Glen, Andrew G.; Sanchez, A.L.; Westlake, Karl W.; Patel, Lekha P.; Bentz, Brian Z.

We present the characterization of several atmospheric aerosol analogs in a tabletop chamber and an analysis of how the concentration of NaCl present in these aerosols influences their bulk optical properties. Atmospheric aerosols (e.g., fog and haze) degrade optical signals via light–aerosol interactions that cause scattering and absorption, which can be described by Mie theory. This attenuation is a function of the size distribution and number concentration of droplets in the light path. These properties are influenced by ambient conditions and the droplets' composition, as described by Köhler theory. It is therefore possible to tune the wavelength-dependent bulk optical properties of an aerosol by controlling droplet composition. We present experimentation wherein we generated multiple microphysically and optically distinct atmospheric aerosol analogs using salt water solutions with varying concentrations of NaCl. The results demonstrate that changing the NaCl concentration has a clear and predictable impact on the microphysical and optical properties of the aerosol.
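
As a rough illustration of the Mie/Beer–Lambert picture in this abstract, the sketch below estimates a bulk extinction coefficient and transmission from an assumed droplet size distribution, using the large-droplet limit Q_ext ≈ 2. The radii, number concentrations, and path length are illustrative assumptions, not values from the paper; in the paper's framing, changing the NaCl concentration shifts these droplet microphysics via Köhler theory.

```python
# Minimal sketch: bulk extinction of a water-droplet aerosol via Beer-Lambert,
# assuming the geometric-optics limit Q_ext ~ 2 for droplets much larger than
# the wavelength. All numbers are illustrative assumptions, not data from the paper.
import numpy as np

radii_um = np.array([1.0, 2.0, 4.0, 8.0])         # assumed droplet radii (micrometers)
number_cm3 = np.array([200.0, 100.0, 40.0, 5.0])  # assumed number concentration per bin (cm^-3)
q_ext = 2.0                                       # extinction efficiency, large-size-parameter limit
path_m = 10.0                                     # assumed optical path length (m)

# Extinction coefficient: beta = sum_i N_i * Q_ext * pi * r_i^2  (converted to 1/m)
cross_sections_m2 = np.pi * (radii_um * 1e-6) ** 2
number_m3 = number_cm3 * 1e6
beta_ext = np.sum(number_m3 * q_ext * cross_sections_m2)

transmission = np.exp(-beta_ext * path_m)         # Beer-Lambert attenuation
print(f"extinction coefficient: {beta_ext:.4f} 1/m")
print(f"transmission over {path_m:.0f} m: {transmission:.3f}")
```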

Increased range and contrast in fog with circularly polarized imaging

Applied Optics

Vander Laan, John D.; Redman, Brian J.; Segal, Jacob W.; Westlake, Karl W.; Wright, Jeremy B.; Bentz, Brian Z.

Fog, low-lying clouds, and other highly scattering environments pose a challenge for many commercial and national security sensing systems. Current autonomous systems rely on optical sensors for navigation whose performance is degraded by highly scattering environments. In our previous simulation work, we have shown that polarized light can penetrate through a scattering environment such as fog. We have demonstrated that circularly polarized light maintains its initial polarization state better than linearly polarized light, even after large numbers of scattering events and thus over longer ranges. This has recently been verified experimentally by other researchers. In this work, we present the design, construction, and testing of active polarization imagers at short-wave infrared and visible wavelengths. We explore multiple polarimetric configurations for the imagers, focusing on linear and circular polarization states. The polarized imagers were tested at the Sandia National Laboratories Fog Chamber under realistic fog conditions. We show that active circular polarization imagers can increase range and contrast in fog more than linear polarization imagers. When imaging typical road sign and safety retro-reflective films, circularly polarized imaging provides enhanced contrast throughout most fog densities and ranges compared to linearly polarized imaging and can penetrate an additional 15 to 25 m into the fog beyond the range limit of linearly polarized imaging, with a strong dependence on the interaction of the polarization state with the target materials.
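
The contrast comparisons reported above can be illustrated with a simple metric. The sketch below computes Michelson contrast between a target patch and a background patch for two hypothetical image channels; the images, region slices, and intensity values are placeholders, and this is a generic contrast metric rather than the authors' processing chain.

```python
# Generic Michelson-contrast sketch for comparing polarimetric image channels.
# The image arrays and region slices are hypothetical placeholders.
import numpy as np

def michelson_contrast(image, target_slice, background_slice):
    """Contrast between mean target and background intensity: (It - Ib) / (It + Ib)."""
    i_t = image[target_slice].mean()
    i_b = image[background_slice].mean()
    return (i_t - i_b) / (i_t + i_b)

rng = np.random.default_rng(0)
# Hypothetical co-polarized images from a linear and a circular polarization imager.
linear_img = rng.normal(loc=100.0, scale=5.0, size=(256, 256))
circular_img = rng.normal(loc=100.0, scale=5.0, size=(256, 256))
# Assume a retro-reflective target occupies the central patch.
linear_img[96:160, 96:160] += 10.0
circular_img[96:160, 96:160] += 25.0   # assumed stronger polarization retention

target = (slice(96, 160), slice(96, 160))
background = (slice(0, 64), slice(0, 64))
print("linear contrast:  ", michelson_contrast(linear_img, target, background))
print("circular contrast:", michelson_contrast(circular_img, target, background))
```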

Computational Imaging for Intelligence in Highly Scattering Aerosols (Final Report)

Bentz, Brian Z.; Wright, Jeremy B.; Laros, James H.; Sanchez, A.L.; Pattyn, Christian A.; Redman, Brian J.; Deneke, Elihu; Glen, Andrew G.; Westlake, Karl W.; Hastings, Ryan L.; Lacny, Christopher M.; Alexander, David W.; Webb, Kevin J.

Natural and man-made degraded visual environments pose major threats to national security. The random scattering and absorption of light by tiny particles suspended in the air reduce situational awareness and cause unacceptable down-time for critical systems and operations. To improve the situation, we have developed several approaches to interpreting the information contained within scattered light to enhance sensing and imaging in scattering media. These approaches were tested at the Sandia National Laboratory Fog Chamber facility and with tabletop fog chambers.

Computationally efficient light transport models were developed and leveraged for computational sensing. The models are based on a weak angular dependence approximation to the Boltzmann or radiative transfer equation that appears to be applicable in both the moderate and highly scattering regimes. After the new model was experimentally validated, statistical approaches for detection, localization, and imaging of objects hidden in fog were developed and demonstrated. A binary hypothesis test and the Neyman-Pearson lemma provided the highest theoretically possible probability of detection for a specified false alarm rate and signal-to-noise ratio. Maximum likelihood estimation allowed estimation of the fog optical properties as well as the position, size, and reflection coefficient of an object in fog. A computational dehazing approach was implemented to reduce the effects of scatter on images, making object features more readily discernible.

We have developed, characterized, and deployed a new Tabletop Fog Chamber capable of repeatably generating multiple unique fog analogues for optical testing in degraded visual environments. We characterized this chamber using both optical and microphysical techniques. In doing so, we explored the ability of droplet nucleation theory to describe the aerosols generated within the chamber and of Mie scattering theory to describe the attenuation of light by those aerosols, and we correlated the aerosol microphysics to optical properties such as transmission and meteorological optical range (MOR). This chamber has proved highly valuable and has supported multiple efforts, both within and beyond this LDRD project, to test optics in degraded visual environments.

Circularly polarized light has been found to maintain its polarization state better than linearly polarized light when propagating through fog. This was demonstrated experimentally in both the visible and short-wave infrared (SWIR) by imaging targets made of different commercially available retroreflective films. It was found that active circularly polarized imaging can increase contrast and range compared to linearly polarized imaging.

We have completed an initial investigation of the capability of machine learning methods to reduce the effects of light scattering when imaging through fog. Previously acquired experimental long-wave infrared images were used to train a denoising autoencoder architecture. Overfitting was found to be a problem because of the lack of variability in object type in this dataset. The lessons learned were used to collect a well-labeled dataset with much more variability using the Tabletop Fog Chamber that will be available for future studies.

We have developed several new sensing methods using speckle intensity correlations. First, the ability to image moving objects in fog was shown, establishing that our unique speckle imaging method can be implemented in dynamic scattering media. Second, the speckle decorrelation over time was found to be sensitive to fog composition, implying extensions to fog characterization. Third, the ability to distinguish macroscopically identical objects on a far-subwavelength scale was demonstrated, suggesting numerous applications ranging from nanoscale defect detection to security. Fourth, we showed the capability to simultaneously image and localize hidden objects, allowing the speckle imaging method to be effective without prior object positional information. Finally, an interferometric effect was presented that illustrates a new approach for analyzing speckle intensity correlations that may lead to more effective ways to localize and image moving objects. All of these results represent significant developments that challenge the limits of speckle imaging and open important application spaces.

A theory was developed and simulations were performed to assess the potential transverse resolution benefit of relative motion in structured illumination for radar systems. Results for a simplified radar system model indicate that significant resolution benefits are possible using data from scanning a structured beam over the target, with appropriate signal processing.
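
As a minimal illustration of the Neyman-Pearson test mentioned above, the sketch below sets a detection threshold for a known mean shift in Gaussian noise from a specified false-alarm rate and computes the resulting probability of detection. The signal level, noise level, and false-alarm rate are assumptions; the report's actual test is built on its fog light-transport model.

```python
# Neyman-Pearson detection sketch for a known mean shift in Gaussian noise.
# The threshold is set from the allowed false-alarm rate; Pd follows from the SNR.
# Signal amplitude, noise sigma, and Pfa are illustrative assumptions.
import numpy as np
from scipy.stats import norm

signal = 1.0e-3      # assumed mean detector response with an object present
sigma = 4.0e-4       # assumed measurement noise standard deviation
p_fa = 1.0e-3        # specified false-alarm probability

# H0: x ~ N(0, sigma^2);  H1: x ~ N(signal, sigma^2).
# The likelihood-ratio test reduces to thresholding x itself.
threshold = sigma * norm.isf(p_fa)              # gives P(x > threshold | H0) = p_fa
p_d = norm.sf((threshold - signal) / sigma)     # probability of detection under H1

print(f"threshold: {threshold:.3e}")
print(f"probability of detection: {p_d:.3f}")
```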

Incorporating the effects of objects in an approximate model of light transport in scattering media

Optics Letters

Bentz, Brian Z.; Pattyn, Christian A.; Laros, James H.; Redman, Brian J.; Glen, Andrew G.; Sanchez, A.L.; Westlake, Karl W.; Wright, Jeremy B.

A computationally efficient radiative transport model is presented that predicts a camera measurement and accounts for the light reflected and blocked by an object in a scattering medium. The model is in good agreement with experimental data acquired at the Sandia National Laboratory Fog Chamber Facility (SNLFC). The model is applicable in computational imaging to detect, localize, and image objects hidden in scattering media. Here, a statistical approach was implemented to study object detection limits in fog.

Optical characterization of the Sandia fog facility for computational sensing

Optics InfoBase Conference Papers

Bentz, Brian Z.; Pattyn, Christian A.; Redman, Brian J.; Laros, James H.; Deneke, Elihu; Sanchez, A.L.; Westlake, Karl W.; Wright, Jeremy B.

We present optical metrology at the Sandia fog chamber facility. Repeatable and well-characterized fogs are generated under different atmospheric conditions and used for light transport model validation and computational sensing development.

Approximate Model of Light Transport in Scattering Media for Computational Sensing in Fog and Tissue

2022 Conference on Lasers and Electro-Optics, CLEO 2022 - Proceedings

Bentz, Brian Z.; Pattyn, Christian A.; Laros, James H.; Redman, Brian J.; Sanchez, A.L.; Westlake, Karl W.; Wright, Jeremy B.

We present a computationally efficient approximate solution to the time-resolved radiative transfer equation that is applicable in both weakly and diffusely scattering heterogeneous media. Applications will be considered, including computational sensing in fog and tissue.

Data Fusion of Very High Resolution Hyperspectral and Polarimetric SAR Imagery for Terrain Classification

West, Roger D.; Yocky, David A.; Laros, James H.; Anderson, Dylan Z.; Redman, Brian J.

Performing terrain classification with data from heterogeneous imaging modalities is a very challenging problem. The challenge is further compounded at very high spatial resolution, which in this paper we consider to be much less than a meter. At very high resolution many additional complications arise, such as geometric differences between imaging modalities and heightened pixel-by-pixel variability due to inhomogeneity within terrain classes. In this paper we consider the fusion of very high resolution hyperspectral imaging (HSI) and polarimetric synthetic aperture radar (PolSAR) data. We introduce a framework that utilizes the probabilistic feature fusion (PFF) one-class classifier for data fusion and demonstrate the effect of making pixelwise, superpixel, and pixelwise-voting (within a superpixel) terrain classification decisions. We show that fusing the imaging modality data sets, combined with pixelwise voting within the spatial extent of superpixels, yields a robust terrain classification framework that strikes a good balance between quantitative and qualitative results.
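
As a hedged illustration of fusing two modalities, the sketch below combines per-pixel class probabilities from two sources with a simple product rule and renormalization. This is a generic probabilistic fusion stand-in, not the PFF one-class classifier itself, and the array shapes and values are made up.

```python
# Generic late-fusion sketch: combine per-pixel class probabilities from two
# modalities (e.g., HSI and PolSAR) by multiplying them and renormalizing.
# This is a simple stand-in for probabilistic fusion, not the PFF classifier.
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_classes = 6, 3

# Hypothetical per-pixel class probabilities from each modality.
p_hsi = rng.dirichlet(np.ones(n_classes), size=n_pixels)
p_sar = rng.dirichlet(np.ones(n_classes), size=n_pixels)

# Product rule (assumes conditionally independent modalities), then renormalize.
fused = p_hsi * p_sar
fused /= fused.sum(axis=1, keepdims=True)

labels = fused.argmax(axis=1)
print("fused per-pixel class labels:", labels)
```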

Light transport with weak angular dependence in fog

Optics Express

Bentz, Brian Z.; Redman, Brian J.; Laros, James H.; Westlake, Karl W.; Glen, Andrew G.; Sanchez, A.L.; Wright, Jeremy B.

Random scattering and absorption of light by tiny particles in aerosols, like fog, reduce situational awareness and cause unacceptable down-time for critical systems or operations. Computationally efficient light transport models are desired for computational imaging to improve remote sensing capabilities in degraded optical environments. To this end, we have developed a model based on a weak angular dependence approximation to the Boltzmann or radiative transfer equation that appears to be applicable in both the moderate and highly scattering regimes, thereby covering the applicability domain of both the small angle and diffusion approximations. An analytic solution was derived and validated using experimental data acquired at the Sandia National Laboratory Fog Chamber facility. The evolution of the fog particle density and size distribution was measured and used to determine macroscopic absorption and scattering properties using Mie theory. A three-band (0.532, 1.55, and 9.68 μm) transmissometer with lock-in amplifiers enabled changes in fog density of over an order of magnitude to be measured due to the increased transmission at longer wavelengths, covering both the moderate and highly scattering regimes. The meteorological optical range parameter is shown to be about 0.6 times the transport mean free path length, suggesting an improved physical interpretation of this parameter.
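
The reported relation between meteorological optical range (MOR) and the transport mean free path can be sketched from their standard definitions. The scattering coefficient, absorption coefficient, and asymmetry parameter g below are assumed values, not the paper's measurements; with negligible absorption the ratio works out to roughly 3(1 - g), which is about 0.6 for an assumed g of 0.8.

```python
# Sketch relating meteorological optical range (MOR, 5% transmission definition)
# to the transport mean free path, under assumed fog optical properties.
# mu_s, mu_a, and g below are illustrative assumptions, not measured values.
import numpy as np

mu_s = 0.03    # assumed scattering coefficient (1/m)
mu_a = 1e-5    # assumed absorption coefficient (1/m), negligible in the visible
g = 0.8        # assumed Mie asymmetry parameter for fog droplets

beta_ext = mu_s + mu_a
mor = np.log(1 / 0.05) / beta_ext               # MOR: path reducing transmission to 5%
l_transport = 1.0 / (mu_s * (1 - g) + mu_a)     # transport mean free path

print(f"MOR:                      {mor:.1f} m")
print(f"transport mean free path: {l_transport:.1f} m")
print(f"ratio MOR / l*:           {mor / l_transport:.2f}")  # ~3*(1-g) when absorption is negligible
```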

Detection and localization of objects hidden in fog

Proceedings of SPIE - The International Society for Optical Engineering

Bentz, Brian Z.; Laros, James H.; Glen, Andrew G.; Pattyn, Christian A.; Redman, Brian J.; Martinez-Sanchez, Andres M.; Westlake, Karl W.; Hastings, Ryan L.; Webb, Kevin J.; Wright, Jeremy B.

Degraded visual environments like fog pose a major challenge to safety and security because light is scattered by tiny particles. We show that by interpreting the scattered light it is possible to detect, localize, and characterize objects normally hidden in fog. First, a computationally efficient light transport model is presented that accounts for the light reflected and blocked by an opaque object. Then, statistical detection is demonstrated for a specified false alarm rate using the Neyman-Pearson lemma. Finally, object localization and characterization are implemented using the maximum likelihood estimate. These capabilities are being tested at the Sandia National Laboratory Fog Chamber Facility.
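
As a sketch of the maximum likelihood step described above, the code below estimates a single object-position parameter by minimizing a Gaussian negative log-likelihood with a generic optimizer. The forward model is a hypothetical placeholder, not the paper's light transport model, and all numbers are assumed.

```python
# Maximum-likelihood sketch: estimate an object position by minimizing a Gaussian
# negative log-likelihood. The forward model is a hypothetical placeholder
# standing in for the paper's light-transport model.
import numpy as np
from scipy.optimize import minimize

def forward_model(position, detector_x):
    """Hypothetical stand-in: signal falls off with object-detector separation."""
    return 1.0 / (1.0 + (detector_x - position) ** 2)

rng = np.random.default_rng(2)
detector_x = np.linspace(-5.0, 5.0, 21)    # assumed detector positions
true_position = 1.3
sigma = 0.02                               # assumed noise standard deviation
data = forward_model(true_position, detector_x) + sigma * rng.normal(size=detector_x.size)

def negative_log_likelihood(params):
    residual = data - forward_model(params[0], detector_x)
    return 0.5 * np.sum((residual / sigma) ** 2)

result = minimize(negative_log_likelihood, x0=[0.0], method="Nelder-Mead")
print(f"estimated position: {result.x[0]:.2f} (true {true_position})")
```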

Optical and Polarimetric SAR Data Fusion Terrain Classification Using Probabilistic Feature Fusion

International Geoscience and Remote Sensing Symposium (IGARSS)

West, Roger D.; Yocky, David A.; Redman, Brian J.; Laros, James H.; Anderson, Dylan Z.

Deciding on an imaging modality for terrain classification can be a challenging problem. For some terrain classes a given sensing modality may discriminate well, but it may not have the same performance on other classes that a different sensor may be able to separate easily. The most effective terrain classification will utilize the abilities of multiple sensing modalities. The challenge of utilizing multiple sensing modalities is then determining how to combine the information in a meaningful and useful way. In this paper, we introduce a framework for effectively combining data from optical and polarimetric synthetic aperture radar sensing modalities. We demonstrate the fusion framework for two vegetation classes and two ground classes and show that fusing data from both imaging modalities has the potential to improve terrain classification over either modality alone.

Performance evaluation of two optical architectures for task-specific compressive classification

Optical Engineering

Redman, Brian J.; Dagel, Amber L.; Sahakian, Meghan A.; LaCasse, Charles F.; Quach, Tu-Thach Q.; Birch, Gabriel C.

Many optical systems are used for specific tasks such as classification. Of these systems, the majority are designed to maximize image quality for human observers. However, machine learning classification algorithms do not require the same data representation used by humans. We investigate compressive optical systems optimized for a specific machine sensing task. Two compressive optical architectures are examined: an array of prisms and neutral density filters, where each prism and neutral density filter pair realizes one datum from an optimized compressive sensing matrix, and an architecture using conventional optics to image the aperture onto the detector, a prism array to divide the aperture, and a pixelated attenuation mask in the intermediate image plane. We discuss the design, simulation, and trade-offs of these systems built for compressed classification of the Modified National Institute of Standards and Technology (MNIST) dataset. Both architectures achieve classification accuracies within 3% of the optimized sensing matrix for compression ranging from 98.85% to 99.87%. The performance of the systems with 98.85% compression was between that of an F/2 and an F/4 imaging system in the presence of noise.
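
A minimal sketch of compressive classification, assuming a random sensing matrix in place of the optimized matrix described above: high-dimensional images are projected to a handful of measurements (here nine, which for 28 × 28 inputs corresponds to the 98.85% compression figure quoted above) and classified with a simple nearest-centroid rule on simulated, MNIST-like blobs rather than real digits.

```python
# Compressive-classification sketch: project images through a sensing matrix and
# classify the few measurements. A random matrix stands in for the paper's
# optimized sensing matrix, and the "digits" are simulated Gaussian blobs.
import numpy as np

rng = np.random.default_rng(3)
n_pixels = 28 * 28
n_measurements = 9                                  # e.g., a 3x3 prism/filter array
print(f"compression: {1 - n_measurements / n_pixels:.2%}")

# Simulated two-class data: blobs centered in different image quadrants.
def make_class(center, n):
    yy, xx = np.mgrid[0:28, 0:28]
    base = np.exp(-((xx - center[0]) ** 2 + (yy - center[1]) ** 2) / 30.0)
    return base.ravel() + 0.1 * rng.normal(size=(n, n_pixels))

x_train = np.vstack([make_class((9, 9), 50), make_class((19, 19), 50)])
y_train = np.array([0] * 50 + [1] * 50)
phi = rng.normal(size=(n_measurements, n_pixels)) / np.sqrt(n_pixels)  # sensing matrix

z_train = x_train @ phi.T                           # compressive measurements y = Phi x
centroids = np.vstack([z_train[y_train == c].mean(axis=0) for c in (0, 1)])

def classify(x):
    z = phi @ x
    return int(np.argmin(np.linalg.norm(centroids - z, axis=1)))  # nearest centroid

test = make_class((19, 19), 1)[0]
print("predicted class:", classify(test))
```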

Robust terrain classification of high spatial resolution remote sensing data employing probabilistic feature fusion and pixelwise voting

Proceedings of SPIE - The International Society for Optical Engineering

West, Roger D.; Redman, Brian J.; Yocky, David A.; Laros, James H.; Anderson, Dylan Z.

There are several factors that should be considered for robust terrain classification. We address the issue of high pixelwise variability within terrain classes in remote sensing data when the spatial resolution is less than one meter. Our proposed method segments an image into superpixels, makes terrain classification decisions on the pixels within each superpixel using the probabilistic feature fusion (PFF) classifier, and then makes a superpixel-level terrain classification decision by the majority vote of the pixels within the superpixel. We show that this method leads to improved terrain classification decisions. We demonstrate our method on optical, hyperspectral, and polarimetric synthetic aperture radar data.
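
The superpixel voting step can be sketched directly. The code below replaces per-pixel labels with the majority label inside each superpixel; the label map and superpixel map are small illustrative arrays rather than real classifier output.

```python
# Superpixel majority-vote sketch: per-pixel class decisions are replaced by the
# most common label inside each superpixel. The label map and superpixel map
# are small illustrative arrays, not real classifier output.
import numpy as np

# Hypothetical per-pixel labels (e.g., from a pixelwise classifier) and superpixel ids.
pixel_labels = np.array([[0, 0, 1, 1],
                         [0, 2, 1, 1],
                         [2, 2, 2, 1],
                         [2, 2, 2, 2]])
superpixels = np.array([[0, 0, 1, 1],
                        [0, 0, 1, 1],
                        [2, 2, 3, 3],
                        [2, 2, 3, 3]])

voted = np.empty_like(pixel_labels)
for sp in np.unique(superpixels):
    mask = superpixels == sp
    votes = np.bincount(pixel_labels[mask])
    voted[mask] = votes.argmax()          # majority label wins inside the superpixel

print(voted)
```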
