Publications

Results 1–100 of 104

Implementing transition-edge sensors in a tabletop x-ray CT system for imaging applications

Alpert, Bradley; Becker, Daniel; Bennett, Douglas; Doriese, W.; Durkin, Malcolm; Fowler, Joseph; Gard, Johnathon; Imrek, Jozsef; Levine, Zachary; Mates, John; Miaja-Avila, Luis; Morgan, Kelsey; Nakamura, Nathan; O'Neil, Galen; Ortiz, Nathan; Reintsema, Carl; Schmidt, Daniel; Swetz, Daniel; Szypryt, Paul; Ullom, Joel; Vale, Leila; Weber, Joel; Wessels, Abigail; Dagel, Amber; Dalton, Gabriella; Foulk, James W.; Jimenez, Edward S.; Mcarthur, Daniel; Thompson, Kyle; Walker, Christopher; Wheeler, Jason; Ablerto, Julien; Griveau, Damien; Silvent, Jeremie

Abstract not provided.

Design and fabrication of multi-metal patterned target anodes for improved quality of hyperspectral X-ray radiography and computed tomography imaging systems

Proceedings of SPIE - The International Society for Optical Engineering

Foulk, James W.; Dalton, Gabriella; Wheeling, Rebecca; Thompson, Kyle; Jimenez, Edward S.

Applications such as counterfeit identification, quality control, and non-destructive material identification benefit from improved spatial and compositional analysis. X-ray Computed Tomography is used in these applications but is limited by the X-ray focal spot size and the lack of energy-resolved data. Recently developed hyperspectral X-ray detectors estimate photon energy, which enables composition analysis but lacks spatial resolution. Moving beyond bulk homogeneous transmission anodes toward multi-metal patterned anodes enables improvements in spatial resolution and signal-to-noise ratios in these hyperspectral X-ray imaging systems. We aim to design and fabricate transmission anodes that facilitate confirmation of previous simulation results. These anodes are fabricated on diamond substrates with conventional photolithography and metal deposition processes. The final transmission anode design consists of a cluster of three disjoint metal bumps selected from molybdenum, silver, samarium, tungsten, and gold. These metals are readily available in standard clean rooms and are chosen for their K-lines, which fall within distinct energy intervals of interest. The diamond substrate is chosen for its high thermal conductivity and high transmittance of X-rays. The feature size of the metal bumps is chosen such that the cluster is smaller than the 100 μm diameter of the impinging electron beam in the X-ray tube. This effectively shrinks the X-ray focal spot in the selected energy bands. Once fabricated, our transmission anode is packaged in a stainless-steel holder that can be retrofitted into our existing X-ray tube. Innovations in anode design enable an inexpensive and simple method to improve existing X-ray imaging systems.
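
As a quick illustration of the metal-selection rationale above, the sketch below checks that approximate K-alpha energies for the five candidate metals land in distinct detector energy windows. The window edges are hypothetical and the energies are rounded tabulated values; neither comes from the paper.

```python
# Illustrative check (not from the paper): approximate K-alpha1 energies (keV)
# for the candidate anode metals, and a hypothetical set of detector energy
# windows, to verify each metal's K-line falls in a distinct window.
K_ALPHA_KEV = {"Mo": 17.5, "Ag": 22.2, "Sm": 40.1, "W": 59.3, "Au": 68.8}

# Hypothetical energy-window edges in keV (assumption, not the system's actual bins).
WINDOWS = [(15, 20), (20, 30), (30, 50), (50, 65), (65, 80)]

def window_of(energy_kev):
    for lo, hi in WINDOWS:
        if lo <= energy_kev < hi:
            return (lo, hi)
    return None

assignments = {metal: window_of(e) for metal, e in K_ALPHA_KEV.items()}
assert len(set(assignments.values())) == len(assignments), "K-lines must land in distinct windows"
print(assignments)
```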

More Details

AirNet-SNL: End-to-end training of iterative reconstruction and deep neural network regularization for sparse-data XPCI CT

Optics InfoBase Conference Papers

Lee, Dennis J.; Mulcahy-Stanislawczyk, Johnathan; Jimenez, Edward S.; Goodner, Ryan N.; West, Roger D.; Epstein, Collin; Thompson, Kyle; Dagel, Amber

We present a deep learning image reconstruction method called AirNet-SNL for sparse-view computed tomography. It combines iterative reconstruction and convolutional neural networks with end-to-end training. Our model reduces streak artifacts from filtered back-projection with limited data, and it trains on randomly generated shapes. This work shows promise for generalizable learned image reconstruction.
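
For readers unfamiliar with end-to-end trained iterative reconstruction, the following is a minimal sketch of the general idea (an unrolled gradient-descent loop with a per-iteration CNN regularizer). It is not the authors' AirNet-SNL architecture, and the projector callables A/AT are assumed to be supplied externally.

```python
# Minimal sketch (assumptions throughout; not the authors' AirNet-SNL code):
# an unrolled gradient-descent reconstruction with a per-iteration CNN
# regularizer, trained end-to-end against reference images.
import torch
import torch.nn as nn

class UnrolledRecon(nn.Module):
    def __init__(self, n_iters=5):
        super().__init__()
        self.step = nn.Parameter(torch.full((n_iters,), 0.1))  # learned step sizes
        self.reg = nn.ModuleList(
            nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(32, 1, 3, padding=1))
            for _ in range(n_iters)
        )

    def forward(self, sino, A, AT, x0):
        """A/AT are user-supplied forward and back projectors (callables)."""
        x = x0
        for k, reg in enumerate(self.reg):
            x = x - self.step[k] * AT(A(x) - sino)  # data-consistency gradient step
            x = x + reg(x)                          # learned residual regularization
        return x
```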

More Details

Influence of Data Acquisition Algorithms on X-Ray Phase Contrast Imaging Computed Tomography

Journal of Nondestructive Evaluation, Diagnostics and Prognostics of Engineering Systems

Epstein, Collin; Goodner, Ryan N.; West, Roger D.; Thompson, Kyle; Dagel, Amber

X-ray phase contrast imaging (XPCI) is a nondestructive evaluation technique that enables high-contrast detection of low-attenuation materials that are largely transparent in traditional radiography. Extending a grating-based Talbot-Lau XPCI system to three-dimensional imaging with computed tomography (CT) imposes two motion requirements: the analyzer grating must translate transverse to the optical axis to capture image sets for XPCI reconstruction, and the sample must rotate to capture angular data for CT reconstruction. The acquisition algorithm choice determines the order of movement and positioning of the two stages. The choice of the image acquisition algorithm for XPCI CT is instrumental to collecting high fidelity data for reconstruction. We investigate how data acquisition influences XPCI CT by comparing two simple data acquisition algorithms and determine that capturing a full phase-stepping image set for a CT projection before rotating the sample results in higher quality data.
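
The two orderings compared in this study can be summarized by the loop structures sketched below; the stage and detector calls (rotate_sample, move_grating, capture) are hypothetical placeholders, not a real motion-control API.

```python
# Sketch of two acquisition orderings for grating-based XPCI CT.
def phase_steps_inner(angles, grating_steps, rotate_sample, move_grating, capture):
    """Complete a full phase-stepping image set at each CT angle before rotating."""
    data = {}
    for a in angles:
        rotate_sample(a)
        data[a] = []
        for s in grating_steps:
            move_grating(s)
            data[a].append(capture())
    return data

def rotations_inner(angles, grating_steps, rotate_sample, move_grating, capture):
    """Hold one grating position and sweep all CT angles before stepping again."""
    data = {a: [] for a in angles}
    for s in grating_steps:
        move_grating(s)
        for a in angles:
            rotate_sample(a)
            data[a].append(capture())
    return data
```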

More Details

Progress on building a laboratory based x-ray phase contrast imaging computed tomography system

AIP Conference Proceedings

Thompson, Kyle; Dagel, Amber; Goodner, Ryan N.; Epstein, Collin

Sandia National Laboratories is developing a laboratory-based x-ray phase contrast imaging (XPCI) computed tomography (CT) system. This system utilizes a Talbot-Lau interferometer based on in-house fabricated gratings and a conventional x-ray system. Initial work has focused on adding CT capabilities to a 28 keV XPCI system. A new set of gratings tuned for an x-ray energy of 100 keV is being developed. This new grating set will facilitate imaging denser components. System configuration details will be presented as well as a discussion of the challenges associated with building an XPCI CT system. Additionally, initial imaging results will be presented.

More Details

High-fidelity calibration and characterization of a spectral computed tomography system

Proceedings of SPIE - The International Society for Optical Engineering

Gallegos, Isabel; Dalton, Gabriella; Stohn, Adriana M.; Koundinyan, Srivathsan; Thompson, Kyle; Jimenez, Edward S.

Sandia National Laboratories has developed a model characterizing the nonlinear encoding operator of the world's first hyperspectral x-ray computed tomography (H-CT) system as a sequence of discrete-to-discrete, linear image system matrices across unique and narrow energy windows. In fields such as national security, industry, and medicine, H-CT has various applications in the non-destructive analysis of objects such as material identification, anomaly detection, and quality assurance. However, many approaches to computed tomography (CT) make gross assumptions about the image formation process to apply post-processing and reconstruction techniques that lead to inferior data, resulting in faulty measurements, assessments, and quantifications. To abate this challenge, Sandia National Laboratories has modeled the H-CT system through a set of point response functions, which can be used for calibration and analysis of the real-world system. This work presents the numerical method used to produce the model through the collection of data needed to describe the system; the parameterization used to compress the model; and the decompression of the model for computation. By using this linear model, large amounts of accurate synthetic H-CT data can be efficiently produced, greatly reducing the costs associated with physical H-CT scans. Furthermore, successfully approximating the encoding operator for the H-CT system enables quick assessment of H-CT behavior for various applications in high-performance reconstruction, sensitivity analysis, and machine learning.
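
A minimal sketch of the modeling idea described above, assuming each narrow energy window w is represented by its own discrete-to-discrete system matrix A_w so that synthetic data is y_w = A_w x. The shapes and random matrices below are placeholders, not the calibrated model from the paper.

```python
# Illustrative only: per-energy-window linear system matrices for synthetic
# hyperspectral data generation.
import numpy as np

n_windows, n_meas, n_voxels = 8, 1200, 4096
A = [np.random.rand(n_meas, n_voxels) * 1e-3 for _ in range(n_windows)]  # stand-in system matrices
x = np.random.rand(n_voxels)                                             # stand-in object

synthetic = np.stack([A_w @ x for A_w in A])   # (n_windows, n_meas) hyperspectral sinogram
print(synthetic.shape)
```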

More Details

Optimization of hardware and image processing for improved image quality in X-ray phase contrast imaging

Proceedings of SPIE - The International Society for Optical Engineering

Dagel, Amber; West, Roger D.; Goodner, Ryan N.; Grover, Steven M.; Epstein, Collin; Thompson, Kyle

High-quality image products in an X-Ray Phase Contrast Imaging (XPCI) system can be produced with proper system hardware and data acquisition. However, it may be possible to further increase the quality of the image products by addressing subtleties and imperfections in both hardware and the data acquisition process. Noting that addressing these issues entirely in hardware and data acquisition may not be practical, a more prudent approach is to determine the balance of how the apparatus may reasonably be improved and what can be accomplished with image post-processing techniques. Given a proper signal model for XPCI data, image processing techniques can be developed to compensate for many of the image quality degradations associated with higher-order hardware and data acquisition imperfections. However, processing techniques also have limitations and cannot entirely compensate for sub-par hardware or inaccurate data acquisition practices. Understanding system and image processing technique limitations enables balancing between hardware, data acquisition, and image post-processing. In this paper, we present some of the higher-order image degradation effects we have found associated with subtle imperfections in both hardware and data acquisition. We also discuss and demonstrate how a combination of hardware, data acquisition processes, and image processing techniques can increase the quality of XPCI image products. Finally, we assess the requirements for high-quality XPCI images and propose reasonable system hardware modifications and the limits of certain image processing techniques.
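
For context on the signal model mentioned above, the sketch below applies the standard Talbot-Lau phase-stepping retrieval (fit each pixel's stepping curve I_k ≈ a + b cos(2πk/N + φ) via its first Fourier harmonic) to obtain transmission, differential-phase, and dark-field images. This is textbook processing, not necessarily the specific chain used in the paper.

```python
# Standard phase-stepping retrieval for Talbot-Lau XPCI (illustrative sketch).
import numpy as np

def retrieve(steps_sample, steps_reference):
    """steps_*: arrays of shape (N_steps, H, W) of phase-stepping images."""
    def fit(stack):
        F = np.fft.fft(stack, axis=0)
        N = stack.shape[0]
        a = np.abs(F[0]) / N            # mean (offset)
        b = 2 * np.abs(F[1]) / N        # first-harmonic amplitude
        phi = np.angle(F[1])            # first-harmonic phase
        return a, b, phi

    a_s, b_s, p_s = fit(steps_sample)
    a_r, b_r, p_r = fit(steps_reference)
    transmission = a_s / a_r
    diff_phase = np.angle(np.exp(1j * (p_s - p_r)))   # wrap to [-pi, pi]
    dark_field = (b_s / a_s) / (b_r / a_r)            # visibility reduction
    return transmission, diff_phase, dark_field
```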

More Details

Unsupervised learning methods to perform material identification tasks on spectral computed tomography data

Proceedings of SPIE - The International Society for Optical Engineering

Gallegos, Isabel; Koundinyan, Srivathsan; Suknot, April; Jimenez, Edward S.; Thompson, Kyle; Goodner, Ryan N.

Sandia National Laboratories has developed a method that applies machine learning methods to high-energy spectral X-ray computed tomography data to identify material composition for every reconstructed voxel in the field-of-view. While initial experiments led by Koundinyan et al. demonstrated that supervised machine learning techniques perform well in identifying a variety of classes of materials, this work presents an unsupervised approach that differentiates isolated materials with highly similar properties and can be applied to spectral computed tomography data to identify materials more accurately than traditional approaches. Additionally, if regions of the spectrum for multiple voxels become unusable due to artifacts, this method can still reliably perform material identification. This enhanced capability can tremendously impact fields in security, industry, and medicine that leverage non-destructive evaluation for detection, verification, and validation applications.
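
As one illustrative unsupervised route (not necessarily the method used in this work), the sketch below clusters per-voxel spectra with k-means; the cluster count and normalization are assumptions.

```python
# Illustrative sketch: group voxels by the shape of their energy spectra.
import numpy as np
from sklearn.cluster import KMeans

def cluster_voxels(spectral_volume, n_materials=4):
    """spectral_volume: array (Z, Y, X, n_energy_bins) of reconstructed voxel spectra."""
    spectra = spectral_volume.reshape(-1, spectral_volume.shape[-1])
    spectra = spectra / (np.linalg.norm(spectra, axis=1, keepdims=True) + 1e-12)  # shape, not magnitude
    labels = KMeans(n_clusters=n_materials, n_init=10).fit_predict(spectra)
    return labels.reshape(spectral_volume.shape[:-1])
```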

More Details

A kinetic approach to modeling the manufacture of high density structural foam: Foaming and polymerization

Rao, Rekha R.; Mondy, Lisa A.; Noble, David R.; Brunini, Victor; Roberts, Christine; Long, Kevin N.; Soehnel, Melissa; Celina, Mathew C.; Wyatt, Nicholas B.; Thompson, Kyle

We are studying PMDI polyurethane with a fast catalyst, such that filling and polymerization occur simultaneously. The foam is over-packed to twice or more of its free-rise density to reach the density of interest. Our approach is to combine model development closely with experiments to discover new physics, to parameterize models, and to validate the models once they have been developed. The model must be able to represent the expansion, filling, curing, and final foam properties. PMDI is a chemically blown foam, where carbon dioxide is produced via the reaction of water and isocyanate. The isocyanate also reacts with polyol in a competing reaction, which produces the polymer. A new kinetic model is developed and implemented, which follows a simplified mathematical formalism that decouples these two reactions. The model predicts the polymerization reaction via condensation chemistry, where vitrification and glass transition temperature evolution must be included to correctly predict this quantity. The foam gas generation kinetics are determined by tracking the molar concentration of both water and carbon dioxide. Understanding the thermal history and loads on the foam due to exothermicity and oven heating is very important to the results, since the kinetics and material properties are all very sensitive to temperature. The conservation equations, including the equations of motion, an energy balance, and three rate equations, are solved via a stabilized finite element method. We assume generalized-Newtonian rheology that is dependent on the cure, gas fraction, and temperature. The conservation equations are combined with a level set method to determine the location of the free surface over time. Results from the model are compared to experimental flow visualization data and post-test CT data for the density. Several geometries are investigated, including a mock encapsulation part, two configurations of a mock structural part, and a bar geometry to specifically test the density model. We have found that the model predicts both average density and filling profiles well. However, it under-predicts density gradients, especially in the gravity direction. Thoughts on model improvements are also discussed.
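
A toy version of the decoupled two-reaction kinetics described above is sketched below: one rate equation for cure extent and one for water consumption (CO2 generation). The rate forms and constants are placeholders for illustration, not the report's temperature-dependent model.

```python
# Toy decoupled kinetics: polymerization extent and blowing (CO2) reaction.
import numpy as np
from scipy.integrate import solve_ivp

k_poly, k_blow = 0.05, 0.08   # assumed rate constants [1/s]

def rates(t, y):
    alpha, c_water = y                       # cure extent, normalized water concentration
    d_alpha = k_poly * (1.0 - alpha)         # simple first-order cure
    d_water = -k_blow * c_water              # water consumed -> CO2 generated
    return [d_alpha, d_water]

sol = solve_ivp(rates, (0.0, 120.0), [0.0, 1.0], dense_output=True)
co2_generated = 1.0 - sol.y[1]               # assumes 1:1 mole ratio of water to CO2
print(f"cure extent at 120 s: {sol.y[0, -1]:.2f}, CO2 released: {co2_generated[-1]:.2f}")
```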

More Details

Experiments to populate and validate a processing model for polyurethane foam. BKC 44306 PMDI-10

Mondy, Lisa A.; Bauer, Stephen J.; Hileman, Michael B.; Thompson, Kyle; Smith, David; Rao, Rekha R.; Shelden, Bion; Soehnel, Melissa; O'Hern, Timothy J.; Grillet, Anne M.; Celina, Mathew C.; Wyatt, Nicholas B.; Russick, Edward M.

We are developing computational models to elucidate the expansion and dynamic filling process of a polyurethane foam, PMDI. The polyurethane of interest is chemically blown, where carbon dioxide is produced via the reaction of water, the blowing agent, and isocyanate. The isocyanate also reacts with polyol in a competing reaction, which produces the polymer. Here we detail the experiments needed to populate a processing model and provide parameters for the model based on these experiments. The model entails solving the conservation equations, including the equations of motion, an energy balance, and two rate equations for the polymerization and foaming reactions, following a simplified mathematical formalism that decouples these two reactions. Parameters for the polymerization kinetics model are reported based on infrared spectrophotometry. Parameters describing the gas generating reaction are reported based on measurements of volume, temperature and pressure evolution with time. A foam rheology model is proposed and parameters determined through steady-shear and oscillatory tests. Heat of reaction and heat capacity are determined through differential scanning calorimetry. Thermal conductivity of the foam as a function of density is measured using a transient method based on the theory of the transient plane source technique. Finally, density variations of the resulting solid foam in several simple geometries are directly measured by sectioning and sampling mass, as well as through x-ray computed tomography. These density measurements will be useful for model validation once the complete model is implemented in an engineering code.

More Details

Exploring mediated reality to approximate X-ray attenuation coefficients from radiographs

Proceedings of SPIE - The International Society for Optical Engineering

Jimenez, Edward S.; Orr, Laurel J.; Morgan, Megan L.; Thompson, Kyle

Estimation of the x-ray attenuation properties of an object with respect to the energy emitted from the source is a challenging task for traditional Bremsstrahlung sources. This exploratory work attempts to estimate the x-ray attenuation profile for the energy range of a given Bremsstrahlung profile. Previous work has shown that calculating a single effective attenuation value for a polychromatic source is not accurate due to the non-linearities associated with the image formation process. Instead, we completely characterize the imaging system virtually and utilize an iterative search method/constrained optimization technique to approximate the attenuation profile of the object of interest. This work presents preliminary results from various approaches that were investigated. The early results illustrate the challenges associated with these techniques and the potential for obtaining an accurate estimate of the attenuation profile for objects composed of homogeneous materials.
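
The general estimation problem can be sketched as a constrained fit of an attenuation profile mu(E) under a polychromatic Beer-Lambert forward model. The spectrum, energy grid, thickness, and measurement below are all fabricated for illustration, and the single-measurement fit is deliberately ill-posed, echoing the challenges noted above.

```python
# Illustrative constrained fit of an attenuation profile (not the paper's algorithm).
import numpy as np
from scipy.optimize import minimize

energies = np.linspace(20, 160, 15)                  # keV grid (assumption)
S = np.exp(-0.5 * ((energies - 60) / 40) ** 2)       # stand-in Bremsstrahlung-like weights
S /= S.sum()
t = 1.0                                              # path length in cm (assumption)

def predicted_transmission(mu):
    return np.sum(S * np.exp(-mu * t))               # energy-weighted Beer-Lambert

measured = 0.35                                      # fabricated measurement for the demo

def objective(mu):
    return (predicted_transmission(mu) - measured) ** 2

mu0 = np.full_like(energies, 0.5)
res = minimize(objective, mu0, bounds=[(0, None)] * len(energies))
print(res.x)  # one of many profiles consistent with a single measurement (ill-posed)
```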

More Details

Irregular large-scale computed tomography on multiple graphics processors improves energy-efficiency metrics for industrial applications

Proceedings of SPIE - The International Society for Optical Engineering

Jimenez, Edward S.; Goodman, Eric; Park, Ryeojin; Orr, Laurel J.; Thompson, Kyle

This paper will investigate energy efficiency for various real-world industrial computed tomography reconstruction algorithms, in both CPU- and GPU-based implementations. This work shows that the energy required for a given reconstruction depends on performance and problem size. There are many ways to describe performance and energy efficiency, so this work will investigate multiple metrics including performance-per-watt, energy-delay product, and energy consumption. This work found that irregular GPU-based approaches [1] realized tremendous savings in energy consumption when compared to CPU implementations while also significantly improving the performance-per-watt and energy-delay product metrics. Additional energy savings and metric improvements were realized in the GPU-based reconstructions by improving storage I/O through a parallel MIMD-like modularization of the compute and I/O tasks.
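
For reference, the three metrics discussed above follow standard definitions (throughput per watt, energy times delay, and power times runtime); the example numbers below are made up.

```python
# Standard energy-efficiency metrics with hypothetical example numbers.
def energy_joules(avg_power_watts, runtime_s):
    return avg_power_watts * runtime_s

def performance_per_watt(voxels_processed, avg_power_watts, runtime_s):
    return (voxels_processed / runtime_s) / avg_power_watts      # voxels/s per watt

def energy_delay_product(avg_power_watts, runtime_s):
    return energy_joules(avg_power_watts, runtime_s) * runtime_s  # joule-seconds

# Hypothetical comparison: a GPU run vs. a CPU run of the same reconstruction.
print(performance_per_watt(512**3, 300.0, 120.0))   # GPU-like
print(performance_per_watt(512**3, 150.0, 2400.0))  # CPU-like
```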

More Details

High performance graphics processor based computed tomography reconstruction algorithms for nuclear and other large scale applications

Jimenez, Edward S.; Orr, Laurel J.; Thompson, Kyle

The goal of this work is to develop a fast computed tomography (CT) reconstruction algorithm based on graphics processing units (GPUs) that achieves significant improvement over traditional central processing unit (CPU) based implementations. The main challenge in developing a CT algorithm that is capable of handling very large datasets is parallelizing the algorithm in such a way that data transfer does not hinder performance of the reconstruction algorithm. General-purpose computing on graphics processing units (GPGPU) is a new technology that the Science and Technology (S&T) community is starting to adopt in many fields where CPU-based computing is the norm. GPGPU programming requires a new approach to algorithm development that utilizes massively multi-threaded environments. Multi-threaded algorithms in general are difficult to optimize because performance bottlenecks, such as memory latencies, arise that are non-existent in single-threaded algorithms. If an efficient GPU-based CT reconstruction algorithm can be developed, computational times could be improved by a factor of 20. Additionally, cost benefits will be realized as commodity graphics hardware could potentially replace expensive supercomputers and high-end workstations. This project will take advantage of the CUDA programming environment and attempt to parallelize the task in such a way that multiple slices of the reconstruction volume are computed simultaneously. This work will also take advantage of GPU memory by utilizing asynchronous memory transfers, GPU texture memory, and (when possible) pinned host memory so that the memory transfer bottleneck inherent to GPGPU is amortized. Additionally, this work will take advantage of GPU-specific hardware (i.e., fast texture memory, pixel pipelines, hardware interpolators, and a varying memory hierarchy) that will allow for additional performance improvements.
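
The slice-parallel strategy described above works because each reconstructed slice is independent of the others. The NumPy sketch below shows a simplified voxel-driven parallel-beam backprojection of a single slice; the actual work targets CUDA kernels with texture memory and asynchronous transfers, and the geometry here is an assumption.

```python
# Conceptual single-slice voxel-driven backprojection (parallel-beam, unfiltered).
import numpy as np

def backproject_slice(sinogram, angles_rad, n_pixels):
    """sinogram: (n_angles, n_detectors); returns an (n_pixels, n_pixels) slice."""
    n_det = sinogram.shape[1]
    xs = np.arange(n_pixels) - n_pixels / 2.0
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_pixels, n_pixels))
    for proj, theta in zip(sinogram, angles_rad):
        s = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0   # detector coordinate
        s0 = np.floor(s).astype(int)
        w = s - s0
        inside = (s0 >= 0) & (s0 < n_det - 1)
        s0c = np.clip(s0, 0, n_det - 2)
        recon += inside * ((1 - w) * proj[s0c] + w * proj[s0c + 1])  # linear interpolation
    return recon * np.pi / len(angles_rad)
```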

More Details

Characterization of X-Ray Generator Beam Profiles

Mitchell, Dean J.; Thompson, Kyle; Harding, Lee; Thoreson, Gregory; Theisen, Lisa A.; Parmeter, John

To compute the radiography properties of various materials, the flux profiles of X-ray sources must be characterized. This report describes the characterization of X-ray beam profiles from a Kimtron industrial 450 kVp radiography system with a Comet MXC-45 HP/11 bipolar oil-cooled X-ray tube. The empirical method described here uses a detector response function to derive photon flux profiles based on data collected with a small cadmium telluride detector. The flux profiles are then reduced to a simple parametric form that enables computation of beam profiles for arbitrary accelerator energies.
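
The unfolding step can be sketched as a non-negative least-squares inversion of a detector response matrix; the response matrix and measurement below are random stand-ins, not the report's CdTe data.

```python
# Illustrative response unfolding: recover a non-negative flux profile from a
# measured pulse-height spectrum given a detector response matrix.
import numpy as np
from scipy.optimize import nnls

n_channels, n_energy_bins = 256, 64
R = np.abs(np.random.rand(n_channels, n_energy_bins))          # stand-in response matrix
true_flux = np.abs(np.random.rand(n_energy_bins))
measured = R @ true_flux + 0.01 * np.random.rand(n_channels)   # noisy stand-in measurement

flux_estimate, residual = nnls(R, measured)
print(residual)
```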

More Details

A high-performance and energy-efficient CT reconstruction algorithm for multi-terabyte datasets

IEEE Nuclear Science Symposium Conference Record

Orr, Laurel J.; Thompson, Kyle

There has been much work done in implementing various GPU-based computed tomography reconstruction algorithms for medical applications, showing tremendous improvement in computational performance. While many of these reconstruction algorithms could also be applied to industrial-scale datasets, the performance gains may be modest to non-existent due to a combination of algorithmic, hardware, or scalability limitations. Previously presented work showed an irregular, dynamic approach to GPU reconstruction kernel execution for industrial-scale reconstructions that dramatically improved voxel processing throughput. However, the improved kernel execution magnified other system bottlenecks, such as host memory bandwidth and storage read/write bandwidth, thus hindering performance gains. This paper presents a multi-GPU-based reconstruction algorithm capable of efficiently reconstructing large volumes (between 64 gigavoxels and 1 teravoxel) not only faster than traditional CPU- and GPU-based reconstruction algorithms but also while consuming significantly less energy. The reconstruction algorithm exploits the irregular kernel approach from previous work, a modularized MIMD-like environment, heterogeneous parallelism, and macro- and micro-scale dynamic task allocation. The result is a portable and flexible reconstruction algorithm capable of executing on a wide range of architectures including mobile computers, workstations, supercomputers, and modestly sized heterogeneous or homogeneous clusters with any number of graphics processors.
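
A toy illustration of decoupling storage I/O from compute with a producer/consumer queue, loosely in the spirit of the modularized MIMD-like environment described above; chunk contents and the "reconstruction" step are placeholders.

```python
# Producer/consumer sketch separating data ingest from reconstruction work.
import queue, threading

def reader(chunks, q):
    for c in chunks:
        q.put(f"data-for-{c}")     # stand-in for reading projections from storage
    q.put(None)                    # sentinel: no more work

def worker(q, results):
    while (item := q.get()) is not None:
        results.append(item.upper())   # stand-in for reconstructing a sub-volume

q, results = queue.Queue(maxsize=4), []
t_r = threading.Thread(target=reader, args=(range(8), q))
t_w = threading.Thread(target=worker, args=(q, results))
t_r.start(); t_w.start(); t_r.join(); t_w.join()
print(results)
```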

More Details

Exploring diagnostic capabilities for application to new photovoltaic technologies

Quintana, Enrico C.; Quintana, Michael A.; Rolfe, Kevin D.; Thompson, Kyle

Explosive growth in photovoltaic markets has fueled new creative approaches that promise to cut costs and improve reliability of system components. However, market demands require rapid development of these new and innovative technologies in order to compete with more established products and capture market share. Oftentimes, diagnostics that assist in R&D do not exist or have not been applied due to the innovative nature of the proposed products. Some diagnostics such as IR imaging, electroluminescence, light IV, dark IV, x-rays, and ultrasound have been employed in the past and continue to serve in the development of new products; however, innovative products with new materials, unique geometries, and previously unused manufacturing processes require additional or improved test capabilities. This fast-track product development cycle requires diagnostic capabilities to provide the information that confirms the integrity of manufacturing techniques and provides the feedback that can spawn confidence in process control, reliability, and performance. This paper explores the use of digital radiography and computed tomography (CT) with other diagnostics to support photovoltaic R&D and manufacturing applications.

More Details

Experiments for foam model development and validation

Mondy, Lisa A.; Gorby, Allen D.; Cote, Raymond O.; Castañeda, Jaime N.; Thompson, Kyle; Rao, Rekha R.; Moffat, Harry K.; Kraynik, Andrew M.; Russick, Edward M.; Adolf, Douglas B.; Grillet, Anne M.; Brotherton, Christopher M.; Bourdon, Christopher

A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

More Details