Legacy and modern-day ablation codes typically assume equilibrium pyrolysis gas chemistry. Yet, experimental data suggest that speciation from resin decomposition is far from equilibrium. A thermal and chemical kinetic study was performed on pyrolysis gas advection through a porous char, using the Theoretical Ablative Composite for Open Testing (TACOT) as a demonstrator material. The finite-element tool SIERRA/Aria simulated the ablation of TACOT under various conditions. Temperature and phenolic decomposition rates generated by Aria were applied as inputs to a simulated network of perfectly stirred reactors (PSRs) in the chemical solver Cantera. A high-fidelity combustion mechanism computed the gas composition and thermal properties of the advecting pyrolyzate. The results indicate that pyrolysis gases do not rapidly achieve chemical equilibrium while traveling through the simulated material. Instead, a highly chemically reactive zone exists in the ablator between 1400 and 2500 K, wherein the modeled pyrolysis gases transition from a chemically frozen state to chemical equilibrium. These finite-rate results demonstrate a significant departure in computed pyrolysis gas properties from those derived from equilibrium solvers. Under the same conditions, finite-rate-derived gas is estimated to provide up to 50% less heat absorption than equilibrium-derived gas. This discrepancy suggests that nonequilibrium pyrolysis gas chemistry could substantially impact ablator material response models.
The SIERRA Low Mach Module: Fuego, henceforth referred to as Fuego, is the key element of the ASC fire environment simulation project. The fire environment simulation project is directed at characterizing both open large-scale pool fires and building enclosure fires. Fuego represents the turbulent, buoyantly-driven incompressible flow, heat transfer, mass transfer, combustion, soot, and absorption coefficient model portion of the simulation software. Sierra/PMR handles the participating-media thermal radiation mechanics. This project is an integral part of the SIERRA multi-mechanics software development project. Fuego depends heavily upon the core architecture developments provided by SIERRA for massively parallel computing, solution adaptivity, and mechanics coupling on unstructured grids.
Presented in this document is a portion of the tests that exist in the Sierra Thermal/Fluids verification test suite. Each of these tests is run nightly with the Sierra/TF code suite, and the results of each test are checked under mesh refinement against the correct analytic result. For each of the tests presented in this document, the test setup, the derivation of the analytic solution, and a comparison of the code results to the analytic solution are provided. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems.
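The verification pattern described above, comparing a computed solution to an analytic one under mesh refinement, can be sketched as follows. The "solver" here is a toy second-order finite-difference Poisson solve standing in for Sierra/TF; the observed order of accuracy is estimated from errors at successive refinements.

```python
# Minimal sketch of mesh-refinement verification against an analytic solution.
# The toy problem is -u'' = pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0,
# whose analytic solution is u(x) = sin(pi x).
import numpy as np

def solve_poisson_1d(n):
    """Solve the toy problem with central differences on n interior points."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    # Tridiagonal system from the standard three-point stencil.
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, np.pi**2 * np.sin(np.pi * x))
    return x, u

# Max-norm error against the analytic solution at three refinement levels.
errors = []
for n in (16, 32, 64):
    x, u = solve_poisson_1d(n)
    errors.append(np.max(np.abs(u - np.sin(np.pi * x))))

# Observed convergence order between successive refinements (expect ~2).
orders = [np.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
```

The same structure (solve, compare to the analytic result, confirm the expected convergence rate) is what each nightly test in the suite automates.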
The SIERRA Low Mach Module: Fuego, henceforth referred to as Fuego, is the key element of the ASC fire environment simulation project. The fire environment simulation project is directed at characterizing both open large-scale pool fires and building enclosure fires. Fuego represents the turbulent, buoyantly-driven incompressible flow, heat transfer, mass transfer, combustion, soot, and absorption coefficient model portion of the simulation software. Using MPMD coupling, Scefire and Nalu handle the participating-media thermal radiation mechanics. This project is an integral part of the SIERRA multi-mechanics software development project. Fuego depends heavily upon the core architecture developments provided by SIERRA for massively parallel computing, solution adaptivity, and mechanics coupling on unstructured grids.
The SNL Sierra Mechanics code suite is designed to enable simulation of complex multiphysics scenarios. The code suite is composed of several specialized applications, which can operate either in standalone mode or coupled with each other. Arpeggio is a supported utility that enables loose coupling of the various Sierra Mechanics applications by providing access to Framework services that facilitate the coupling. More importantly, Arpeggio orchestrates the execution of the applications that participate in the coupling. This document describes the various components of Arpeggio and their operability. The intent of the document is to provide a fast path for analysts interested in coupled applications via simple examples of its usage.
Image-based simulation, the use of 3D images to calculate physical quantities, relies on image segmentation for geometry creation. However, this process introduces image segmentation uncertainty because different segmentation tools (both manual and machine-learning-based) will each produce a unique and valid segmentation. First, we demonstrate that these variations propagate into the physics simulations, compromising the resulting physics quantities. Second, we propose a general framework for rapidly quantifying segmentation uncertainty. Through the creation and sampling of segmentation uncertainty probability maps, we systematically and objectively create uncertainty distributions of the physics quantities. We show that physics quantity uncertainty distributions can follow a Normal distribution, but, in more complicated physics simulations, the resulting uncertainty distribution can be surprisingly nontrivial. We establish that bounding segmentation uncertainty can fail in these nontrivial situations. While our work does not eliminate segmentation uncertainty, it improves simulation credibility by making visible the previously unrecognized segmentation uncertainty plaguing image-based simulation.
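The sampling framework described above can be sketched in a few lines: draw binary segmentations from a per-voxel probability map, push each sample through a simple "physics" quantity, and build an uncertainty distribution. This is a simplified illustration: the probability map is synthetic, and independent per-voxel Bernoulli sampling ignores the spatial correlation a real segmentation-uncertainty map would carry.

```python
# Minimal sketch: sample segmentations from a probability map and form an
# uncertainty distribution of a derived quantity (solid volume fraction).
# The probability map is a synthetic blurred sphere, not real image data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 3D probability map: a sphere of "solid" with a soft boundary
# in a 32^3 image, mimicking segmentation ambiguity at the interface.
grid = np.stack(np.meshgrid(*[np.linspace(-1, 1, 32)] * 3, indexing='ij'))
dist = np.sqrt((grid**2).sum(axis=0))
prob_map = 1.0 / (1.0 + np.exp((dist - 0.6) / 0.05))

# Sample binary segmentations and compute the physics quantity for each.
samples = []
for _ in range(200):
    seg = rng.random(prob_map.shape) < prob_map   # Bernoulli draw per voxel
    samples.append(seg.mean())                    # solid volume fraction

mean_vf = float(np.mean(samples))
std_vf = float(np.std(samples))
```

The resulting distribution of `samples` is the uncertainty distribution of the physics quantity; for more complicated simulations the abstract notes it can be distinctly non-normal, which is why sampling rather than bounding is used.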
Deep learning segmentation models are known to be sensitive to the scale, contrast, and distribution of pixel values when applied to Computed Tomography (CT) images. For material samples, scans are often obtained from a variety of scanning equipment and resolutions resulting in domain shift. The ability of segmentation models to generalize to examples from these shifted domains relies on how well the distribution of the training data represents the overall distribution of the target data. We present a method to overcome the challenges presented by domain shifts. Our results indicate that we can leverage a deep learning model trained on one domain to accurately segment similar materials at different resolutions by refining binary predictions using uncertainty quantification (UQ). We apply this technique to a set of unlabeled CT scans of woven composite materials with clear qualitative improvement of binary segmentations over the original deep learning predictions. In contrast to prior work, our technique enables refined segmentations without the expense of the additional training time and parameters associated with deep learning models used to address domain shift.
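The refinement idea described above, correcting low-confidence binary predictions without retraining, can be sketched as follows. The probability field is synthetic, and the confident-neighbor vote is an assumed stand-in for the paper's UQ-based refinement rather than its actual method.

```python
# Minimal sketch: flag low-confidence pixels in a model's probability output
# and relabel them from nearby confident pixels, instead of retraining the
# model for the shifted domain. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "foreground probability" from a model on a shifted domain:
# left half is solid, but predictions near the interface are noisy.
prob = np.where(np.arange(64)[None, :] < 32, 0.9, 0.1) * np.ones((64, 64))
prob[:, 28:36] = rng.uniform(0.3, 0.7, size=(64, 8))   # uncertain band

binary = prob > 0.5
uncertain = np.abs(prob - 0.5) < 0.25   # low-confidence (high-uncertainty) mask

# Relabel each uncertain pixel by a vote over confident pixels in a 5x5 window.
refined = binary.copy()
for i, j in zip(*np.nonzero(uncertain)):
    window = binary[max(i - 2, 0):i + 3, max(j - 2, 0):j + 3]
    confident = ~uncertain[max(i - 2, 0):i + 3, max(j - 2, 0):j + 3]
    if confident.any():
        refined[i, j] = window[confident].mean() > 0.5
```

Pixels at the edge of the noisy band inherit the label of their confident neighbors, while the thresholded prediction `binary` remains speckled there; this is the kind of qualitative cleanup the abstract reports, obtained without additional training time or parameters.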