Publications


NSRD-16: Computational Capability to Substantiate DOE-HDBK-3010 Data

Laros, James H.; Bignell, John B.; Le, San L.; Dingreville, Remi P.; Gilkey, Lindsay N.; Gordon, Natalie G.; Fascitelli, Dominic G.

Safety basis analysts throughout the U.S. Department of Energy (DOE) complex rely heavily on the information provided in the DOE Handbook, DOE-HDBK-3010, Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear Facilities, to determine radionuclide source terms from postulated accident scenarios. In calculating source terms, analysts tend to use the DOE Handbook's bounding values on airborne release fractions (ARFs) and respirable fractions (RFs) for various categories of insults (representing potential accident release categories). This is typically due to both time constraints and the avoidance of regulatory critique. Unfortunately, these bounding ARFs/RFs are extremely conservative. Moreover, they were derived from very limited small-scale bench/laboratory experiments and/or from engineering judgment. Thus, the basis for the data may not be representative of the actual, unique accident conditions and configurations being evaluated. The goal of this research is to develop a more accurate and defensible method for determining bounding values for the DOE Handbook using state-of-the-art, multi-physics-based computer codes. This enables us to better understand the fundamental physics and phenomena associated with the types of accidents in the Handbook. In this fourth year, we improved existing computational capabilities to better model fragmentation and capture small fragments during an impact accident. In addition, we provided new information for various sections of Chapters 4 and 5 of the Handbook on free-fall powders and impacts of solids, and provided damage ratio simulations for containers (7A drum and standard waste box) for various drop and impact scenarios. Thus, this work provides a low-cost method to establish physics-justified safety bounds by considering specific geometries and conditions that may not have been previously measured and/or are too costly to reproduce experimentally.
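
As context for how ARF, RF, and damage ratio values enter a safety-basis calculation, the sketch below applies the standard five-factor source-term formula, ST = MAR x DR x ARF x RF x LPF. The numerical values are illustrative placeholders only, not Handbook bounding values or results from this work.

```python
# Minimal sketch of the five-factor source-term formula used in DOE
# safety-basis analysis: ST = MAR * DR * ARF * RF * LPF.
def respirable_source_term(mar_g, dr, arf, rf, lpf=1.0):
    """Return the respirable source term (grams) for one release pathway."""
    return mar_g * dr * arf * rf * lpf

# Illustrative placeholder values only; NOT Handbook bounding values
# and NOT results from the simulations described above.
mar_g = 1000.0   # material at risk, grams
dr = 0.25        # damage ratio (e.g., informed by a container drop simulation)
arf = 1.0e-3     # airborne release fraction
rf = 0.1         # respirable fraction

print(f"Respirable source term: {respirable_source_term(mar_g, dr, arf, rf):.3g} g")
```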

Calibration of MAMBA and CRUD Sources Using Full-Core CTF+MAMBA

Gilkey, Lindsay N.; Hetzler, Adam; Collins, Benjamin; Salko, Robert

This report outlines a process for the deterministic calibration of MAMBA using the computational toolkit Dakota. The tools and processes for deterministic calibration have been built and are laid out in this report. While completing this milestone, issues emerged with MAMBA that resulted in delays; the consequences of these difficulties for the calibration process are briefly discussed. The report concludes with an outline of a path forward for Bayesian calibration, which will be performed next year. This process was laid out by Benjamin Collins, Robert Salko, and Adam Hetzler.
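
The specific MAMBA parameters and the Dakota driver are not described in this listing; the sketch below only illustrates the deterministic-calibration idea (adjust model parameters to minimize the misfit against reference data) using a made-up stand-in for a CRUD-growth response and synthetic reference data.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical stand-in for a MAMBA-style response: predicted CRUD thickness
# as a simple function of two calibration parameters (k_growth, k_erosion).
def crud_model(params, heat_flux):
    k_growth, k_erosion = params
    return k_growth * heat_flux / (1.0 + k_erosion * heat_flux)

# Synthetic reference data standing in for full-core CTF+MAMBA output.
heat_flux = np.linspace(0.2, 1.0, 20)
reference = crud_model([3.0, 0.5], heat_flux)
reference += 0.01 * np.random.default_rng(0).normal(size=heat_flux.size)

# Deterministic calibration: minimize the residuals between model and reference.
def residuals(params):
    return crud_model(params, heat_flux) - reference

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=([0.0, 0.0], [10.0, 10.0]))
print("Calibrated parameters:", fit.x)
```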

Surface Temperature Mapping Models for STAR and CTF

Gilkey, Lindsay N.

This milestone presents a demonstration of a surface mapping model that maps the surface temperature from single-phase STAR-CCM+ to COBRA-TF using average temperature data. This model can be used to generate high-resolution surface temperature data, either with linear equations or with an alternative non-linear model. Improvements and a path forward for applying the surface mapping model to two-phase temperature mappings are also laid out in this milestone report.
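
The actual mapping model is not given in this listing. Purely as a minimal sketch of the linear-equation option, the example below fits a least-squares linear map from a few coarse average temperatures to a higher-resolution surface temperature profile, using synthetic stand-in data rather than STAR-CCM+ or COBRA-TF output.

```python
import numpy as np

# Synthetic training pairs standing in for coarse (CTF-like) average
# temperatures and fine (STAR-CCM+-like) surface temperature profiles.
rng = np.random.default_rng(1)
n_coarse, n_fine, n_cases = 4, 32, 50

T_avg = 560.0 + 20.0 * rng.random((n_cases, n_coarse))          # K
true_map = rng.random((n_fine, n_coarse))
true_map /= true_map.sum(axis=1, keepdims=True)                 # rows sum to 1
T_fine = T_avg @ true_map.T + 2.0 * rng.normal(size=(n_cases, n_fine))

# Least-squares fit of a linear map (bias absorbed via a constant column).
X = np.hstack([T_avg, np.ones((n_cases, 1))])
coeffs, *_ = np.linalg.lstsq(X, T_fine, rcond=None)

# Apply the fitted map to a new coarse profile to get a high-resolution profile.
new_avg = np.array([565.0, 570.0, 575.0, 580.0])
T_surface = np.append(new_avg, 1.0) @ coeffs
print(T_surface[:5])
```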

Blind prediction of the response of an additively manufactured tensile test coupon loaded to failure

American Society of Mechanical Engineers, Pressure Vessels and Piping Division (Publication) PVP

Gilkey, Lindsay N.; Bignell, John B.; Dingreville, Remi; Sanborn, Scott E.; Jones, Chris A.

Sandia National Laboratories (SNL) conducted its third fracture challenge (the Third Sandia Fracture Challenge, or SFC3) in the summer of 2017. The challenge, which was open to the public, asked participants to predict, without foreknowledge of the outcome, the fracture response of an additively manufactured tensile test coupon of moderate geometric complexity when loaded to failure. This paper outlines the approach taken by our team, one of the SNL teams that participated in the challenge, to make a prediction. To do so, we employed a traditional finite element approach coupled with a continuum damage mechanics constitutive model. Constitutive model parameters were determined by calibrating the model response to the provided longitudinal and transverse tensile test coupon data. Comparisons of model predictions with the challenge coupon test results are presented, and general observations gleaned from the exercise are provided.
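
The specific damage model and calibrated parameters used by the team are not given in this listing. As a generic illustration of the continuum damage mechanics idea (stress degraded by a scalar damage variable D), the sketch below evaluates a one-dimensional linear-softening response with made-up material parameters.

```python
# One-dimensional scalar-damage illustration: sigma = (1 - D) * E * eps,
# with linear softening between a damage-initiation strain and a failure strain.
# Parameters are made up for illustration, not SFC3 calibration values.
E = 200e9        # Young's modulus, Pa
eps0 = 0.002     # strain at damage initiation
eps_f = 0.05     # strain at complete failure (D = 1)

def stress(eps):
    if eps <= eps0:                                        # undamaged, linear elastic
        return E * eps
    if eps >= eps_f:                                       # fully failed
        return 0.0
    return E * eps0 * (eps_f - eps) / (eps_f - eps0)       # linear softening branch

for eps in (0.001, 0.002, 0.01, 0.03, 0.05):
    sig = stress(eps)
    damage = 1.0 - sig / (E * eps)
    print(f"eps = {eps:.3f}  sigma = {sig / 1e6:7.1f} MPa  D = {damage:.3f}")
```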

STAR-CCM+ (CFD) Calculations and Validation L3:VVI.H2L.P15.02

Gilkey, Lindsay N.

This milestone presents a demonstration of the High-to-Low (Hi2Lo) process in the VVI focus area. Validation and additional calculations with the commercial computational fluid dynamics code STAR-CCM+ were performed using a 5x5 fuel assembly with non-mixing geometry and spacer grids. This geometry was based on the benchmark experiment provided by Westinghouse. Results from the simulations were compared to existing experimental data and to the subchannel thermal-hydraulics code COBRA-TF (CTF). An uncertainty quantification (UQ) process was developed for the STAR-CCM+ model, and the results of the STAR UQ were communicated to CTF. Results from the STAR-CCM+ simulations were used as experimental design points in CTF to calibrate the mixing parameter β, and the results were compared to those obtained using experimental data points. This demonstrated that CTF's β parameter can be calibrated to match existing experimental data more closely. The Hi2Lo process for the STAR-CCM+/CTF code coupling is documented in this milestone and in the closely linked L3:VVI.H2LP15.01 milestone report.
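
Neither code is callable here, so the sketch below only illustrates the calibration step described in the abstract: a toy two-subchannel surrogate stands in for CTF, a single hypothetical high-fidelity data point stands in for a STAR-CCM+ design point, and the mixing parameter β is tuned to match it.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy two-subchannel surrogate standing in for CTF: the hot/cold exit
# temperature difference decays along the heated length at a rate set by beta.
def exit_delta_t(beta, delta_t_inlet=30.0, length_m=3.66, decay_per_m=1.0):
    return delta_t_inlet * np.exp(-beta * decay_per_m * length_m)

# Hypothetical high-fidelity target, standing in for a STAR-CCM+ design point.
target_delta_t = 12.0  # K

# Calibrate beta by minimizing the squared mismatch to the target.
result = minimize_scalar(lambda b: (exit_delta_t(b) - target_delta_t) ** 2,
                         bounds=(0.0, 2.0), method="bounded")
print(f"Calibrated beta = {result.x:.3f}, exit dT = {exit_delta_t(result.x):.2f} K")
```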

NSRD-11: Computational Capability to Substantiate DOE-HDBK-3010 Data

Laros, James H.; Brown, Alexander B.; Gelbard, Fred G.; Bignell, John B.; Pierce, Flint P.; Voskuilen, Tyler V.; Rodriguez, Salvador B.; Dingreville, Remi P.; Zepper, Ethan T.; Juan, Pierre-Alexandre J.; Le, San L.; Gilkey, Lindsay N.

Safety basis analysts throughout the U.S. Department of Energy (DOE) complex rely heavily on the information provided in the DOE Handbook, DOE-HDBK-3010, Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear Facilities, to determine radionuclide source terms. In calculating source terms, analysts tend to use the DOE Handbook's bounding values on airborne release fractions (ARFs) and respirable fractions (RFs) for various categories of insults (representing potential accident release categories). This is typically due to both time constraints and the avoidance of regulatory critique. Unfortunately, these bounding ARFs/RFs are extremely conservative. Moreover, they were derived from very limited small-scale bench/laboratory experiments and/or from engineering judgment. Thus, the basis for the data may not be representative of the actual, unique accident conditions and configurations being evaluated. The goal of this research is to develop a more accurate and defensible method for determining bounding values for the DOE Handbook using state-of-the-art, multi-physics-based computer codes. This enables us to better understand the fundamental physics and phenomena associated with the types of accidents in the Handbook. This year, the research included improvements to the high-fidelity codes to model particle resuspension and multi-component evaporation for fire scenarios. We also began to model ceramic fragmentation experiments and to reanalyze the liquid fire and powder release experiments that were done last year. The results show that the added physics better describes the fragmentation phenomena. Thus, this work provides a low-cost method to establish physics-justified safety bounds by taking into account specific geometries and conditions that may not have been previously measured and/or are too costly to test experimentally.

CASL VMA Milestone Report FY16 (L3:VMA.VUQ.P13.08): Westinghouse Mixing with STAR-CCM+

Gilkey, Lindsay N.

STAR-CCM+ (STAR) is a high-resolution computational fluid dynamics (CFD) code developed by CD-adapco. STAR includes validated physics models and a full suite of turbulence models, including ones from the k-ε and k-ω families. STAR is currently being developed to handle two-phase flows, but the current focus of the software is single-phase flow. STAR can use imported meshes or its built-in meshing software to create computational domains for CFD. Since the solvers generally require a fine mesh for good computational results, the meshes used with STAR tend to number in the millions of cells, with that number growing with simulation and geometry complexity. The time required to model the flow through a full 5x5 Mixing Vane Grid Assembly (5x5MVG) in the current STAR configuration is on the order of hours and can be very computationally expensive. COBRA-TF (CTF) is a low-resolution subchannel code that can be trained using high-fidelity data from STAR. CTF does not have turbulence models and instead uses a turbulent mixing coefficient β. With a properly calibrated β, CTF can be used as a low-computational-cost alternative to expensive full CFD calculations performed with STAR. During the Hi2Lo work with CTF and STAR, STAR-CCM+ will be used to calibrate β and to provide high-resolution results that can be used in place of, or in addition to, experimental results to reduce the uncertainty in the CTF results.
