Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers, enabling them to enhance understanding of risk, improve products, and assess simulation credibility. This manual offers partners in the Consortium for Advanced Simulation of Light Water Reactors (CASL) a guide to conducting Dakota-based validation and uncertainty quantification (VUQ) studies for CASL problems. It motivates various classes of Dakota methods and includes examples of their use on representative application problems. On reading it, a CASL analyst should understand why and how to apply Dakota to a simulation problem. This SAND report constitutes the product of CASL milestone L3:VUQ.V&V.P8.01 and is also being released as a CASL unlimited release report with number CASL-U-2014-0038-000.
Sabotage of spent nuclear fuel casks remains a concern nearly forty years after attacks against shipment casks were first analyzed and has a renewed relevance in the post-9/11 environment. A limited number of full-scale tests and supporting efforts using surrogate materials, typically depleted uranium dioxide (DUO2), have been conducted in the interim to more definitively determine the source term from these postulated events. In all the previous studies, the postulated attack of greatest interest was by a conical shape charge (CSC), which focuses the explosive energy much more efficiently than bulk explosives. However, the validity of these large-scale results remains in question due to the lack of a defensible Spent Fuel Ratio (SFR), defined as the amount of respirable aerosol generated by an attack on a mass of spent fuel compared to that of an otherwise identical DUO2 surrogate. Previous attempts to define the SFR have produced estimates ranging from 0.42 to 12 and have relied on suboptimal experimental techniques and data comparisons. Different researchers have suggested using SFR values of 3 to 5.6, yet sound technical arguments exist that the SFR does not exceed a value of unity. A defensible determination of the SFR in this lower range would greatly reduce the calculated risk associated with the transport and dry storage of spent nuclear fuel. Currently, Oak Ridge National Laboratory (ORNL) is in possession of several samples of spent nuclear fuel (SNF) that were used in the original SFR studies in the 1980s and were intended for use in a modern effort at Sandia National Laboratories (SNL) in the 2000s. A portion of these samples is being used for a variety of research efforts. However, the entirety of the SNF samples at ORNL is scheduled for disposition at the Waste Isolation Pilot Plant (WIPP) by approximately the end of 2015.
If a defensible SFR is to be determined for use in storage and transportation security analyses, this effort must begin urgently in order to secure the only known available SNF samples with a clearly defined path to disposal.
We are developing computational models to elucidate the expansion and dynamic filling process of a polyurethane foam, PMDI. The polyurethane of interest is chemically blown: carbon dioxide is produced via the reaction of isocyanate with water, the blowing agent. The isocyanate also reacts with polyol in a competing reaction, which produces the polymer. Here we detail the experiments needed to populate a processing model and provide parameters for the model based on these experiments. The model entails solving the conservation equations, including the equations of motion, an energy balance, and two rate equations for the polymerization and foaming reactions, following a simplified mathematical formalism that decouples these two reactions. Parameters for the polymerization kinetics model are reported based on infrared spectrophotometry. Parameters describing the gas-generating reaction are reported based on measurements of volume, temperature, and pressure evolution with time. A foam rheology model is proposed and its parameters determined through steady-shear and oscillatory tests. Heat of reaction and heat capacity are determined through differential scanning calorimetry. Thermal conductivity of the foam as a function of density is measured using a transient method based on the theory of the transient plane source technique. Finally, density variations of the resulting solid foam in several simple geometries are directly measured by sectioning and sampling mass, as well as through x-ray computed tomography. These density measurements will be useful for model validation once the complete model is implemented in an engineering code.
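The two decoupled rate equations can be sketched as a pair of extent-of-reaction ODEs. All kinetic forms and constants below are illustrative placeholders, not the fitted parameters reported from the experiments:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def rates(Xp, Xg, T):
    """Assumed nth-order Arrhenius rate laws for the extents of the
    polymerization (Xp) and gas-blowing (Xg) reactions; every constant
    here is a placeholder, not a value fitted to the IR or PVT data."""
    kp = 1e5 * np.exp(-50e3 / (R * T))   # polymerization rate constant
    kg = 5e4 * np.exp(-45e3 / (R * T))   # blowing-reaction rate constant
    return kp * (1.0 - Xp) ** 2, kg * (1.0 - Xg)

# Forward-Euler integration at fixed temperature (the full model couples
# these rates to the energy balance and equations of motion).
Xp = Xg = 0.0
T, dt = 330.0, 0.01
for _ in range(60000):  # 600 s of reaction time
    dXp, dXg = rates(Xp, Xg, T)
    Xp = min(Xp + dt * dXp, 1.0)
    Xg = min(Xg + dt * dXg, 1.0)
```

With these placeholder constants the blowing reaction outpaces polymerization, which is the qualitative behavior a filling simulation must capture.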
Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.
Sandia is designing a set of modern, research-quality blades for use on the V27 turbines at the DOE/SNL SWiFT site at Texas Tech University in Lubbock, Texas. The new blades will replace the OEM blades and will be a publicly available resource for subscale rotor research. Features of the new blades do not represent the optimal design for a V27 rotor but are determined by aeroelastic scaling of relevant parameters and design drivers from a representative megawatt-scale rotor. Scaling parameters and design drivers were chosen based on two factors: 1) the ability to retrofit the existing SWiFT turbines and 2) replication of the rotor loads and wake formation of a utility-scale turbine, to support turbine-turbine interaction research at multiple scales. The blades are expected to provide a publicly available baseline blade design that will enable increased participation in future blade research as well as accelerated hardware manufacture and test for demonstration of innovation. This paper discusses aeroelastic scaling approaches, a rotor design process, and a summary of design concepts.
Large-Scale Data Analytics (LSDA) problems require finding meaningful patterns in data sets that are so large as to require leading-edge processing and storage capability. LSDA problems are increasingly important for government mission work, industrial application, and scientific discovery. Effective solution of some important LSDA problems requires a computational workload that is substantially different from that associated with traditional High Performance Computing (HPC) simulations intended to help understand physical phenomena or to support engineering design. While traditional HPC application codes exploit structural regularity and data locality to improve performance, many analytics problems lead more naturally to very fine-grained communication between unpredictable sets of processors, resulting in less regular communication patterns that do not map efficiently onto typical HPC systems. In both the simulation and analytics domains, however, data movement increasingly dominates the performance, energy usage, and price of computing systems. It is therefore plausible that we could find a more synergistic technology path forward. Even though future machines may continue to be configured differently for the two domains, a more common technological roadmap between them, in the form of a degree of convergence in the underlying componentry and design principles to address these common technical challenges, could have substantial technical and economic benefits.
Bauer, Christina A.; Jones, Simon C.; Kinnibrugh, Tiffany L.; Tongwa, Paul; Farrell, Richard A.; Vakil, Avinash; Timofeeva, Tatiana V.; Khrustalev, Victor N.; Allendorf, Mark D.
In order to provide Members of the Workforce with timely, accurate, and consistent responses to security-related questions and concerns, the Security and Emergency Management (S&EM) Center developed Security Connection, a customer-facing, single-entry-point resource center. Security Connection is staffed by Security Connection Representatives (SCRs) who process incoming questions from multiple sources, including phone and e-mail. SCRs also provide call volume relief to various Security programs and assume responsibility for answering program questions for line customers. In this manner, Security Connection adds value for both line customers and Security programs.
The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic systems that each represents, and the necessity to accurately predict their behavior. The path to successfully develop and implement an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful, relevant analysis to assist stakeholders in decision making. Therefore, a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization, was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and for the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized, allowing the system to be modified in order to provide analysis for other systems with similar attributes. This approach provides the framework for simulating many different fuel cycle options. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.
This report presents a concise history in tabular form of events leading up to site identification in 1978, site selection in 1987, subsequent characterization, and ongoing analysis through 2008 of the performance of a repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain in southern Nevada. The tabulated events generally occurred in five periods: (1) commitment to mined geologic disposal and identification of sites; (2) site selection and analysis, based on regional geologic characterization through literature and analogous data; (3) feasibility analysis demonstrating calculation procedures and importance of system components, based on rough measures of performance using surface exploration, waste process knowledge, and general laboratory experiments; (4) suitability analysis demonstrating viability of disposal system, based on environment-specific laboratory experiments, in-situ experiments, and underground disposal system characterization; and (5) compliance analysis, based on completed site-specific characterization. Because the relationship is important to understanding the evolution of the Yucca Mountain Project, the tabulation also shows the interaction between four broad categories of political bodies and government agencies/institutions: (a) technical milestones of the implementing institutions, (b) development of the regulatory requirements and related federal policy in laws and court decisions, (c) Presidential and agency directives and decisions, and (d) critiques of the Yucca Mountain Project and pertinent national and world events related to nuclear energy and radioactive waste.
Ductile metals and other materials typically deform plastically under large applied loads, a behavior most often modeled using plastic deformation constitutive models. However, it is possible to capture some of the key behaviors of plastic deformation using only the framework of nonlinear elastic mechanics. In this paper, we develop a phenomenological, hysteretic, nonlinear elastic constitutive model that captures many of the features expected of a plastic deformation model. The model is based on calculating a secant modulus directly from a material's stress-strain curve. Scalar stress and strain values are obtained in three dimensions by using the von Mises invariants. Hysteresis is incorporated by tracking an additional history variable and assuming an elastic unloading response. The model is demonstrated in both single- and multi-element simulations under varying strain conditions.
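A minimal one-dimensional sketch of the secant-modulus and history-variable idea follows; the stress-strain curve parameters are invented for illustration, and the von Mises three-dimensional generalization described in the paper is omitted:

```python
import numpy as np

def stress_curve(eps):
    """Hypothetical monotonic stress-strain curve (Pa); a stand-in for
    measured data, not the paper's actual material."""
    E0, sig_y, n = 200e9, 250e6, 0.1   # modulus, yield stress, hardening
    eps_y = sig_y / E0
    return np.where(eps <= eps_y, E0 * eps, sig_y * (eps / eps_y) ** n)

def hysteretic_stress(strain_history):
    """Secant-modulus model with elastic unloading: on virgin loading the
    stress follows the curve; below the historical maximum strain (the
    tracked history variable) the response is elastic with the secant
    modulus evaluated at that maximum."""
    eps_max, out = 0.0, []
    for eps in strain_history:
        if eps >= eps_max:                   # loading: follow the curve
            eps_max = eps
            out.append(float(stress_curve(np.array([eps]))[0]))
        else:                                # unloading: secant modulus
            E_sec = float(stress_curve(np.array([eps_max]))[0]) / eps_max
            out.append(E_sec * eps)
    return out
```

Loading to a strain, unloading, and reloading traces out the hysteresis loop that mimics plastic behavior without a plasticity framework.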
Halligan, Matthew S.; Beetner, Daryl G.; Grant, Steven L.
Little research has been performed to study how intentional electromagnetic signals may couple into recording devices. An electromagnetic susceptibility study was therefore performed to characterize the response of an analog tape recorder, a digital video camera, a wired computer microphone, and a wireless microphone system to electromagnetic interference. Devices were subjected to electromagnetic stimulation in the frequency range of 1-990 MHz at field strengths up to 4.9 V/m. Carrier and message frequencies of the stimulation signals were swept, and the impacts of device orientation and antenna polarization were explored. Message signals coupled into all devices only when amplitude-modulated signals were used for stimulation. Test conditions that produced maximum sensitivity were highly specific to each device, and only narrow carrier frequency ranges could be used for most devices to couple messages into recordings. A basic detection technique using cross-correlation demonstrated that messages should be as long as possible to maximize message detection and minimize detection error. Analysis suggests that detectable signals could be coupled into these recording devices under realistic ambient conditions.
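The cross-correlation detection step can be illustrated as follows. The sample rate, message tone, amplitudes, and noise model are all assumptions made for illustration, not the study's actual test conditions:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 8000  # sample rate in Hz; an illustrative choice, not the study's

def make_message(duration_s, f0=440.0):
    """Sinusoidal stand-in for a coupled message of the given duration."""
    t = np.arange(int(duration_s * fs)) / fs
    return np.sin(2 * np.pi * f0 * t)

def detect(recording, template):
    """Peak normalized cross-correlation of the known message template
    against the recording; higher peaks mean easier detection."""
    c = np.correlate(recording, template, mode="valid")
    window_energy = np.convolve(recording**2, np.ones(len(template)),
                                mode="valid")
    norm = np.linalg.norm(template) * np.sqrt(window_energy)
    return float(np.max(np.abs(c) / np.maximum(norm, 1e-12)))

# Embed a weak copy of a message in noise; the normalized noise floor
# drops as the template lengthens, which is why longer messages detect
# more reliably.
noise = rng.standard_normal(4 * fs)
short, long_ = make_message(0.05), make_message(0.5)
rec_long = noise.copy()
rec_long[1000:1000 + len(long_)] += 0.3 * long_
```

Comparing the correlation peak of the long template against the noise-only baseline shows a clear detection margin that the short template lacks.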
In recent years, density functional theory molecular dynamics (DFT-MD) has been shown to be a useful computational tool for exploring the properties of warm dense matter (WDM). These calculations achieve excellent agreement with shock compression experiments, which probe the thermodynamic parameters of the Hugoniot state. New X-ray Thomson Scattering (XRTS) diagnostics promise to deliver independent measurements of electronic density and temperature, as well as structural information in shocked systems. However, they require the development of new levels of theory for computing the associated observables within a DFT framework. The experimentally observable x-ray scattering cross section is related to the electronic density-density response function, which is obtainable using time-dependent DFT (TDDFT), a formally exact extension of conventional DFT that describes electron dynamics and excited states. In order to develop a capability for modeling XRTS data and, more generally, to establish a predictive capability for first-principles simulations of matter in extreme conditions, real-time TDDFT with Ehrenfest dynamics has been implemented in an existing projector augmented-wave (PAW) code for DFT-MD calculations. The purpose of this report is to record implementation details and benchmarks as the project advances from software development to delivering novel scientific results. Results range from tests that establish the accuracy, efficiency, and scalability of our implementation to calculations that are verified against accepted results in the literature. Aside from the primary XRTS goal, we identify other more general areas where this new capability will be useful, including stopping power calculations and electron-ion equilibration.
Oil leaks were found in wellbores of Caverns 105 and 109 at the Big Hill Strategic Petroleum Reserve site. According to the field observations, two instances of casing damage occurred at the depth of the interbed between the caprock bottom and the salt top. A three-dimensional finite element model, which contains wellbore element blocks and allows each cavern to be configured individually, was constructed to investigate the wellbore damage mechanism. The model also contains element blocks representing the interfaces between lithologies and a shear zone, so that interbed behavior is examined in a realistic manner. The casing damage results from vertical and horizontal movements of the interbed between the caprock and the salt dome. The salt top subsides because the volume of the caverns below it decreases with time due to salt creep closure, while the caprock subsides at a slower rate because it is thicker and stiffer. This discrepancy deforms the well, and the deformed wellbore may eventually fail, producing an oil leak. A possible oil leak date for each well is determined using an equivalent plastic strain failure criterion, and a well grading system for a remediation plan is developed based on the predicted leak dates.
This report presents analytic transmission line models for calculating the shielding effectiveness of two common calibration standard cables. The two cables have different canonical aperture types, which produce the same low-frequency coupling but different responses at resonance. The dominant damping mechanism is produced by the current probe loads at the ends of the cables, which are characterized through adaptor measurements. The model predictions for the cables are compared with experimental measurements, and good agreement between the results is demonstrated. This setup constitutes a simple, repeatable geometry that nevertheless exhibits some of the challenges involved in modeling non-radio-frequency geometries.
A method is proposed to classify the regions in the close neighborhood of selected measurements according to the ratio of two radionuclides measured from either a radioactive plume or a deposited radionuclide mixture. Locations in the area of interest are then associated with a representative ratio class. This method allows for a more comprehensive and meaningful understanding of the data sampled following a radiological incident.
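A minimal sketch of the classification idea follows, with invented activities, assumed class boundaries, and a one-dimensional nearest-neighbor assignment standing in for the report's actual procedure:

```python
import numpy as np

# Hypothetical activities of two radionuclides at four measurement points
# (all numbers invented for illustration).
act_a = np.array([10.0, 8.0, 4.0, 1.0])
act_b = np.array([2.0, 2.0, 4.0, 5.0])
ratios = act_a / act_b                   # per-measurement isotope ratio

# Assumed class boundaries: bin each measured ratio into a ratio class.
edges = np.array([0.5, 2.0, 4.5])
classes = np.digitize(ratios, edges)

# Associate any nearby location with the class of its nearest measurement
# (1-D positions here; a real deposition map would be 2-D).
positions = np.array([0.0, 1.0, 2.0, 3.0])

def class_at(x):
    """Representative ratio class at location x."""
    return int(classes[np.argmin(np.abs(positions - x))])
```

Each unsampled location thus inherits a representative ratio class rather than a raw, noisy point measurement.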
We present results from the Bayesian calibration of hydrological parameters of the Community Land Model (CLM), which is often used in climate simulations and Earth system models. A statistical inverse problem is formulated for three hydrological parameters, conditional on observations of latent heat surface fluxes over 48 months. Our calibration method uses polynomial and Gaussian process surrogates of the CLM, and solves the parameter estimation problem using a Markov chain Monte Carlo sampler. Posterior probability densities for the parameters are developed for two sites with different soil and vegetation covers. Our method also allows us to examine the structural error in CLM under two error models. We find that surrogate models can be created for CLM in most cases. The posterior distributions are more predictive than the default parameter values in CLM. Climatologically averaging the observations does not modify the parameters' distributions significantly. The structural error model reveals a correlation time-scale which can be used to identify the physical process that could be contributing to it. While the calibrated CLM has a higher predictive skill, the calibration is under-dispersive.
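The calibration workflow can be sketched with a toy one-parameter surrogate and a random-walk Metropolis sampler. The surrogate form, prior bounds, "true" parameter, and noise level below are assumptions for illustration, not the CLM study's actual settings:

```python
import numpy as np

rng = np.random.default_rng(1)

def surrogate(theta):
    """Toy 1-parameter polynomial surrogate for the simulator's
    latent-heat output; the study used polynomial/GP surrogates of
    three hydrological parameters."""
    return 50.0 + 30.0 * theta - 5.0 * theta**2

theta_true, sigma = 1.5, 2.0
obs = surrogate(theta_true) + sigma * rng.standard_normal(48)  # 48 "months"

def log_post(theta):
    """Gaussian likelihood with an assumed uniform prior on [0, 3]."""
    if not 0.0 <= theta <= 3.0:
        return -np.inf
    r = obs - surrogate(theta)
    return -0.5 * np.sum(r**2) / sigma**2

# Random-walk Metropolis sampler on the surrogate posterior: cheap
# surrogate evaluations make the many MCMC steps affordable.
samples, theta, lp = [], 1.0, log_post(1.0)
for _ in range(20000):
    prop = theta + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples[5000:])  # discard burn-in
```

The retained samples approximate the posterior density of the parameter, the analogue of the site-specific posteriors reported in the study.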
The increased use of and interest in wind energy over the last few years has necessitated an increase in the manufacturing of wind turbine blades. This increase in manufacturing has in many ways outpaced the current understanding not only of the materials used but also of the manufacturing methods used to construct composite laminates. The goal of this study is to develop a list of process parameters that influence the quality of composite laminates manufactured using vacuum-assisted resin transfer molding and to evaluate how they influence laminate quality. Resin flow rate and vacuum pressure are known to be primary factors in the manufacturing process; an incorrect balance of these parameters will often cause porosity or voids in laminates that ultimately degrade the strength of the composite. Fiber waviness has also been identified as a major contributor to failures in wind turbine blades and often results from mishandling during the lay-up process. Based on the laboratory tests conducted, a relationship between these parameters and laminate quality has been established, which will be a valuable tool in developing best practices and standard procedures for the manufacture of wind turbine blade composites.
Accurate knowledge of the thermophysical properties of urethane foam is essential for developing meaningful models and analyses of scenarios in which the foam is heated. Predicting its performance at temperature requires a solid understanding of the foam material properties at temperature; in addition, foam properties vary with density/porosity. An experimental program to determine the thermal properties of the two foams and their parent solid urethane was developed in order to support development of a predictive model relating density and thermal properties from first principles. Thermal properties (thermal conductivity, diffusivity, and specific heat) of the foam were found to vary with temperature from 26°C to 90°C. Thermal conductivity generally increases with increasing temperature for a given initial density and ranges from 0.0433 W/m·K at 26°C to 0.0811 W/m·K at 90°C; thermal diffusivity generally decreases with increasing temperature for a given initial density and ranges from 0.4101 mm²/s at 26°C to 0.1263 mm²/s at 90°C; and specific heat generally increases with increasing temperature for a given initial density and ranges from 0.1078 MJ/m³·K at 26°C to 0.6323 MJ/m³·K at 90°C. Thermal properties of the solid urethane were also found to vary with temperature from 26°C to 90°C. Average thermal conductivity generally increases with increasing temperature for a given initial density and ranges from 0.126-0.131 W/m·K at 26°C to 0.153-0.157 W/m·K at 90°C; average thermal diffusivity generally decreases with increasing temperature for a given initial density and ranges from 0.142-0.147 mm²/s at 26°C to 0.124-0.125 mm²/s at 90°C; and average specific heat generally increases with increasing temperature for a given initial density and ranges from 0.889-0.899 MJ/m³·K at 26°C to 1.229-1.274 MJ/m³·K at 90°C. The density of both the foam and the solid urethane decreased with increasing temperature.
Synthetic Aperture Radar (SAR) performance testing and estimation is facilitated by observing the system response to known target scene elements. Trihedral corner reflectors and other canonical targets play an important role because their Radar Cross Section (RCS) can be calculated analytically. However, reflector orientation and the proximity of the ground and mounting structures can significantly impact the accuracy and precision with which measurements can be made. These issues are examined in this report.
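For reference, the peak analytic RCS of a triangular trihedral corner reflector follows the standard physical-optics result sigma = 4*pi*a^4/(3*lambda^2); a short sketch (which ignores the ground-plane and mounting effects the report examines):

```python
import math

C = 299792458.0  # speed of light (m/s)

def trihedral_peak_rcs(edge_m, freq_hz):
    """Peak (boresight) RCS in m^2 of a triangular trihedral corner
    reflector with inner edge length edge_m, from the standard
    physical-optics formula sigma = 4*pi*a^4 / (3*lambda^2).
    Orientation, ground, and mount effects are not modeled."""
    lam = C / freq_hz
    return 4.0 * math.pi * edge_m**4 / (3.0 * lam**2)

def dbsm(sigma_m2):
    """Convert RCS in square meters to decibels relative to 1 m^2."""
    return 10.0 * math.log10(sigma_m2)
```

For example, a 0.3 m reflector at 10 GHz gives roughly 38 m² (about 16 dBsm); measured deviations from such analytic values are what the report's orientation and ground-proximity analysis addresses.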
Two attachments are provided for performance testing of sensors and other Physical Protection System (PPS) components. The first attachment is a table of Trials and Failures, giving Probability of Detection (PD) for a designated confidence level and sorted by trials. The second attachment contains the same data, sorted by failures.
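Trials/failures tables of this kind typically follow the one-sided Clopper-Pearson binomial bound; a sketch, assuming that is the method behind the attachments:

```python
from math import comb

def pd_lower_bound(trials, failures, confidence=0.95):
    """One-sided Clopper-Pearson lower bound on probability of detection,
    given the number of trials and failures, at the stated confidence.
    (An assumption about the attachments' method, not confirmed by them.)"""
    s = trials - failures          # observed detections
    if s == 0:
        return 0.0
    alpha = 1.0 - confidence

    def upper_tail(p):             # P(X >= s) for X ~ Binomial(trials, p)
        return sum(comb(trials, k) * p**k * (1.0 - p)**(trials - k)
                   for k in range(s, trials + 1))

    lo, hi = 0.0, 1.0              # bisect: upper_tail is increasing in p
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if upper_tail(mid) < alpha:
            lo = mid
        else:
            hi = mid
    return lo
```

For instance, 30 trials with zero failures yields PD >= 0.905 at 95% confidence, the familiar "30 trials, no failures" benchmark.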
At the request of GDF Suez, a Rough Order of Magnitude (ROM) cost estimate was prepared for the design, construction, testing, and data analysis for an experimental series of large-scale liquefied natural gas (LNG) spills on land and water that would result in the largest pool fires and vapor dispersion events ever conducted. Due to the expected cost of this large, multi-year program, the authors utilized Sandia's structured cost estimating methodology. This methodology ensures that the efforts identified can be performed for the proposed cost to within plus or minus 30 percent. The scale of the LNG spill, fire, and vapor dispersion tests proposed by GDF Suez could produce hazard distances and testing safety issues that need to be fully explored. Based on our evaluations, Sandia can utilize much of our existing fire testing infrastructure for the large fire tests and some small dispersion tests (with some modifications) in Albuquerque, but we propose to develop a new dispersion testing site at our remote test area in Nevada because of the large hazard distances. While this might impact some testing logistics, the safety aspects warrant this approach. In addition, we have included a proposal to study cryogenic liquid spills on water and the subsequent vaporization in the presence of waves. Sandia is working with DOE on applications that provide infrastructure pertinent to wave production. We present an approach to conduct repeatable wave/spill interaction testing that could utilize such infrastructure.