TTR 01065 Specification Supervisor Training
Abstract not provided.
32nd ASME Wind Energy Symposium
An extensive database of simulated loads representing almost 100 years of operation of a utility-scale wind turbine has been developed using high-performance computing resources. This large dataset makes it possible to evaluate several proposals being considered in planned revisions of industry guidelines such as the International Electrotechnical Commission's 61400-1 wind turbine design standard. Current design provisions, especially those dependent on large amounts of data, can be critically examined and validated, or alternative proposals can be put forward, based on studies using this loads database. We focus on one design load case in particular that applies a load factor of 1.25 to a nominal 50-year load, a quantity that is often difficult to establish from limited simulations followed by statistical extrapolation. Alternatives that use other load statistics, which are easier to establish from simulations and are associated with lower levels of uncertainty, are systematically evaluated; the load factors applied to these alternative nominal loads are correspondingly higher than the factor applied to the 50-year load. We discuss how the loads database enabled systematic study of one such proposal as an alternative to the use of a factored 50-year load. Calibration of the proposal accounts for the uncertainty in estimating loads from simulation, and the large database allows it to be assessed against 50-year loads with quantifiable (and low) uncertainty.
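The following is an illustrative sketch only of the kind of calculation involved: extrapolating a nominal 50-year extreme load from simulated short-term maxima and applying a load factor. The Gumbel fit, the simulated load values, and the alternative statistic shown are assumptions for illustration, not the calibration performed in the paper.

```python
# Illustrative sketch: extrapolating a nominal 50-year extreme load from
# simulated 10-minute maxima, then applying the 1.25 load factor. All numbers
# and the choice of distribution are assumptions made for this example.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)
# Hypothetical 10-minute extreme blade-root moments (kN*m) from simulations.
ten_min_maxima = gumbel_r.rvs(loc=5000.0, scale=400.0, size=1000, random_state=rng)

# Fit a Gumbel (Type I extreme value) distribution to the simulated maxima.
loc, scale = gumbel_r.fit(ten_min_maxima)

# 50 years of 10-minute intervals; the 50-year load is the quantile exceeded
# on average once over that many intervals.
n_intervals_50yr = 50 * 365.25 * 24 * 6
p_50yr = 1.0 - 1.0 / n_intervals_50yr
load_50yr = gumbel_r.ppf(p_50yr, loc=loc, scale=scale)

factored_design_load = 1.25 * load_50yr          # factored 50-year load
robust_alt = np.quantile(ten_min_maxima, 0.99)   # example of a more robust statistic
print(f"50-yr load ~ {load_50yr:.0f} kN*m, factored: {factored_design_load:.0f} kN*m")
print(f"99th-percentile simulated maximum (alternative nominal): {robust_alt:.0f} kN*m")
```

The point of the sketch is the trade-off described in the abstract: the tail quantile requires extrapolation far beyond the data, while the alternative statistic is estimated directly from the simulations at the cost of a larger load factor.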
In high-consequence engineering organizations, such as Sandia, quality assurance may be heavily dependent on staff competency. Competency-dependent quality assurance models are at risk when the environment changes, as it has with increasing attrition rates, budget and schedule cuts, and competing program priorities. Risks in Sandia's competency-dependent culture can be mitigated through changes to hiring, training, and customer engagement approaches to manage people, partners, and products. Sandia's technical quality engineering organization has been able to mitigate corporate-level risks by driving changes that benefit all departments, and in doing so has assured Sandia's commitment to excellence in high-consequence engineering and national service.
One objective of the Climate Science for a Sustainable Energy Future (CSSEF) program is to develop the capability to thoroughly test and understand the uncertainties in the overall climate model and its components as they are being developed. The focus on uncertainties involves sensitivity analysis: the capability to determine which input parameters have a major influence on the output responses of interest. This report presents initial sensitivity analysis results performed by Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories (SNL), and Pacific Northwest National Laboratory (PNNL). In the 2011-2012 timeframe, these laboratories worked in collaboration to perform sensitivity analyses of a set of CAM5 2° resolution runs, where the response metrics of interest were precipitation metrics. The three labs performed their sensitivity analysis (SA) studies separately and then compared results. Overall, the results were quite consistent with each other, although the methods used were different. This exercise provided a robustness check of the global sensitivity analysis metrics and identified some strongly influential parameters.
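As a minimal sketch of the kind of global sensitivity screen described, the example below computes standardized regression coefficients from sampled inputs and a scalar response. The parameter names, sample data, and specific method are assumptions for illustration, not the actual SA procedures used by the three labs.

```python
# Regression-based global sensitivity screen on synthetic samples. The inputs
# and response below are stand-ins, not CAM5 parameters or precipitation data.
import numpy as np

rng = np.random.default_rng(1)
n_runs, n_params = 256, 5
param_names = [f"p{i}" for i in range(n_params)]   # hypothetical tuning parameters

X = rng.uniform(0.0, 1.0, size=(n_runs, n_params))                    # sampled inputs
y = 3.0 * X[:, 0] + 0.5 * X[:, 2] ** 2 + rng.normal(0, 0.1, n_runs)   # stand-in response metric

# Standardize inputs and output, then fit a linear model; the coefficients are
# standardized regression coefficients (SRCs), a common global sensitivity screen.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

for name, coeff in sorted(zip(param_names, src), key=lambda t: -abs(t[1])):
    print(f"{name}: SRC = {coeff:+.3f}")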
Spurious energy in received radar data is a consequence of nonideal component and circuit behavior. It might be due to I/Q imbalance, nonlinear component behavior, additive interference (e.g., cross-talk), or other sources. How the spurious energy manifests in a range-Doppler map or image can be influenced by appropriate pulse-to-pulse phase modulation. Comparing multiple images formed from the same data but processed with different signal paths and modulations allows undesired spurs to be identified and then cropped or apodized.
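The sketch below illustrates the comparison idea in the simplest possible terms: genuine scene energy appears in the same place in both images, while a spur that relocates between differently modulated images shows a large magnitude mismatch and can be masked. The threshold, image sizes, and masking rule are assumptions for illustration, not the processing described in the paper.

```python
# Toy illustration: flag and suppress pixels where two range-Doppler magnitude
# images (same data, different pulse-to-pulse modulation) disagree strongly.
import numpy as np

def mask_moving_spurs(img_a: np.ndarray, img_b: np.ndarray, threshold_db: float = 6.0):
    """Return img_a magnitudes with pixels zeroed where the two images disagree."""
    eps = 1e-12
    ratio_db = 20.0 * np.abs(np.log10((np.abs(img_a) + eps) / (np.abs(img_b) + eps)))
    spur_mask = ratio_db > threshold_db          # large disagreement -> likely spur
    cleaned = np.where(spur_mask, 0.0, np.abs(img_a))
    return cleaned, spur_mask

# Example: a common target present in both images plus a spur only in image A.
a = np.full((64, 64), 1e-3); b = a.copy()
a[32, 32] = b[32, 32] = 1.0                      # true target, same location in both
a[10, 50] = 0.5                                   # spur present only in A
cleaned, mask = mask_moving_spurs(a, b)
print("spur pixels flagged:", int(mask.sum()))
```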
Radar ISR does not always involve cooperative or even friendly targets. An adversary has numerous techniques available to counter the effectiveness of a radar ISR sensor; these generally fall under the banner of jamming, spoofing, or otherwise interfering with the EM signals required by the radar. Consequently, mitigation techniques are prudent to retain the efficacy of the radar sensor. We discuss a number of mitigation techniques in general terms.
Experimental Techniques
Abstract not provided.
Sandia journal manuscript; Not yet accepted for publication
People save for retirement throughout their careers because it is virtually impossible to save everything you’ll need in retirement during the year before you retire. Similarly, without installing incremental amounts of clean fossil, renewable, or transformative energy technologies throughout the coming decades, a radical and immediate change will be nearly impossible in the year before a policy goal takes effect. This notion of steady installation growth, rather than acute installation of technology just in time to meet policy goals, is the core topic of this research. The research operationalizes the notion by developing the theoretical underpinnings of regulatory and market acceptance delays, building upon the common Technology Readiness Level (TRL) framework, and offers two new additions to the research community: the new and novel Regulatory Readiness Level (RRL) and Market Readiness Level (MRL) frameworks. These components, collectively called the Technology, Regulatory, and Market (TRM) readiness level framework, allow one to build new constraints into existing Integrated Assessment Models (IAMs) to address research questions such as, ‘To meet our desired technical and policy goals, what factors affect the rate at which we must install technology in the coming decades?’
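Purely as a hypothetical illustration of how combined readiness levels could gate deployment in an IAM-style constraint, the sketch below uses a three-part readiness state and a simple growth cap. The 1-9 scales, the gating rule, and the growth limit are assumptions made for this sketch, not the formulation developed in the research.

```python
# Hypothetical TRM readiness gate on annual installed-capacity growth.
from dataclasses import dataclass

@dataclass
class TRMReadiness:
    trl: int  # Technology Readiness Level, 1-9
    rrl: int  # Regulatory Readiness Level, 1-9 (assumed scale)
    mrl: int  # Market Readiness Level, 1-9 (assumed scale)

    def deployable(self, minimum: int = 8) -> bool:
        # Assume a technology is deployable only when all three dimensions reach
        # the minimum level; the least mature dimension is the bottleneck.
        return min(self.trl, self.rrl, self.mrl) >= minimum

def max_installed_capacity(current_gw: float, readiness: TRMReadiness,
                           annual_growth_cap: float = 0.20) -> float:
    """Upper bound on next year's installed capacity under the readiness gate."""
    if not readiness.deployable():
        return current_gw                 # immature in some dimension: no growth allowed
    return current_gw * (1.0 + annual_growth_cap)

print(max_installed_capacity(10.0, TRMReadiness(trl=9, rrl=7, mrl=8)))  # gated: 10.0
print(max_installed_capacity(10.0, TRMReadiness(trl=9, rrl=8, mrl=8)))  # grows: 12.0
```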
Chemical Communications
Abstract not provided.
This document describes the marine hydrokinetic (MHK) input file and subroutines for the Sandia National Laboratories Environmental Fluid Dynamics Code (SNL-EFDC), which is a combined hydrodynamic, sediment transport, and water quality model based on the Environmental Fluid Dynamics Code (EFDC) developed by John Hamrick, formerly sponsored by the U.S. Environmental Protection Agency, and now maintained by Tetra Tech, Inc. SNL-EFDC has been previously enhanced with the incorporation of the SEDZLJ sediment dynamics model developed by Ziegler, Lick, and Jones. SNL-EFDC has also been upgraded to more accurately simulate algae growth with specific application to optimizing biomass in an open-channel raceway for biofuels production. A detailed description of the input file containing data describing the MHK device/array is provided, along with a description of the MHK FORTRAN routine. Both a theoretical description of the MHK dynamics as incorporated into SNL-EFDC and an explanation of the source code are provided. This user manual is meant to be used in conjunction with the original EFDC and sediment dynamics SNL-EFDC manuals. Through this document, the authors provide information for users who wish to model the effects of an MHK device (or array of devices) on a flow system with EFDC and who also seek a clear understanding of the source code, which is available from staff in the Water Power Technologies Department at Sandia National Laboratories, Albuquerque, New Mexico.
Abstract not provided.
Journal of Computational Physics
Abstract not provided.
Tensor (multiway array) factorization and decomposition offers unique advantages for activity characterization in spatio-temporal datasets because these methods are compatible with sparse matrices and maintain the multiway structure that is otherwise lost when the data are collapsed for regular matrix factorization. This report describes our research as part of the PANTHER LDRD Grand Challenge to develop a foundational basis of mathematical techniques and visualizations that enable unsophisticated users (e.g., users who are not steeped in the mathematical details of matrix algebra and multiway computations) to discover hidden patterns in large spatio-temporal data sets.
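For readers unfamiliar with multiway factorization, the following is a minimal textbook sketch of a rank-R CP (canonical polyadic) decomposition of a 3-way tensor via alternating least squares. It illustrates the kind of factorization described; it is a generic dense implementation, not the project's actual sparse, large-scale code.

```python
# Generic CP-ALS for a 3-way tensor, X ~ sum_r A[:,r] o B[:,r] o C[:,r].
import numpy as np

def cp_als(X: np.ndarray, rank: int, n_iter: int = 100, seed: int = 0):
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        # Each update is a least-squares solve against the other two factors
        # (matricized-tensor-times-Khatri-Rao product, computed with einsum).
        A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Toy spatio-temporal example: location x time-of-day x day, one latent pattern.
rng = np.random.default_rng(1)
u, v, w = rng.random(20), rng.random(24), rng.random(30)
X = np.einsum('i,j,k->ijk', u, v, w) + 0.01 * rng.standard_normal((20, 24, 30))
A, B, C = cp_als(X, rank=1)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
print("relative error:", np.linalg.norm(X - Xhat) / np.linalg.norm(X))
```

Each recovered factor column can then be read as a latent activity pattern over locations, times of day, and days, which is the multiway structure that flattening to a matrix would destroy.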
This report summarizes the results of a NEAMS project focused on the use of uncertainty and sensitivity analysis methods within the NEK-5000 and Dakota software framework for assessing failure probabilities as part of probabilistic risk assessment. NEK-5000 is a software tool under development at Argonne National Laboratory to perform computational fluid dynamics calculations for applications such as thermohydraulics of nuclear reactor cores. Dakota is a software tool developed at Sandia National Laboratories containing optimization, sensitivity analysis, and uncertainty quantification algorithms. The goal of this work is to demonstrate the use of uncertainty quantification methods in Dakota with NEK-5000.
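As a generic sketch of the sampling-based UQ workflow described (not Dakota or NEK-5000 themselves), the example below generates a Latin hypercube design over two hypothetical uncertain inputs and propagates it through a placeholder solver. The parameter names, bounds, and response are assumptions for illustration.

```python
# Latin hypercube sampling driving a black-box "solver" stub.
import numpy as np
from scipy.stats import qmc

# Hypothetical uncertain inputs: inlet velocity (m/s) and wall heat flux (kW/m^2).
lower, upper = [0.8, 50.0], [1.2, 80.0]

sampler = qmc.LatinHypercube(d=2, seed=42)
unit_samples = sampler.random(n=40)                 # 40 realizations in [0,1)^2
samples = qmc.scale(unit_samples, lower, upper)     # map to physical bounds

def solver_stub(velocity: float, heat_flux: float) -> float:
    """Placeholder for an expensive CFD run returning, e.g., a peak temperature (K)."""
    return 300.0 + 25.0 * heat_flux / velocity / 60.0

responses = np.array([solver_stub(v, q) for v, q in samples])
print(f"mean = {responses.mean():.1f} K, std = {responses.std(ddof=1):.1f} K")
```

In the actual workflow, the stub would be replaced by a scripted call to the simulation code, with the sampling, bookkeeping, and statistics handled by the UQ framework.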
Abstract not provided.
Applied Physics Letters
Abstract not provided.
The present study focuses on laboratory testing of surrogate waste materials. The surrogate wastes correspond to a conservative estimate of degraded Waste Isolation Pilot Plant (WIPP) containers and TRU waste materials at the end of the 10,000-year regulatory period. Testing consists of hydrostatic, triaxial, and uniaxial strain tests performed on surrogate waste recipes previously developed by Hansen et al. (1997). These recipes can be divided into materials that simulate 50% and 100% degraded waste by weight, where the percent degradation indicates the anticipated amount of iron corrosion as well as the decomposition of cellulosics, plastics, and rubbers (CPR). Axial, lateral, and volumetric strains were measured, along with axial, lateral, and pore stresses. Two unique testing techniques were developed during the course of the experimental program. The first uses dilatometry to measure sample volumetric strain under a hydrostatic condition; bulk moduli measured with this technique were consistent with those measured using more conventional methods. The second technique involves performing triaxial tests under lateral strain control. By controlling the applied confining pressure to hold the lateral strain at zero while the specimen is loaded axially in compression, one can maintain a right-circular cylindrical geometry even under large deformations. This technique is preferred over standard triaxial testing methods, which result in inhomogeneous deformation, or “barreling”. Manifestations of the inhomogeneous deformation included non-uniform stress states as well as unrealistic Poisson’s ratios (> 0.5) or ratios that vary significantly along the length of the specimen. Zero-lateral-strain controlled tests yield a more uniform stress state and admissible, uniform values of Poisson’s ratio.
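A small worked example of the dilatometry-based measurement described above: the bulk modulus is the slope of hydrostatic pressure versus volumetric strain. The data points below are made up for illustration; only the relation K = dP/dε_v is taken from standard mechanics.

```python
# Estimate a bulk modulus from hypothetical hydrostatic pressure vs.
# volumetric-strain data via a linear fit.
import numpy as np

pressure_mpa = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])            # hypothetical
vol_strain = np.array([0.000, 0.011, 0.021, 0.030, 0.041, 0.050])  # hypothetical

# Bulk modulus is the slope of pressure vs. volumetric strain.
K, _ = np.polyfit(vol_strain, pressure_mpa, 1)
print(f"estimated bulk modulus ~ {K:.0f} MPa")
```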
Journal of Chemical Education
Abstract not provided.
The US Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) Fuel Cell Technologies Office (FCTO) is establishing the Hydrogen Fueling Infrastructure Research and Station Technology (H2FIRST) partnership, led by the National Renewable Energy Laboratory (NREL) and Sandia National Laboratories (SNL). FCTO is establishing this partnership and the associated capabilities in support of H2USA, the public/private partnership launched in 2013. The H2FIRST partnership provides the research and technology acceleration support needed to enable the widespread deployment of hydrogen infrastructure for the robust fueling of light-duty fuel cell electric vehicles (FCEVs). H2FIRST will focus on improving private-sector economics, safety, availability and reliability, and consumer confidence for hydrogen fueling. This whitepaper outlines the goals, scope, and activities associated with the H2FIRST partnership.
SIAM (Society for Industrial and Applied Mathematics) News
Abstract not provided.
An analysis of the Waste Isolation Pilot Plant (WIPP) colloid model constraints and parameter values was performed. The focus of this work was primarily on intrinsic colloids, mineral fragment colloids, and humic substance colloids, with a lesser focus on microbial colloids. Comments by the US Environmental Protection Agency (EPA) concerning intrinsic Th(IV) colloids and Mg-Cl-OH mineral fragment colloids were addressed in detail, assumptions and data used to constrain colloid model calculations were evaluated, and inconsistencies between data and model parameter values were identified. This work resulted in a list of specific conclusions regarding model integrity, model conservatism, and opportunities for improvement related to each of the four colloid types included in the WIPP performance assessment.
Transient electrostatic discharge (ESD) events are studied to assemble a predictive model of discharge from polymer surfaces. An analog circuit simulation is produced, and its response is compared against various literature sources to explore its capabilities and limitations. Results suggest that polymer ESD events can be predicted to within an order of magnitude; these results compare well with empirical findings from other sources of similar reproducibility.
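As a purely illustrative lumped-element sketch of an ESD transient, consider a charged capacitance discharging through a series resistance, a common simplified surrogate for a surface-discharge circuit. The component values below are assumptions for illustration and are not the circuit model developed in the work above.

```python
# Series RC discharge transient: current waveform, time constant, stored energy.
import numpy as np

C = 100e-12      # assumed surface/body capacitance, farads
R = 1.5e3        # assumed series discharge resistance, ohms
V0 = 8000.0      # assumed initial surface potential, volts

t = np.linspace(0.0, 5 * R * C, 1000)
i_t = (V0 / R) * np.exp(-t / (R * C))   # discharge current of the series RC loop

print(f"peak current ~ {i_t[0]:.2f} A")
print(f"time constant ~ {R * C * 1e9:.0f} ns")
print(f"stored energy ~ {0.5 * C * V0**2 * 1e3:.2f} mJ")
```

Even this crude model conveys why order-of-magnitude agreement is a reasonable benchmark: the effective R, C, and initial potential of a real polymer surface are themselves uncertain by large factors.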
Simultaneous Thermogravimetric Modulated Beam Mass Spectrometry (STMBMS) measurements have been conducted on a new Insensitive Munitions (IM) formulation. IMX-101 is the first explosive to be fully IM qualified under the new NATO STANAG guidelines for fielded munitions. The formulation uses dinitroanisole (DNAN) as a new melt-cast material to replace TNT and shows excellent IM performance when formulated with other energetic ingredients. The scope of this work is to explain this superior IM performance by investigating the reactive processes occurring in the material when it is subjected to a well-controlled thermal environment. The dominant reactive processes observed were a series of complex chemical interactions between the three main ingredients (DNAN, NQ, and NTO) that occur well below the onset of the normal decomposition process of any of the individual ingredients. This process shifts the thermal response of the formulation to a much lower temperature, where the kinetically controlled reaction processes are much slower. The low-temperature shift allows the reactions to consume the reactive solids (NQ and NTO) well before the reaction rates increase and reach thermal runaway, resulting in a relatively benign response to external stimuli. The main findings on these interaction processes are presented.
Applied Optics
Abstract not provided.
A methodology for using the MELCOR code with the Latin Hypercube Sampling method was developed to estimate uncertainty in various predicted quantities, such as hydrogen generation or release of fission products, under severe accident conditions. In this case, the emphasis was on estimating the range of hydrogen sources in station blackout conditions in the Sequoyah Ice Condenser plant, taking into account uncertainties in the modeled physics known to affect hydrogen generation. The method uses user-specified likelihood distributions for uncertain model parameters, which may include uncertainties of a stochastic nature, to produce a collection of code calculations, or realizations, characterizing the range of possible outcomes. Forty MELCOR realizations of Sequoyah were conducted with 10 uncertain parameters, producing a range of in-vessel hydrogen quantities. The total hydrogen produced was approximately 583 kg ± 131 kg. Sensitivity analyses revealed the expected trends with respect to the parameters of greatest importance; however, considerable scatter was observed when results were plotted against any single uncertain parameter, with no parameter having a dominant effect on hydrogen generation. It is concluded that, for the physics parameters investigated, further reducing the predicted hydrogen uncertainty would require reducing all of the physics parameter uncertainties similarly, bearing in mind that some parameters are inherently uncertain within a range. It is suspected that some residual uncertainty associated with modeling complex, coupled, and synergistic phenomena is an inherent aspect of complex systems and cannot be reduced to point-value estimates. Probabilistic analyses such as the one demonstrated in this work are important for properly characterizing the response of complex systems such as severe accident progression in nuclear power plants.
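The sketch below shows the post-processing step in the simplest terms: summarizing an ensemble of realizations and ranking uncertain parameters by rank correlation with the response. The synthetic samples stand in for the 40 MELCOR realizations; the parameter names and values are assumptions for illustration.

```python
# Summarize an ensemble of realizations and screen parameter importance with
# Spearman rank correlations (synthetic stand-in data, not MELCOR output).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n_real, n_params = 40, 10
param_names = [f"theta_{i}" for i in range(n_params)]   # hypothetical uncertain inputs
X = rng.uniform(size=(n_real, n_params))                # LHS-like input samples (stand-in)
h2_kg = 583.0 + 131.0 * (X - 0.5) @ rng.normal(size=n_params) + rng.normal(0, 30, n_real)

print(f"in-vessel H2: mean = {h2_kg.mean():.0f} kg, std = {h2_kg.std(ddof=1):.0f} kg")

# Rank correlation of each parameter with the hydrogen response.
ranks = []
for j in range(n_params):
    rho, _ = spearmanr(X[:, j], h2_kg)
    ranks.append((abs(rho), param_names[j]))
for rho, name in sorted(ranks, reverse=True):
    print(f"{name}: |rho| = {rho:.2f}")
```

When no single parameter shows a dominant correlation, as reported above, the scatter in such plots reflects the combined influence of many comparably uncertain inputs.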
Abstract not provided.
Reconsolidated crushed salt is being considered as a backfill material to be placed upon nuclear waste in a salt repository environment. In-depth knowledge of the thermal and mechanical properties of the crushed salt as it reconsolidates is critical to thermal/mechanical modeling of the reconsolidation process. An experimental study was completed to quantitatively evaluate the thermal conductivity of reconsolidated crushed salt as a function of porosity and temperature. The crushed salt for this study came from the Waste Isolation Pilot Plant (WIPP). In this work, the thermal conductivity of crushed salt with porosity ranging from 1% to 40% was determined from room temperature up to 300°C using two different experimental methods. Thermal properties (including thermal conductivity, thermal diffusivity, and specific heat) of single-crystal salt were determined for the same temperature range. The salt was observed to dewater during heating, and the weight loss from the dewatering was quantified. The thermal conductivity of reconsolidated crushed salt decreases with increasing porosity; conversely, thermal conductivity increases as the salt consolidates. For a given porosity, the thermal conductivity of reconsolidated crushed salt decreases with increasing temperature. A simple mixture-theory model is presented and compared against the data developed in this study.
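To illustrate the general porosity dependence described above, the sketch below uses one common textbook mixture rule (a geometric mean of solid-salt and pore-gas conductivities). It is shown purely as an illustration with assumed property values; it is not the specific mixture-theory model presented in the report.

```python
# Generic geometric-mean mixture rule: effective conductivity vs. porosity.
import numpy as np

k_salt = 5.0     # assumed intact-salt conductivity near room temperature, W/(m*K)
k_gas = 0.026    # assumed pore-gas (air) conductivity, W/(m*K)

porosity = np.linspace(0.01, 0.40, 5)
k_eff = k_salt ** (1.0 - porosity) * k_gas ** porosity   # geometric-mean mixture

for phi, k in zip(porosity, k_eff):
    print(f"porosity {phi:4.2f}: k_eff ~ {k:4.2f} W/(m*K)")
```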
Abstract not provided.