Grid-based mesh generation methods have been available for many years and provide a reliable means of meshing arbitrary geometries with hexahedral elements. Their principal use has mostly been limited to biological-type models, where topology that may incorporate sharp edges and curve definitions is not critical. While these applications have been effective, robust generation of hexahedral meshes on mechanical models, where the topology is typically of prime importance, imposes difficulties that existing grid-based methods have not yet effectively addressed. This work introduces a set of procedures for resolving the features of a geometric model during grid-based hexahedral mesh generation for mechanical or otherwise topology-rich models.
This report summarizes the findings for phase one of the agent review and discusses the review methods and results. The phase one review identified a short list of agent systems that would prove most useful in the service architecture of an information management, analysis, and retrieval system. Reviewers evaluated open-source and commercial multi-agent systems and scored them based upon viability, uniqueness, ease of development, ease of deployment, and ease of integration with other products. Based on these criteria, reviewers identified the ten most appropriate systems. The report also mentions several systems that reviewers deemed noteworthy for the ideas they implement, even if those systems are not the best choices for information management purposes.
The first approach-to-critical experiment in the Seven Percent Critical Experiment series was recently completed at Sandia. It is part of the Seven Percent Critical Experiment program, which will provide new critical and reactor physics benchmarks for fuel enrichments greater than five weight percent. The inverse multiplication method was used to determine the state of the system during the course of the experiment; it indicated that the critical experiment went slightly supercritical with 1148 fuel elements in the fuel array. The experiment is described and its results are presented.
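The inverse multiplication method lends itself to a brief illustration. The sketch below (with hypothetical count rates and loadings, not values from this experiment) shows the standard 1/M extrapolation used to predict the critical loading during an approach-to-critical.

```python
# Illustrative sketch of the inverse multiplication (1/M) method: the inverse
# of the normalized detector count rate is plotted against fuel loading and
# extrapolated to 1/M = 0 to predict the critical loading. All numbers below
# are hypothetical and are not data from the experiment described above.
import numpy as np

fuel_elements = np.array([400, 600, 800, 1000, 1100])          # loading steps
count_rate = np.array([120.0, 180.0, 310.0, 820.0, 2900.0])    # counts/s

inverse_m = count_rate[0] / count_rate        # 1/M relative to the first step

# Extrapolate the last few points linearly to 1/M = 0; in practice the
# prediction is updated after every loading step.
slope, intercept = np.polyfit(fuel_elements[-3:], inverse_m[-3:], 1)
print(f"Predicted critical loading: {-intercept / slope:.0f} fuel elements")
```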
Mappings from a master element to the physical mesh element, in conjunction with local metrics such as those appearing in the Target-matrix paradigm, are used to measure quality at points within an element. The approach is applied to both linear and quadratic triangular elements; this enables, for example, one to measure quality within a quadratic finite element. Quality within an element may also be measured on a set of symmetry points, leading to so-called symmetry metrics. An important issue having to do with the labeling of the element vertices is relevant to mesh quality tools such as Verdict and Mesquite. Certain quality measures like area, volume, and shape should be label-invariant, while others such as aspect ratio and orientation should not. It is shown that local metrics whose Jacobian matrix is non-constant are label-invariant only at the center of the element, while symmetry metrics can be label-invariant anywhere within the element, provided the reference element is properly restricted.
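As a concrete (if simplified) illustration of measuring quality through the master-to-physical mapping, the sketch below evaluates a mean-ratio shape metric for a linear triangle from the Jacobian of that mapping. This is one common Target-matrix-style metric, not necessarily the exact formulation used here, and the choice of sample point only matters for higher-order elements, where the Jacobian varies within the element.

```python
# Minimal sketch (not the report's exact implementation): evaluate a
# Target-matrix-style shape metric for a linear triangle. T = A @ inv(W),
# where A maps the master element to the physical element and W maps the
# master element to an ideal (equilateral) reference element. The mean-ratio
# shape metric 2*det(T)/||T||_F^2 equals 1 for an ideally shaped element.
import numpy as np

def triangle_shape_quality(p0, p1, p2):
    A = np.column_stack((np.subtract(p1, p0), np.subtract(p2, p0)))  # physical Jacobian
    W = np.array([[1.0, 0.5],
                  [0.0, np.sqrt(3.0) / 2.0]])                        # equilateral reference
    T = A @ np.linalg.inv(W)
    det_T = np.linalg.det(T)
    if det_T <= 0.0:
        return 0.0   # inverted or degenerate element
    return 2.0 * det_T / np.sum(T * T)

# For a linear triangle the Jacobian is constant, so quality is the same at
# every sample point; for quadratic elements A (and hence T) varies with the
# point at which the metric is evaluated.
print(triangle_shape_quality((0, 0), (1, 0), (0.5, np.sqrt(3) / 2)))  # ~1.0
print(triangle_shape_quality((0, 0), (1, 0), (0.9, 0.1)))             # poorly shaped
```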
Nuclear nonproliferation efforts are supported by measurements that are capable of rapidly characterizing special nuclear materials (SNM). Neutron multiplicity counting is frequently used to estimate properties of SNM, including neutron source strength, multiplication, and generation time. Different classes of model have been used to estimate these and other properties from the measured neutron counting distribution and its statistics. This paper describes a technique to compute statistics of the neutron counting distribution using deterministic neutron transport models. This approach can be applied to rapidly and accurately analyze neutron multiplicity counting measurements.
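For context, the quantities typically extracted from the measured neutron counting distribution are its low-order (reduced) factorial moments, which multiplicity analysis relates to source strength, multiplication, and related properties. The sketch below computes those moments from a purely hypothetical counting distribution; the paper's contribution is computing such statistics from deterministic transport models rather than from measured data.

```python
# Hedged sketch: reduced factorial moments of a neutron multiplicity counting
# distribution P(n). Multiplicity analysis relates these moments (singles,
# doubles, triples) to properties of the SNM; the distribution below is
# purely illustrative, not measured data.
import numpy as np
from math import comb

p_n = np.array([0.60, 0.25, 0.10, 0.04, 0.01])  # hypothetical P(n), n = 0..4
n = np.arange(len(p_n))

def reduced_factorial_moment(p, r):
    """m_r = sum_n C(n, r) * P(n)."""
    return sum(comb(int(k), r) * pk for k, pk in zip(n, p))

singles = reduced_factorial_moment(p_n, 1)
doubles = reduced_factorial_moment(p_n, 2)
triples = reduced_factorial_moment(p_n, 3)
print(singles, doubles, triples)
```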
The Alternative Liquid Fuels Simulation Model (AltSim) is a high-level dynamic simulation model which calculates and compares the production and end use costs, greenhouse gas emissions, and energy balances of several alternative liquid transportation fuels. These fuels include: corn ethanol, cellulosic ethanol from various feedstocks (switchgrass, corn stover, forest residue, and farmed trees), biodiesel, and diesels derived from natural gas (gas to liquid, or GTL), coal (coal to liquid, or CTL), and coal with biomass (CBTL). AltSim allows for comprehensive sensitivity analyses on capital costs, operation and maintenance costs, renewable and fossil fuel feedstock costs, feedstock conversion ratio, financial assumptions, tax credits, CO₂ taxes, and plant capacity factor. This paper summarizes the structure and methodology of AltSim, presents results, and provides a detailed sensitivity analysis. The Energy Independence and Security Act (EISA) of 2007 sets a goal for the increased use of biofuels in the U.S., ultimately reaching 36 billion gallons by 2022. AltSim's base case assumes EPA projected feedstock costs in 2022 (EPA, 2009). For the base case assumptions, AltSim estimates per gallon production costs for the five ethanol feedstocks (corn, switchgrass, corn stover, forest residue, and farmed trees) of $1.86, $2.32, $2.45, $1.52, and $1.91, respectively. The projected production cost of biodiesel is $1.81/gallon. The estimates for CTL without biomass range from $1.36 to $2.22. With biomass, the estimated costs increase, ranging from $2.19 per gallon for the CTL option with 8% biomass to $2.79 per gallon for the CTL option with 30% biomass and carbon capture and sequestration. AltSim compares the greenhouse gas emissions (GHG) associated with both the production and consumption of the various fuels. EISA allows fuels emitting 20% less GHG than conventional gasoline and diesels to qualify as renewable fuels. This allows several of the CBTL options to be included under the EISA mandate. The estimated GHG emissions associated with the production of gasoline and diesel are 19.80 and 18.40 kg of CO₂ equivalent per MMBtu (kgCO₂e/MMBtu), respectively (NETL, 2008). The estimated emissions are significantly higher for several alternatives: ethanol from corn (70.6), GTL (51.9), and CTL without biomass or sequestration (123-161). Projected emissions for several other alternatives are lower; integrating biomass and sequestration in the CTL processes can even result in negative net emissions. For example, CTL with 30% biomass and 91.5% sequestration has estimated production emissions of -38 kgCO₂e/MMBtu. AltSim also estimates the projected well-to-wheel, or lifecycle, emissions from consuming each of the various fuels. Vehicles fueled with conventional diesel or gasoline and driven 12,500 miles per year emit 5.72-5.93 tons of CO₂ equivalents per year (tCO₂e/yr). Those emissions are significantly higher for vehicles fueled with 100% ethanol from corn (8.03 tCO₂e/yr) or diesel from CTL without sequestration (10.86 to 12.85 tCO₂/yr). Emissions could be significantly lower for vehicles fueled with diesel from CBTL with various shares of biomass. For example, for CTL with 30% biomass and carbon sequestration, emissions would be 2.21 tCO₂e per year, or just 39% of the emissions for a vehicle fueled with conventional diesel.
While the results presented above provide very specific estimates for each option, AltSim's true potential is as a tool for educating policy makers and for exploring 'what if' questions. For example, AltSim allows one to consider the effect of various levels of carbon taxes on the production cost estimates, as well as the increased annual costs to the end user. Other sections of AltSim allow the user to understand the implications of various policies in terms of costs to the government or land use requirements. AltSim's structure allows the end user to explore each of these alternatives and understand the sensitivity of the results to each assumption, as well as the implications for bottom-line economics, energy use, and greenhouse gas emissions.
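As a simple illustration of the kind of 'what if' calculation described, the sketch below adds a CO₂ tax to a per-gallon production cost; the structure is generic and the fuel parameters are hypothetical rather than AltSim inputs.

```python
# Illustrative "what if" calculation of the kind AltSim supports: the effect of
# a CO2 tax on per-gallon production cost. The structure is a sketch; the
# energy content, emissions, and tax values below are hypothetical.
def cost_with_carbon_tax(base_cost_per_gal, emissions_kg_per_mmbtu,
                         energy_mmbtu_per_gal, tax_per_tonne_co2e):
    tax_per_gal = (emissions_kg_per_mmbtu * energy_mmbtu_per_gal / 1000.0
                   * tax_per_tonne_co2e)
    return base_cost_per_gal + tax_per_gal

# Hypothetical fuel: $2.00/gal production cost, 70 kgCO2e/MMBtu,
# 0.12 MMBtu/gal, under a $50/tCO2e tax.
print(cost_with_carbon_tax(2.00, 70.0, 0.12, 50.0))  # ~$2.42/gal
```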
The objective of this work is to perform an uncertainty quantification (UQ) and model validation analysis of simulations of tests in the cross-wind test facility (XTF) at Sandia National Laboratories. In these tests, a calorimeter was subjected to a fire and the thermal response was measured via thermocouples. The UQ and validation analysis pertains to the experimental and predicted thermal response of the calorimeter. The calculations were performed using Sierra/Fuego/Syrinx/Calore, an Advanced Simulation and Computing (ASC) code capable of predicting object thermal response to a fire environment. Based on the validation results at eight diverse and representative thermocouple (TC) locations on the calorimeter, the predicted calorimeter temperatures effectively bound the experimental temperatures. This retrospectively validates Sandia's first integrated use of fire modeling with thermal response modeling and associated uncertainty estimates in an abnormal-thermal QMU analysis.
This report summarizes the strategy and preparations for the first phase of the pressurized water reactor (PWR) ignition experimental program. During this phase, a single full-length, prototypic 17×17 PWR fuel assembly will be used to simulate a severe loss-of-coolant accident in the spent fuel pool, whereby the fuel is completely uncovered and heats up until ignition of the cladding occurs. Electrically resistive heaters with zircaloy cladding will substitute for the spent nuclear fuel. The assembly will be placed in a single pool cell with the outer wall well insulated. This boundary condition will imitate the situation of an assembly surrounded by assemblies of similar offload age.
The relationship between explosive yield and seismic magnitude has been extensively studied for underground nuclear tests larger than about 1 kt. For monitoring smaller tests over local ranges (within 200 km), we need to know whether the available formulas can be extrapolated to much lower yields. Here, we review published information on amplitude decay with distance, and on the seismic magnitudes of industrial blasts and refraction explosions in the western U. S. Next we measure the magnitudes of some similar shots in the northeast. We find that local magnitudes ML of small, contained explosions are reasonably consistent with the magnitude-yield formulas developed for nuclear tests. These results are useful for estimating the detection performance of proposed local seismic networks.
In many parts of the United States, as well as other regions of the world, competing demands for fresh water or water suitable for desalination are outstripping sustainable supplies. In these areas, new water supplies are necessary to sustain economic development and agricultural uses, as well as to support expanding populations, particularly in the Southwestern United States. Increases in the supply of water will more than likely come through desalination of water reservoirs that are not suitable for present use. Surface-deployed seismic and electromagnetic (EM) methods have the potential for addressing these critical issues within large volumes of an aquifer at a lower cost than drilling and sampling. However, for detailed analysis of the water quality, some sampling utilizing boreholes would be required, with geophysical methods being employed to extrapolate these sampled results to non-sampled regions of the aquifer. The research in this report addresses using seismic and EM methods in two complementary ways to aid in the identification of water reservoirs that are suitable for desalination. The first method uses the seismic data to constrain the earth structure so that detailed EM modeling can estimate the pore water conductivity, and hence the salinity. The second method utilizes the coupling of seismic and EM waves through the seismo-electric effect (conversion of seismic energy to electrical energy) and the electro-seismic effect (conversion of electrical energy to seismic energy) to estimate the salinity of the target aquifer. Analytic 1D solutions to coupled pressure and electric wave propagation demonstrate the types of waves one expects when using a seismic or electric source. A 2D seismo-electric/electro-seismic model is developed to demonstrate the coupled seismic and EM system. For finite-difference modeling, the seismic and EM wave propagation algorithms are on different spatial and temporal scales. We present a method to solve multiple, coupled finite-difference physics problems that has application beyond the present use. A limited field experiment was conducted to assess the seismo-electric effect. Due to a variety of problems, the observation of the electric field due to a seismic source was not definitive.
Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in this context in the presence of limited data.
We investigate the potential for neutron generation using the 1 MeV RHEPP-1 intense pulsed ion beam facility at Sandia National Laboratories for a number of emerging applications. Among these is the interrogation of cargo for detection of special nuclear materials (SNM). Ions from single-stage sources driven by pulsed power represent a potential source of significant neutron bursts. While a number of applications require higher ion energies (e.g., tens of MeV) than that provided by RHEPP-1, its ability to generate deuterium beams allows for neutron generation at and below 1 MeV. This report details the successful generation and characterization of deuterium ion beams, and their use in generating up to 3 × 10¹⁰ neutrons into 4π per 5 kA ion pulse.
This paper has three goals. The first is to review Shannon's theory of information and the subsequent advances leading to today's statistics-based text analysis algorithms, showing that the semantics of the text is neglected. The second goal is to propose an extension of Shannon's original model that can take into account semantics, where the 'semantics' of a message is understood in terms of the intended or actual changes on the recipient of a message. The third goal is to propose several lines of research that naturally fall out of the proposed model. Each computational approach to solving some problem rests on an underlying model or set of models that describe how key phenomena in the real world are represented and how they are manipulated. These models are both liberating and constraining. They are liberating in that they suggest a path of development for new tools and algorithms. They are constraining in that they intentionally ignore other potential paths of development. Modern statistics-based text analysis algorithms have a specific intellectual history and set of underlying models rooted in Shannon's theory of communication. For Shannon, language is treated as a stochastic generator of symbol sequences. Shannon himself, subsequently Weaver, and at least one of his predecessors are all explicit in their decision to exclude semantics from their models. This rejection of semantics as 'irrelevant to the engineering problem' is elegant and, combined with developments particularly by Salton and subsequently by Latent Semantic Analysis, has led to a whole collection of powerful algorithms and an industry for data mining technologies. However, the kinds of problems currently facing us go beyond what can be accounted for by this stochastic model. Today's problems increasingly focus on the semantics of specific pieces of information. And although progress is being made with the old models, it seems natural to develop or extend information theory to account for semantics. By developing such theory, we can improve the quality of the next generation of analytical tools. Far from being a mere intellectual curiosity, a new theory can provide the means for us to take into account information that has to date been ignored by the algorithms and technologies we develop. This paper will begin with an examination of Shannon's theory of communication, discussing the contributions and the limitations of the theory and how that theory gets expanded into today's statistical text analysis algorithms. Next, we will expand Shannon's model, suggesting a transactional definition of semantics that focuses on the intended and actual changes that messages have on the recipient. Finally, we will examine implications of the model for algorithm development.
An initial version of a Systems Dynamics (SD) modeling framework was developed for the analysis of a broad range of energy technology and policy questions. The specific question selected to demonstrate this process was 'what would be the carbon and import implications of expanding nuclear electric capacity to provide power for plug-in hybrid vehicles?' Fifteen SNL SD energy models were reviewed, and the US Energy and Greenhouse gas model (USEGM) and the Global Nuclear Futures model (GEFM) were identified as the basis for an initial modeling framework. A basic U.S. Transportation model was created to model U.S. fleet changes. In the rapid adoption scenario, almost 40% of light-duty vehicles are PHEVs by 2040, which requires about 37 GWy/y of additional electricity demand, equivalent to about 25 new 1.4 GWe nuclear plants. The adoption rate of PHEVs would likely be the controlling factor in achieving the associated reduction in carbon emissions and imports.
The density of molten nitrate salts was measured to determine the effects of the constituents on the density of multi-component mixtures. The molten salts consisted of various proportions of the nitrates of potassium, sodium, lithium, and calcium. Density measurements were performed using an Archimedean method, and the results were compared to data reported in the literature for the individual constituent salts or simple combinations, such as the binary Solar Salt mixture of NaNO₃ and KNO₃. The addition of calcium nitrate generally increased density, relative to potassium nitrate or sodium nitrate, while lithium nitrate decreased density. The temperature dependence of density is described by a linear equation regardless of composition. The molar volume, and thereby the density, of multi-component mixtures can be calculated as a function of temperature using a linear additivity rule based on the properties of the individual constituents.
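A minimal sketch of the additivity rule is shown below: the mixture molar volume is the mole-fraction-weighted sum of pure-component molar volumes, each linear in temperature, and density follows from the mixture molar mass. The V = a + bT coefficients are placeholders, not the measured values from this work.

```python
# Sketch of the molar-volume additivity rule described above: the mixture molar
# volume is the mole-fraction-weighted sum of pure-component molar volumes,
# each linear in temperature, and density follows from the mixture molar mass.
# The (a, b) coefficients below are placeholders, not the measured values.
def mixture_density(mole_fractions, molar_masses, vm_coeffs, T):
    """mole_fractions; molar_masses (g/mol); vm_coeffs: list of (a, b) with
    V_i = a + b*T in cm^3/mol; T in deg C. Returns density in g/cm^3."""
    v_mix = sum(x * (a + b * T) for x, (a, b) in zip(mole_fractions, vm_coeffs))
    m_mix = sum(x * M for x, M in zip(mole_fractions, molar_masses))
    return m_mix / v_mix

# Hypothetical 60/40 (mol) NaNO3/KNO3 mixture at 400 deg C.
x = [0.6, 0.4]
M = [84.99, 101.10]                      # g/mol for NaNO3, KNO3
vm = [(42.0, 0.012), (53.0, 0.015)]      # placeholder (a, b) coefficients
print(mixture_density(x, M, vm, 400.0))
```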
This report documents the various photovoltaic (PV) performance models and software developed and utilized by researchers at Sandia National Laboratories (SNL) in support of the Photovoltaics and Grid Integration Department. In addition to PV performance models, hybrid system and battery storage models are discussed. A hybrid system using other distributed sources and energy storage can help reduce the variability inherent in PV generation, and due to the complexity of combining multiple generation sources and system loads, these models are invaluable for system design and optimization. Energy storage plays an important role in reducing PV intermittency, and battery storage models are used to understand the best configurations and technologies for storing PV-generated electricity. Models from other researchers that SNL uses are also discussed, including some widely known models that incorporate algorithms developed at SNL. Other models included in the discussion are not used by or adopted from SNL research but may provide some benefit to researchers working on PV array performance, hybrid system models, and energy storage. The paper is organized into three sections describing the different software models as applied to photovoltaic performance, hybrid systems, and battery storage. For each model, there is a description that includes where to find the model, whether it is currently maintained, and any references that may be available. Modeling improvements underway at SNL include quantifying the uncertainty of individual system components, quantifying the overall uncertainty in modeled vs. measured results, and modeling large PV systems. SNL is also conducting research into the overall reliability of PV systems.
In this paper we develop an aft-body loading function for penetration simulations that is based on the spherical cavity-expansion approximation. This loading function assumes that there is a preexisting cavity of radius a₀ before the expansion occurs. This causes the radial stress on the cavity surface to be less than what is obtained if the cavity is opened from a zero initial radius. This in turn causes less resistance on the aft body as it penetrates the target, which allows for greater rotation of the penetrator. Results from simulations are compared with experimental results for oblique penetration into a concrete target with an unconfined compressive strength of 23 MPa.
Biofouling of water-treatment membranes, the unwanted growth of biofilms on the membrane surface, negatively impacts desalination and water treatment. With biofouling there is a decrease in permeate production, degradation of permeate water quality, and an increase in energy expenditure due to the increased cross-flow pressure needed. To date, no universally successful and cost-effective method for controlling biofouling has been implemented. The overall goal of the work described in this report was to use high-performance computing to direct polymer, material, and biological research to create the next generation of water-treatment membranes. Both physical (micromixers - UV-curable epoxy traces printed on the surface of a water-treatment membrane that promote chaotic mixing) and chemical (quaternary ammonium groups) modifications of the membranes were evaluated for their ability to increase resistance to biofouling. Creation of low-cost, efficient water-treatment membranes helps assure the availability of fresh water for human use, a growing need in both the U.S. and the world.
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
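The basic series/parallel combination rules behind such a fault-tree calculation are easy to illustrate. The sketch below assumes independent component failures and uses hypothetical component reliabilities, not values from the fictitious device in the report.

```python
# Minimal sketch of deriving system reliability from component reliabilities
# with series/parallel (fault-tree OR/AND gate) combinations, assuming
# independent failures. Component reliabilities below are hypothetical.
def series(*r):
    """System works only if every component works (an OR gate on failures)."""
    out = 1.0
    for ri in r:
        out *= ri
    return out

def parallel(*r):
    """System works if any redundant component works (an AND gate on failures)."""
    out = 1.0
    for ri in r:
        out *= (1.0 - ri)
    return 1.0 - out

# Hypothetical inverter: two redundant cooling fans in parallel, in series
# with a DC link capacitor bank and a controller board.
fans = parallel(0.95, 0.95)
system = series(fans, 0.99, 0.985)
print(f"System reliability: {system:.4f}")
```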
A global partnership between nuclear energy supplier nations and user nations could enable the safe and secure expansion of nuclear power throughout the world. Although it is likely that supplier nations and their industries would be anxious to sell reactors and fuel services as part of this partnership, their commitment to close the fuel cycle (i.e., permanently take back fuel and high-level waste) remains unclear. At the 2007 Waste Management Symposia in Tucson, Arizona, USA, a distinguished international panel explored fuel take back and waste disposal from the perspective of current and prospective user nations. This paper reports on the findings of that panel and presents a path for policy makers to move forward with the partnership vision.
This report summarizes the Combinatorial Algebraic Topology: software, applications & algorithms workshop (CAT Workshop). The workshop was sponsored by the Computer Science Research Institute of Sandia National Laboratories. It was organized by CSRI staff members Scott Mitchell and Shawn Martin. It was held in Santa Fe, New Mexico, August 29-30. The CAT Workshop website has links to some of the talk slides and other information, http://www.cs.sandia.gov/CSRI/Workshops/2009/CAT/index.html. The purpose of the report is to summarize the discussions and recap the sessions. There is a special emphasis on technical areas that are ripe for further exploration, and the plans for follow-up amongst the workshop participants. The intended audiences are the workshop participants, other researchers in the area, and the workshop sponsors.
An electromagnetic analysis is performed on different first wall designs for the ITER device. The electromagnetic forces and torques present due to a plasma disruption event are calculated and compared for the different designs.
An electromagnetic analysis is performed on the ITER shield modules under different plasma disruption scenarios using the OPERA-3d software. The modeling procedure is explained, electromagnetic torques are presented, and results of the modeling are discussed.
We have performed molecular dynamics simulations of cascade damage in the gadolinium pyrochlore Gd₂Zr₂O₇, comparing results obtained from traditional methodologies that ignore the effect of electron-ion interactions with a 'two-temperature model' in which the electronic subsystem is modeled using a diffusion equation to determine the electronic temperature. We find that the electron-ion interaction friction coefficient γₚ is a significant parameter in determining the behavior of the system following the formation of the primary knock-on atom (here, a U³⁺ ion). The mean final U³⁺ displacement and the number of defect atoms formed are shown to decrease uniformly with increasing γₚ; however, other properties, such as the final equilibrium temperature and the oxygen-oxygen radial distribution function, show a more complicated dependence on γₚ.
This report documents progress in discovering new catalytic technologies that will support the development of advanced biofuels. The global shift from petroleum-based fuels to advanced biofuels will require transformational breakthroughs in biomass deconstruction technologies, because current methods are neither cost effective nor sufficiently efficient or robust for scalable production. Discovery and characterization of lignocellulolytic enzyme systems adapted to extreme environments will accelerate progress. Obvious extreme environments to mine for novel lignocellulolytic deconstruction technologies include aridland ecosystems (ALEs), such as those of the Sevilleta Long Term Ecological Research (LTER) site in central New Mexico (NM). ALEs represent at least 40% of the terrestrial biosphere and are classic extreme environments, with low nutrient availability, high ultraviolet radiation flux, limited and erratic precipitation, and extreme variation in temperatures. ALEs are functionally distinct from temperate environments in many respects; one salient distinction is that ALEs do not accumulate soil organic carbon (SOC), in marked contrast to temperate settings, which typically have large pools of SOC. Low-productivity ALEs do not accumulate carbon (C) primarily because of extraordinarily efficient extracellular enzyme activities (EEAs) that are derived from underlying communities of diverse, largely uncharacterized microbes. Such efficient enzyme activities presumably reflect adaptation to this low-productivity ecosystem, with the result that all available organic nutrients are assimilated rapidly. These communities are dominated by ascomycetous fungi, both in terms of abundance and contribution to ecosystem-scale metabolic processes, such as nitrogen and C cycling. To deliver novel, robust, efficient lignocellulolytic enzyme systems that will drive transformational advances in biomass deconstruction, we have: (1) secured an award through the Department of Energy (DOE) Joint Genome Institute (JGI) to perform metatranscriptomic functional profiling of eukaryotic microbial communities of blue grama grass (Bouteloua gracilis) rhizosphere (RHZ) soils and (2) isolated and provided initial genotypic and phenotypic characterization data for thermophilic fungi. Our preliminary results show that many strains in our collection of thermophilic fungi frequently outperform industry standards in key assays; we have also demonstrated that this collection is taxonomically diverse and phenotypically compelling. The studies summarized here are being performed in collaboration with the University of New Mexico and are based at the Sevilleta LTER research site.
This report describes work performed from October 2007 through September 2009 under the Sandia Laboratory Directed Research and Development project titled 'Reduced Order Modeling of Fluid/Structure Interaction.' This project addresses fundamental aspects of techniques for construction of predictive Reduced Order Models (ROMs). A ROM is defined as a model, derived from a sequence of high-fidelity simulations, that preserves the essential physics and predictive capability of the original simulations but at a much lower computational cost. Techniques are developed for construction of provably stable linear Galerkin projection ROMs for compressible fluid flow, including a method for enforcing boundary conditions that preserves numerical stability. A convergence proof and error estimates are given for this class of ROM, and the method is demonstrated on a series of model problems. A reduced order method, based on the method of quadratic components, for solving the von Karman nonlinear plate equations is developed and tested. This method is applied to the problem of nonlinear limit cycle oscillations encountered when the plate interacts with an adjacent supersonic flow. A stability-preserving method for coupling the linear fluid ROM with the structural dynamics model for the elastic plate is constructed and tested. Methods for constructing efficient ROMs for nonlinear fluid equations are developed and tested on a one-dimensional convection-diffusion-reaction equation. These methods are combined with a symmetrization approach to construct a ROM technique for application to the compressible Navier-Stokes equations.
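The core reduce-from-snapshots idea behind a Galerkin projection ROM can be sketched for a generic linear system du/dt = Au: collect high-fidelity snapshots, extract a POD basis, project the operator, and integrate the small system. This is only an illustration; the report's stability-preserving formulation for compressible flow and its boundary-condition treatment involve additional structure not shown here.

```python
# Minimal POD/Galerkin projection sketch for a generic linear system
# du/dt = A u: build a reduced basis from high-fidelity snapshots, project the
# operator, and integrate the small system. This illustrates the ROM idea only,
# not the report's stability-preserving compressible-flow formulation.
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 5                                   # full and reduced dimensions
A = -np.diag(np.linspace(0.1, 2.0, n))          # stable full-order operator
u0 = rng.standard_normal(n)

# Collect snapshots from the full model (forward Euler for brevity).
dt, steps = 0.01, 400
snapshots = [u0.copy()]
u = u0.copy()
for _ in range(steps):
    u = u + dt * (A @ u)
    snapshots.append(u.copy())
X = np.column_stack(snapshots)

# POD basis = leading left singular vectors of the snapshot matrix.
Phi, _, _ = np.linalg.svd(X, full_matrices=False)
Phi = Phi[:, :k]

# Galerkin projection: reduced operator and initial condition.
A_r = Phi.T @ A @ Phi
a = Phi.T @ u0
for _ in range(steps):
    a = a + dt * (A_r @ a)

print("ROM error:", np.linalg.norm(Phi @ a - u) / np.linalg.norm(u))
```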
Red teams that address complex systems have rarely taken advantage of Modeling and Simulation (M&S) in a way that reproduces most or all of a red-blue team exchange within a computer. Chess programs, starting with IBM's Deep Blue, outperform humans in that red-blue interaction, so why shouldn't we think computers can outperform traditional red teams now or in the future? This and future position papers will explore possible ways to use M&S to augment or replace traditional red teams in some situations, the features Red Team M&S should possess, how one might connect live and simulated red teams, and existing tools in this domain.
Group 12 metal cyclam complexes and their derivatives, as well as (octyl)₂Sn(OMe)₂, were examined as potential catalysts for the production of dimethyl carbonate (DMC) using CO₂ and methanol. The zinc cyclams will readily take up carbon dioxide and methanol at room temperature and atmospheric pressure to give the metal methyl carbonate. The tin complex exhibited an improvement in DMC yields. Studies involving the reaction of bis-phosphino- and (phosphino)(silyl)-amido group 2 and 12 complexes with CO₂ and CS₂ were performed. Notable results include formation of phosphino-substituted isocyanates, fixation of three moles of CO₂ in an unprecedented [N(CO₂)₃]³⁻ anion, and rapid splitting of CS₂ by main group elements under extremely mild conditions. Similar investigations of divalent group 14 silyl amides led to room temperature splitting of CO₂ into CO and metal oxide clusters, and the formation of isocyanates and carbodiimides.
This late-start RTBF project began the development of barium titanate (BTO)/glass nanocomposite capacitors for future and emerging energy storage applications. The long-term goal of this work is to decrease the size, weight, and cost of ceramic capacitors while increasing their reliability. Ceramic-based nanocomposites have the potential to yield materials with enhanced permittivity, breakdown strength (BDS), and reduced strain, which can increase the energy density of capacitors and increase their shot life. Composites of BTO in glass will limit grain growth during device fabrication (preserving nanoparticle grain size and enhanced properties), resulting in devices with improved density, permittivity, BDS, and shot life. BTO will eliminate the issues associated with Pb toxicity and volatility, as well as the variation in energy storage vs. temperature of PZT-based devices. During the last six months of FY09 this work focused on developing syntheses for BTO nanoparticles and firing profiles for sintering BTO/glass composite capacitors.
Bioweapons and emerging infectious diseases pose formidable and growing threats to our national security. Rapid advances in biotechnology and the increasing efficiency of global transportation networks virtually guarantee that the United States will face potentially devastating infectious disease outbreaks caused by novel ('unknown') pathogens either intentionally or accidentally introduced into the population. Unfortunately, our nation's biodefense and public health infrastructure is primarily designed to handle previously characterized ('known') pathogens. While modern DNA assays can identify known pathogens quickly, identifying unknown pathogens currently depends upon slow, classical microbiological methods of isolation and culture that can take weeks to produce actionable information. In many scenarios that delay would be costly, in terms of casualties and economic damage; indeed, it can mean the difference between a manageable public health incident and a full-blown epidemic. To close this gap in our nation's biodefense capability, we will develop, validate, and optimize a system to extract nucleic acids from unknown pathogens present in clinical samples drawn from infected patients. This system will extract nucleic acids from a clinical sample and amplify pathogen and specific host-response nucleic acid sequences. These sequences will then be suitable for ultra-high-throughput sequencing (UHTS) carried out by a third party. The data generated from UHTS will then be processed through a new data assimilation and bioinformatic analysis pipeline that will allow us to characterize an unknown pathogen in hours to days instead of weeks to months. Our methods will require no a priori knowledge of the pathogen, and no isolation or culturing; therefore they will circumvent many of the major roadblocks confronting a clinical microbiologist or virologist when presented with an unknown or engineered pathogen.
Energy production is inextricably linked to national security and poses the danger of altering the environment in potentially catastrophic ways. There is no greater problem than sustainable energy production. Our purpose was to attack this problem by examining the processes, technology, and science needed for recycling CO₂ back into transportation fuels. This approach can be thought of as 'bio-inspired', as nature employs the same basic inputs, CO₂/energy/water, to produce biomass. We addressed two key deficiencies apparent in current efforts. First, a detailed process analysis comparing the potential for chemical and conventional engineering methods to provide a route for the conversion of CO₂ and water to fuel has been completed. No 'showstoppers' are apparent in the synthetic route. Opportunities to improve current processes have also been identified and examined. Second, we have specifically addressed the fundamental science of the direct production of methanol from CO₂ using H₂ as a reductant.
The NUclear EVacuation Analysis Code (NUEVAC) has been developed by Sandia National Laboratories to support the analysis of shelter-evacuate (S-E) strategies following an urban nuclear detonation. This tool can model a range of behaviors, including complex evacuation timing and path selection, as well as various sheltering or mixed evacuation and sheltering strategies. The calculations are based on externally generated, high resolution fallout deposition and plume data. Scenario setup and calculation outputs make extensive use of graphics and interactive features. This software is designed primarily to produce quantitative evaluations of nuclear detonation response options. However, the outputs have also proven useful in the communication of technical insights concerning shelter-evacuate tradeoffs to urban planning or response personnel.
Previous studies in the nuclear weapons complex have shown that ambiguous work instructions (WIs) and operating procedures (OPs) can lead to human error, which is a major cause for concern. This report outlines some of the sources of ambiguity in written English and describes three recommendations for reducing ambiguity in WIs and OPs. The recommendations are based on commonly used research techniques in the fields of linguistics and cognitive psychology. The first recommendation is to gather empirical data that can be used to improve the recommended word lists that are provided to technical writers. The second recommendation is to institute a review in which new WIs and OPs are checked for ambiguity and clarity. The third recommendation is to use self-paced reading time studies to identify any remaining ambiguities before the new WIs and OPs are put into use. If these three steps are followed for new WIs and OPs, the likelihood of human errors related to ambiguity could be greatly reduced.
This report describes the results of a small experimental study that investigated potential sources of ambiguity in written work instructions (WIs). The English language can be highly ambiguous because words with different meanings can share the same spelling. Previous studies in the nuclear weapons complex have shown that ambiguous WIs can lead to human error, which is a major cause for concern. To study possible sources of ambiguity in WIs, we determined which of the recommended action verbs in the DOE and BWXT writer's manuals have numerous meanings to their intended audience, making them potentially ambiguous. We used cognitive psychology techniques to conduct a survey in which technicians who use WIs in their jobs indicated the first meaning that came to mind for each of the words. Although the findings of this study are limited by the small number of respondents, we identified words that had many different meanings even within this limited sample. WI writers should pay particular attention to these words and to their most frequent meanings so that they can avoid ambiguity in their writing.
Fast electrical energy storage, or Voltage-Driven Technology (VDT), has dominated fast, high-voltage pulsed power systems for the past six decades. Fast magnetic energy storage, or Current-Driven Technology (CDT), is characterized by 10,000× higher energy density than VDT and has a great number of other substantial advantages, but it has been all but neglected for all of these decades. The uniform explanation for the neglect of CDT is invariably that the industry has never been able to make an effective opening switch, which is essential for the use of CDT. Most approaches to opening switches have involved plasma of one sort or another. On a large scale, gaseous plasmas have been used as a conductor to bridge the switch electrodes, providing an opening function when the current wave front propagates through to the output end of the plasma and fully magnetizes the plasma; this is called a Plasma Opening Switch (POS). Opening can be triggered in a POS using a magnetic field to push the plasma out of the A-K gap; this is called a Magnetically Controlled Plasma Opening Switch (MCPOS). On a small scale, depletion of electron plasmas in semiconductor devices is used to effect opening switch behavior, but these devices are relatively low voltage and low current compared to the hundreds of kilovolts and tens of kiloamperes of interest to pulsed power. This work is an investigation into an entirely new approach to opening switch technology that utilizes new materials in new ways. The new materials are ferroelectrics, and using them as an opening switch stands in stark contrast to their traditional applications in optics and transducers. Emphasis is on the use of high-performance ferroelectrics, with the objective of developing an opening switch that would be suitable for large-scale pulsed power applications. Over the course of exploring this new ground, we have discovered heretofore unknown behaviors and properties of these materials. Some of these unexpected discoveries have led to new research directions to address challenges.
This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). Note that this includes support for most popular parallel and serial computers. (2) Improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques. (3) Device models which are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). (4) Object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed. As a result, Xyce is a unique electrical simulation capability, designed to meet the unique needs of the laboratory.
This document is a reference guide to the Xyce Parallel Electronic Simulator and is a companion document to the Xyce Users’ Guide. The focus of this document is to provide, to the extent possible, an exhaustive listing of device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users’ Guide.
The purpose of this project is to develop multi-layered co-extrusion (MLCE) capabilities at Sandia National Laboratories to produce multifunctional polymeric structures. Multi-layered structures containing layers of alternating electrical, mechanical, optical, or structural properties can be applied to a variety of potential applications, including energy storage, optics, sensors, and mechanical and barrier applications relevant to the internal and external community. To obtain the desired properties, fillers that are much smaller than the final layer thickness must be added to the polymer materials. We developed two filled polymer systems, one for conductive layers and one for dielectric layers, and demonstrated the potential for using MLCE to manufacture capacitors. We also developed numerical models to help determine the material and processing parameters that impact processing and layer stability.
Nobel Prize winner Richard Smalley was an avid champion for the cause of energy research. Calling it 'the single most important problem facing humanity today,' Smalley promoted the development of nanotechnology as a means to harness solar energy. Using nanotechnology to create solar fuels (i.e., fuels created from sunlight, CO₂, and water) is an especially intriguing idea, as it impacts not only energy production and storage, but also climate change. Solar irradiation is the only sustainable energy source of a magnitude sufficient to meet projections for global energy demand. Biofuels meet the definition of a solar fuel. Unfortunately, the efficiency of photosynthesis will need to be improved by an estimated factor of ten before biofuels can fully replace fossil fuels. Additionally, biological organisms produce an array of hydrocarbon products requiring further processing before they are usable for most applications. Alternatively, one can envision 'bio-inspired' nanostructured photocatalytic devices that efficiently harvest sunlight and use that energy to reduce CO₂ into a single useful product or chemical intermediate. Of course, producing such a device is very challenging, as it must be robust and multifunctional, i.e., capable of promoting and coupling the multi-electron, multi-photon water oxidation and CO₂ reduction processes. Herein, we summarize some of the recent and most significant work towards creating light-harvesting nanodevices, based on key functionalities inspired by nature, that reduce CO₂ to CO (a key chemical intermediate). We report the growth of Co(III)TPPCl nanofibers (20-100 nm in diameter) on gas diffusion layers via an evaporation-induced self-assembly (EISA) method. Remarkably, the as-fabricated electrodes demonstrate light-enhanced activity for CO₂ reduction to CO, as evidenced by cyclic voltammograms and electrolysis with and without light irradiation. To the best of our knowledge, this is the first observation of such a light-enhanced CO₂ reduction reaction based on nanostructured cobalt(III) porphyrin catalysts. Additionally, gas chromatography (GC) verifies that light irradiation can improve CO production by up to 31.3% during 2 hours of electrolysis. A variety of novel porphyrin nano- and micro-structures were also prepared, including nanospheres, nanotubes, and micro-crosses.
We have used Matlab and Google Earth to construct a prototype application for modeling the performance of local seismic networks for monitoring small, contained explosions. Published equations based on refraction experiments provide estimates of peak ground velocities as a function of event distance and charge weight. Matlab routines implement these relations to calculate the amplitudes across a network of stations from sources distributed over a geographic grid. The amplitudes are then compared to ambient noise levels at the stations, and scaled to determine the smallest yield that could be detected at each source location by a specified minimum number of stations. We use Google Earth as the primary user interface, both for positioning the stations of a hypothetical local network, and for displaying the resulting detection threshold contours.
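A rough sketch of that threshold calculation is given below. It assumes an attenuation relation of the form v = c·W^a·R^(-b) for peak ground velocity; the coefficients, station coordinates, and noise levels are placeholders rather than the published refraction-experiment values, and the Google Earth display step is omitted.

```python
# Sketch of the detection-threshold calculation described above. An assumed
# attenuation relation v = c * W**a * R**(-b) (coefficients are placeholders,
# not the published refraction-experiment values) gives peak ground velocity at
# each station; the smallest charge weight W detectable by at least n_min
# stations (signal above station noise) is found for each grid point.
import numpy as np

stations = np.array([[0.0, 0.0], [30.0, 10.0], [-20.0, 25.0], [15.0, -35.0]])  # km
noise = np.array([1e-7, 2e-7, 1e-7, 3e-7])          # hypothetical noise levels (m/s)
a, b, c = 0.8, 1.4, 1e-6                             # placeholder attenuation params
n_min = 3                                            # stations required for detection

def min_detectable_yield(source_xy):
    dist = np.linalg.norm(stations - source_xy, axis=1)
    dist = np.maximum(dist, 1.0)                     # avoid the singularity at R = 0
    # Invert v = c * W**a * R**(-b) for the yield that just exceeds each noise level.
    w_needed = (noise * dist**b / c) ** (1.0 / a)
    return np.sort(w_needed)[n_min - 1]              # n_min-th smallest station yield

# Evaluate over a coarse geographic grid (km offsets from a reference point).
xs = ys = np.linspace(-50, 50, 11)
threshold = np.array([[min_detectable_yield(np.array([x, y])) for x in xs] for y in ys])
print(threshold.round(2))
```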
This report summarizes existing statistical engines in VTK/Titan and presents both the serial and parallel k-means statistics engines. It is a sequel to [PT08], [BPRT09], and [PT09], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, and contingency engines. The ease of use of the new parallel k-means engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the k-means engine.
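The reason k-means scales well in parallel is that each data partition can accumulate per-cluster sums and counts locally, with a single global reduction per iteration to update the centroids. The sketch below illustrates that pattern generically; it is not the VTK/Titan C++ API.

```python
# Generic sketch (not the VTK/Titan C++ API) of why k-means scales in parallel:
# each data partition accumulates per-cluster sums and counts locally, and a
# single global reduction updates the centroids each iteration.
import numpy as np

def kmeans_parallel(partitions, k, iters=10, seed=0):
    rng = np.random.default_rng(seed)
    centroids = rng.standard_normal((k, partitions[0].shape[1]))
    for _ in range(iters):
        sums = np.zeros_like(centroids)
        counts = np.zeros(k)
        for part in partitions:                      # each "process" works locally
            labels = np.argmin(((part[:, None, :] - centroids) ** 2).sum(-1), axis=1)
            for j in range(k):
                sums[j] += part[labels == j].sum(axis=0)
                counts[j] += np.count_nonzero(labels == j)
        nonempty = counts > 0                        # global reduction + update
        centroids[nonempty] = sums[nonempty] / counts[nonempty].reshape(-1, 1)
    return centroids

data = np.random.default_rng(1).standard_normal((1200, 2)) + np.repeat(
    np.array([[0, 0], [5, 5], [-5, 5]]), 400, axis=0)
print(kmeans_parallel(np.array_split(data, 4), k=3))
```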
The Proton-21 Laboratory in Ukraine has published results on shock-induced transmutation of several elements, including cobalt-60, into non-radioactive elements. This report documents exploratory characterization of a shock-compressed aluminum-6061 sample, which is the only available surrogate for the high-purity copper samples in the Proton-21 experiments. The goal was to determine Sandia's ability to detect possible shock-wave-induced transmutation products and to unambiguously validate or invalidate the claims in collaboration with the Proton-21 Laboratory. We have developed a suitable characterization process and tested it on the surrogate sample. Using trace elemental analysis capabilities, we found elevated and localized concentrations of impurity elements similar to those the Proton-21 Laboratory reports. All our results, however, are consistent with the ejection of impurities that were not in solution in our alloy or that were deposited from the cathode during irradiation or possibly storage. Based on the detection capabilities demonstrated and additional techniques available, we are positioned to test samples from Proton-21 if funded to do so.
We report the results of an LDRD effort to investigate new technologies for the identification of small (mm to cm) debris in low-earth orbit. This small yet energetic debris presents a threat to the integrity of space assets worldwide and represents a significant security challenge to the international community. We present a non-exhaustive review of recent US and Russian efforts to meet the challenges of debris identification and removal, and then provide a detailed description of joint US-Russian plans for sensitive, laser-based imaging of small debris at distances of hundreds of kilometers and relative velocities of several kilometers per second. Plans for the upcoming experimental testing of these imaging schemes are presented, and a preliminary path toward system integration is identified.
This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
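A minimal sketch of the Bayesian side for go/no-go data alone is given below: each component reliability gets a Beta posterior (a uniform Beta(1,1) prior is assumed), and Monte Carlo samples are propagated through a series/parallel structure to yield a credible interval on system reliability. The test counts and system structure are hypothetical, and the paper's handling of variables data is not shown.

```python
# Hedged sketch of the Bayesian approach for go/no-go data only: each
# component's reliability gets a Beta posterior (uniform Beta(1,1) prior
# assumed), and Monte Carlo samples are propagated through the series/parallel
# structure to give a credible interval on system reliability. Test counts and
# the system layout are hypothetical; variables data are not treated here.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# (successes, trials) for three components; component C has zero failures in
# few tests, the sensitivity case noted above.
tests = {"A": (48, 50), "B": (29, 30), "C": (10, 10)}
post = {name: rng.beta(1 + s, 1 + (n - s), N) for name, (s, n) in tests.items()}

# System: A in series with the parallel pair (B, C).
parallel_bc = 1.0 - (1.0 - post["B"]) * (1.0 - post["C"])
system = post["A"] * parallel_bc

print("posterior mean:", system.mean())
print("90% credible interval:", np.quantile(system, [0.05, 0.95]))
```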
I used supramolecular self-assembling cyanine and the polyamine spermine binding to Escherichia coli genomic DNA as a model for DNA collapse during high-throughput screening. Polyamine binding to DNA converts the normally right-handed B-DNA into the left-handed Z-DNA conformation. Polyamine binding to DNA was inhibited by the supramolecular self-assembling cyanine. Self-assembly of cyanine upon the DNA scaffold was likewise competitively inhibited by spermine, as signaled by fluorescence quench from the DNA-cyanine ensemble. The sequence of DNA exposure to cyanine or spermine was critical in determining the magnitude of fluorescence quench. Methanol potentiated spermine inhibition by >10-fold. The IC₅₀ for spermine inhibition was 0.35 ± 0.03 µM and the association constant Ka was 2.86 × 10⁻⁶ M. Reversibility of the DNA-polyamine interactions was evident from quench mitigation at higher concentrations of cyanine. System flexibility was demonstrated by similar spermine interactions with λDNA. The choices and rationale regarding the polyamine and the cyanine dye, as well as the remarkable effects of methanol, are discussed in detail. Cyanine might be a safer alternative to the mutagenic toxin ethidium bromide for investigating DNA-drug interactions. The combined actions of polyamines and alcohols mediate DNA collapse, producing hybrid bio-nanomaterials with novel signaling properties that might be useful in biosensor applications. Finally, this work will be submitted to Analytical Sciences (Japan) for publication. This journal published our earlier, related work on cyanine supramolecular self-assembly upon a variety of nucleic acid scaffolds.
This report describes a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to identify socially situated relationships between individuals which, though subtle, are highly influential. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized. This report outlines the philosophical antecedents of SLNA, the mechanics of preprocessing, processing, and post-processing stages, and some example results obtained by applying this approach to a 15-month corporate discussion archive.
Staggered bioterrorist attacks with aerosolized pathogens on population centers present a formidable challenge to resource allocation and response planning. Response and planning must commence immediately after detection of the first attack, with little or no information about the second attack. In this report, we outline a method by which resource allocation may be performed. It involves probabilistic reconstruction of the bioterrorist attack from partial observations of the outbreak, followed by an optimization-under-uncertainty approach to perform resource allocation. We consider both single-site and time-staggered multi-site attacks (i.e., a reload scenario) under conditions when resources (personnel and equipment that are difficult to gather and transport) are insufficient. Both communicable (plague) and non-communicable (anthrax) diseases are addressed, and we also consider cases when the data, the time series of people reporting with symptoms, are confounded by a reporting delay. We demonstrate how our approach develops allocation profiles that have the potential to reduce the probability of an extremely adverse outcome in exchange for a more certain, but less adverse, outcome. We explore the effect of placing limits on daily allocations. Further, since our method is data-driven, the resource allocation progressively improves as more data become available.
Understanding charge transport processes at a molecular level using computational techniques is currently hindered by a lack of appropriate models for incorporating anisotropic electric fields in molecular dynamics (MD) simulations. An important technological example is ion transport through solid-electrolyte interphase (SEI) layers that form in many common types of batteries. These layers regulate the rate at which electro-chemical reactions occur, affecting power, safety, and reliability. In this work, we develop a model for incorporating electric fields in MD using an atomistic-to-continuum framework. This framework provides the mathematical and algorithmic infrastructure to couple finite element (FE) representations of continuous data with atomic data. In this application, the electric potential is represented on an FE mesh and is calculated from a Poisson equation with source terms determined by the distribution of the atomic charges. Boundary conditions can be imposed naturally using the FE description of the potential, which then propagates to each atom through modified forces. The method is verified using simulations where analytical or theoretical solutions are known. Calculations of salt water solutions in complex domains are performed to understand how ions are attracted to charged surfaces in the presence of electric fields and interfering media.
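A one-dimensional finite-difference analogue of that coupling is sketched below: atomic charges are deposited onto a mesh, a Poisson equation for the potential is solved with imposed boundary values, and the resulting field is interpolated back to the atoms as an added force. The actual framework uses a finite element representation in three dimensions; positions, charges, and units here are illustrative only.

```python
# Hedged 1D finite-difference analogue of the coupling described above: atomic
# charges are deposited onto a mesh, a Poisson equation for the potential is
# solved with imposed boundary values, and the resulting field is interpolated
# back to each atom as an added force. Units and values are illustrative only.
import numpy as np

L, n = 10.0, 101
x = np.linspace(0.0, L, n)
h = x[1] - x[0]

# Hypothetical atoms: positions and charges (e.g., a few dissolved ions).
atom_x = np.array([2.3, 4.7, 7.1])
atom_q = np.array([+1.0, -1.0, +1.0])

# Deposit charges onto the mesh with linear (cloud-in-cell) weights.
rho = np.zeros(n)
idx = np.minimum((atom_x / h).astype(int), n - 2)
w = atom_x / h - idx
np.add.at(rho, idx,     atom_q * (1.0 - w) / h)
np.add.at(rho, idx + 1, atom_q * w / h)

# Solve -phi'' = rho with Dirichlet boundary values (e.g., charged surfaces).
phi = np.zeros(n)
phi[0], phi[-1] = 0.0, 1.0
A = (np.diag(np.full(n - 2, 2.0))
     + np.diag(np.full(n - 3, -1.0), 1)
     + np.diag(np.full(n - 3, -1.0), -1))
b = rho[1:-1] * h * h
b[0] += phi[0]
b[-1] += phi[-1]
phi[1:-1] = np.linalg.solve(A, b)

# Electric field E = -dphi/dx, interpolated back to the atoms; force = q * E.
E = -np.gradient(phi, x)
atom_force = atom_q * np.interp(atom_x, x, E)
print(atom_force)
```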
Fiber-optic gas-phase surface plasmon resonance (SPR) detection of several contaminant gases of interest to state-of-health monitoring in high-consequence sealed systems has been demonstrated. These contaminant gases include H₂, H₂S, and moisture, detected using a single-ended optical fiber mode. Data demonstrate that results can be obtained, and that sensitivity is adequate, in a dosimetric mode that allows periodic monitoring of system atmospheres. Modeling studies were performed to direct the design of the sensor probe for optimized dimensions and to allow simultaneous monitoring of several constituents with a single sensor fiber. Testing of the system demonstrates the ability to detect 70 mTorr partial pressures of H₂ using this technique and <280 µTorr partial pressures of H₂S. In addition, a multiple-sensor fiber has been demonstrated that allows a single fiber to measure H₂, H₂S, and H₂O without changing the fiber or the analytical system.
Working with leading experts in the fields of cognitive neuroscience and computational intelligence, SNL has developed a computational architecture that represents neurocognitive mechanisms associated with how humans remember experiences in their past. The architecture represents how knowledge is organized and updated through information from individual experiences (episodes) via the cortical-hippocampal declarative memory system. We compared the simulated behavioral characteristics with those of humans measured under well-established experimental standards, controlling for unmodeled aspects of human processing, such as perception. We used this knowledge to create robust simulations of human memory behaviors that should help move the scientific community closer to understanding how humans remember information. These behaviors were experimentally validated against actual human subjects, and the results were published. An important outcome of the validation process will be the joining of specific experimental testing procedures from the field of neuroscience with computational representations from the field of cognitive modeling and simulation.
The kinetic Monte Carlo method and its variants are powerful tools for modeling materials at the mesoscale, meaning at length and time scales in between the atomic and continuum. We have completed a 3 year LDRD project with the goal of developing a parallel kinetic Monte Carlo capability and applying it to materials modeling problems of interest to Sandia. In this report we give an overview of the methods and algorithms developed, and describe our new open-source code called SPPARKS, for Stochastic Parallel PARticle Kinetic Simulator. We also highlight the development of several Monte Carlo models in SPPARKS for specific materials modeling applications, including grain growth, bubble formation, diffusion in nanoporous materials, defect formation in erbium hydrides, and surface growth and evolution.
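For readers unfamiliar with the method, the core of a rejection-free kinetic Monte Carlo step is selecting the next event with probability proportional to its rate and advancing time by an exponentially distributed increment. The sketch below shows that serial loop with hypothetical event rates; it is not the parallel algorithm implemented in SPPARKS.

```python
# Minimal serial rejection-free kinetic Monte Carlo (Gillespie/BKL-style) loop,
# illustrating the core event-selection and time-advance step. This is not the
# parallel algorithm implemented in SPPARKS; the event rates are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
rates = {"adsorb": 2.0, "desorb": 0.5, "hop": 5.0}   # events and rates (1/s)
counts = {name: 0 for name in rates}

t, t_end = 0.0, 100.0
names = list(rates)
r = np.array([rates[name] for name in names])
r_total = r.sum()
cumulative = np.cumsum(r) / r_total

while t < t_end:
    # Advance time by an exponentially distributed increment.
    t += rng.exponential(1.0 / r_total)
    # Pick the next event with probability proportional to its rate.
    event = names[np.searchsorted(cumulative, rng.random())]
    counts[event] += 1

print(counts)   # event frequencies should be roughly proportional to the rates
```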
A number of codes have been developed in the past for safeguards analysis, but many are dated, and no single code is able to cover all aspects of materials accountancy, process monitoring, and diversion scenario analysis. The purpose of this work was to integrate a transient solvent extraction simulation module developed at Oak Ridge National Laboratory with the Separations and Safeguards Performance Model (SSPM), developed at Sandia National Laboratories, as a first step toward creating a more versatile design and evaluation tool. The SSPM was designed for materials accountancy and process monitoring analyses, but previous versions of the code have included limited detail on the chemical processes, including chemical separations. The transient solvent extraction model is based on the ORNL SEPHIS code approach to consider solute build-up in a bank of contactors in the PUREX process. Combined, these capabilities yield a more robust transient separations and safeguards model for evaluating safeguards system design. This coupling and initial results are presented. In addition, some observations toward further enhancement of separations and safeguards modeling based on this effort are provided, including items to be addressed in integrating legacy codes, additional improvements needed for a fully functional solvent extraction module, and recommendations for future integration of other chemical process modules.
This highly interdisciplinary team has developed dual-color total internal reflection fluorescence microscopy (TIRF-M) methods that enable us to optically detect and track, in real time, protein migration and clustering at membrane interfaces. By coupling TIRF-M with advanced analysis techniques (image correlation spectroscopy, single particle tracking), we have captured subtle changes in membrane organization that characterize immune responses. We have used this approach to elucidate the initial stages of cell activation in the IgE signaling network of mast cells and the Toll-like receptor (TLR-4) response in macrophages stimulated by bacteria. To help interpret these measurements, we have undertaken a computational modeling effort to connect the protein motion and lipid interactions. This work provides a deeper understanding of the initial stages of cellular response to external agents, including the dynamics of interaction of key components in the signaling network at the 'immunological synapse,' the contact region between the cell and its adversary.