This paper presents experimental results for two fuel-related topics in a diesel engine: (1) how fuel volatility affects the premixed burn and heat release rate, and (2) how ignition quality influences soot formation. Fast evaporation of fuel may lead to more intense heat release if a higher percentage of the fuel is mixed with air to form a combustible mixture. However, if the evaporation of fuel is driven by mixing with high-temperature gases from the ambient, a high-volatility fuel will require less oxygen entrainment and mixing for complete vaporization and, consequently, may not have potential for significant heat release simply because it has vaporized. Fuel cetane number changes also cause uncertainty regarding soot formation because variable ignition delay will change the level of fuel-air mixing prior to combustion. To address these questions, experiments are performed using a constant-volume combustion chamber simulating typical low-temperature-combustion (LTC) diesel conditions. We use fuels that have the same ignition delay (and therefore similar time for premixing with air), but different fuel volatility, to assess the heat-release rate and spatial location of combustion. Under this condition, where fuel volatility is decoupled from the ignition delay, results show almost the same heat release rate and spatial location of the premixed burn. The effect of ignition quality on soot formation has also been studied while maintaining similar levels of fuel-ambient mixing prior to combustion. To achieve the same ignition delay, the high-cetane-number fuel is injected into an ambient gas at a lower temperature and vice versa. The total soot mass within the spray is measured and compared for fuels with different cetane numbers but with the same premixing level (i.e., the same ignition delay and lift-off length). Experimental results show that the combination of high cetane number and low ambient gas temperature produces lower soot than the other combination, because the ambient temperature predominantly affects soot formation.
The research described in this report developed the theoretical and conceptual framework for understanding, recognizing, and anticipating the origins, dynamic mechanisms, perceptions, and social structures of Islamic social reform movements in the Muslim homeland and in diaspora communities. This research has revealed valuable insights into the dynamic mechanisms associated with reform movements and, as such, offers the potential to provide indications and warnings of impending violence. This study produced the following significant findings: (1) A framework for understanding Islamic radicalization in the context of Social Movement Theory was developed and implemented. This framework provides a causal structure for the interrelationships among the myriad features of a social movement. (2) The degree to which movement-related activity shows early diffusion across multiple social contexts is a powerful discriminator between successful and unsuccessful social movements. Indeed, this measure appears to have significantly more predictive power than the volume of such activity and also more power than various intrinsic properties of the system. (3) Significant social movements can occur only if both the intra-context 'infectivity' of the movement exceeds a certain threshold and the inter-context interactions associated with the movement occur with a frequency that is larger than another threshold. Note that this is reminiscent of, and significantly extends, well-known results for epidemic thresholds in disease propagation models. (4) More in-depth content analysis of blogs through the lens of Argumentation Theory has the potential to reveal new insights into radicalization in the context of Social Movement Theory. This connection has the potential to be of value from two important perspectives: first, it may provide more in-depth insights into the forces underlying the emergence of radical behavior, and second, it may provide insights into how to use the blogosphere to influence the emergent dialog and thereby affect the actions taken by potential radicals. The authors of this report recognize that Islamic communities are not the only source of radicalism; indeed, many other groups, religious and otherwise, have used, and continue to use, radicalism to achieve their ends. Further, the authors also recognize that not all Muslims use, or condone the use of, radical behavior. Indeed, only a very small segment of the Muslim communities throughout the world uses and/or supports such behavior. Nevertheless, the focus of this research is, indeed, on understanding, recognizing, and anticipating the origins, dynamic mechanisms, perceptions, and social structures of Islamic radicalism.
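To make the two-threshold condition in finding (3) concrete, the following is a toy multi-context contagion simulation (our illustrative construction, not the project's actual model): activity spreads within contexts according to an intra-context infectivity and jumps between contexts at an inter-context interaction rate, and widespread activity emerges only when both parameters are large enough.

```python
import numpy as np

rng = np.random.default_rng(1)

def global_prevalence(p_intra, p_inter, recover=0.05,
                      n_contexts=50, size=100, steps=500):
    """Toy SIS-style contagion over many social contexts: returns the
    final fraction of 'active' individuals across all contexts."""
    active = np.zeros((n_contexts, size), dtype=bool)
    active[0, :10] = True                            # seed one context
    for _ in range(steps):
        n_act = active.sum(axis=1, keepdims=True)
        p_local = 1.0 - (1.0 - p_intra) ** n_act     # per-susceptible risk
        new = (~active) & (rng.random(active.shape) < p_local)
        # inter-context jumps occur in proportion to global prevalence
        seeded = rng.random(n_contexts) < p_inter * active.mean()
        new[seeded, 0] = True
        recovered = rng.random(active.shape) < recover   # activity decays
        active = (active & ~recovered) | new
    return active.mean()

# Widespread activity requires BOTH parameters to exceed their thresholds:
for p_i, p_x in [(2e-4, 0.5), (2e-3, 0.01), (2e-3, 0.5)]:
    print(f"intra={p_i}, inter={p_x}: {global_prevalence(p_i, p_x):.2f}")
```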
Education and training are the foundation for a state's development and maintenance of an indigenous capability to conduct a nuclear energy and research program, from both the regulatory perspective and the licensee or operator perspective. The International Training Course on the Physical Protection of Nuclear Facilities and Materials (ITC) is the original international training program in the area of physical protection of nuclear material, which the United States has been conducting since 1978. This course focuses on a systems engineering performance-based approach to requirements definition, design, and evaluation for physical protection systems. During the first twenty-one presentations of ITC, more than 600 national experts from more than sixty International Atomic Energy Agency member states were trained. This paper describes the content, structure, and process of ITC.
Carbon-manganese steels are candidates for the structural materials in hydrogen gas pipelines; however, it is well known that these steels are susceptible to hydrogen embrittlement. Decades of research and industrial experience have established that hydrogen embrittlement compromises the structural integrity of steel components. This experience has also helped identify the failure modes that can operate in hydrogen containment structures. As a result, there are tangible ideas for managing hydrogen embrittlement in steels and quantifying safety margins for steel hydrogen containment structures. For example, fatigue crack growth aided by hydrogen embrittlement is a key failure mode for steel hydrogen containment structures subjected to pressure cycling. Applying appropriate structural integrity models coupled with measurement of relevant material properties allows quantification of safety margins against fatigue crack growth in hydrogen containment structures. Furthermore, application of these structural integrity models is aided by the development of micromechanics models, which provide important insights such as the hydrogen distribution near defects in steel structures. The principal objective of this project is to enable application of structural integrity models to steel hydrogen pipelines. The new American Society of Mechanical Engineers (ASME) B31.12 design code for hydrogen pipelines includes a fracture mechanics-based design option, which requires material property inputs such as the threshold for rapid cracking and the fatigue crack growth rate under cyclic loading. Thus, one focus of this project is to measure the rapid-cracking thresholds and fatigue crack growth rates of line pipe steels in high-pressure hydrogen gas. These properties must be measured for the base materials but, more importantly, for the welds, which are likely to be most vulnerable to hydrogen embrittlement. The measured properties can be evaluated by predicting the performance of the pipeline using a relevant structural integrity model, such as that in ASME B31.12. A second objective of this project is to enable development of micromechanics models of hydrogen embrittlement in pipeline steels. The focus of this effort is to establish physical models of hydrogen embrittlement in line pipe steels using evidence from analytical techniques such as electron microscopy. These physical models then serve as the framework for developing sophisticated finite-element models, which can provide quantitative insight into the micromechanical state near defects. Understanding the micromechanics of defects can ensure that structural integrity models are applied accurately and conservatively.
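The fracture-mechanics design option mentioned here rests on fatigue crack growth relations of the Paris-law form, da/dN = C(dK)^m, integrated from an initial flaw size to a critical size. A minimal sketch of that life calculation follows, with hypothetical coefficients rather than measured hydrogen-assisted values:

```python
import numpy as np

def fatigue_cycles(a0, ac, C, m, dsigma, Y=1.0, n=20000):
    """Integrate the Paris law da/dN = C*(dK)^m from crack size a0 to ac,
    with dK = Y*dsigma*sqrt(pi*a). Units: a in m, dsigma in MPa,
    C in (m/cycle)/(MPa*m^0.5)^m."""
    a = np.linspace(a0, ac, n)
    dK = Y * dsigma * np.sqrt(np.pi * a)
    dNda = 1.0 / (C * dK**m)                 # cycles per unit crack growth
    return np.sum(0.5 * (dNda[1:] + dNda[:-1]) * np.diff(a))

# Hypothetical example: 1 mm flaw grown to 10 mm under 100 MPa pressure cycles
print(f"{fatigue_cycles(1e-3, 1e-2, C=1e-11, m=3.0, dsigma=100.0):.2e} cycles")
```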
Sandia National Laboratories, California (SNL/CA) is a government-owned/contractor-operated laboratory. Sandia Corporation, a Lockheed Martin Company, operates the laboratory for the Department of Energy's National Nuclear Security Administration (NNSA). The NNSA Sandia Site Office oversees operations at the site, using Sandia Corporation as a management and operating contractor. This Site Environmental Report for 2009 was prepared in accordance with DOE Order 231.1A (DOE 2004a). The report provides a summary of environmental monitoring information and compliance activities that occurred at SNL/CA during calendar year 2009. General site and environmental program information is also included. The Site Environmental Report is divided into ten chapters. Chapter 1, the Executive Summary, highlights compliance and monitoring results obtained in 2009. Chapter 2 provides a brief introduction to SNL/CA and the existing environment found on site. Chapter 3 summarizes SNL/CA's compliance activities with the major environmental requirements applicable to site operations. Chapter 4 presents information on environmental management, performance measures, and environmental programs. Chapter 5 presents the results of monitoring and surveillance activities in 2009. Chapter 6 discusses quality assurance. Chapters 7 through 9 provide supporting information for the report and Chapter 10 is the report distribution list.
The Advanced Engineering Environment (AEE) project identifies emerging engineering environment tools and assesses their value to Sandia National Laboratories and our partners in the Nuclear Security Enterprise (NSE) by testing them in our design environment. This project accomplished several pilot activities, including: the preliminary definition of an engineering bill of materials (BOM) based product structure in the Windchill PDMLink 9.0 application; an evaluation of the Mentor Graphics Data Management System (DMS) application for electrical computer-aided design (ECAD) library administration; and implementation and documentation of a Windchill 9.1 application upgrade. The project also supported the migration of legacy data from existing corporate product lifecycle management systems into new classified and unclassified Windchill PDMLink 9.0 systems. The project included two infrastructure modernization efforts: the replacement of two aging AEE development servers with reliable platforms for ongoing AEE project work, and the replacement of four critical application and license servers that support design and engineering work at the Sandia National Laboratories/California site.
Are your employees unhappy with internal corporate search? Frequent complaints include: too many results to sift through; results are unrelated/outdated; employees aren't sure which terms to search for. One way to improve intranet search is to implement a controlled vocabulary ontology. Employing this takes the guesswork out of searching, makes search efficient and precise, educates employees about the lingo used within the corporation, and allows employees to contribute to the corpus of terms. It elevates internal corporate search to rival its superior sibling, internet search. We will cover our experiences, lessons learned, and conclusions from implementing a controlled vocabulary ontology at Sandia National Laboratories. The work focuses on construction of this ontology from the content perspective and the technical perspective. We'll discuss the following: (1) The tool we used to build a polyhierarchical taxonomy; (2) Examples of two methods of indexing the content: traditional 'back of the book' and folksonomy word-mapping; (3) Tips on how to build future search capabilities while building the basic controlled vocabulary; (4) How to implement the controlled vocabulary as an ontology that mimics Google's search suggestions; (5) Making the user experience more interactive and intuitive; and (6) Sorting suggestions based on preferred, alternate, and related terms using SPARQL queries. Finally, future improvements will be presented, including permitting end users to add, edit, and remove terms and filtering on different subject domains.
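As an illustration of suggestion sorting by preferred, alternate, and related terms, here is a minimal sketch assuming the vocabulary is published as SKOS (a common encoding for controlled vocabularies; the talk does not specify its schema) in a hypothetical file vocabulary.ttl, queried with rdflib:

```python
from rdflib import Graph

g = Graph()
g.parse("vocabulary.ttl", format="turtle")   # hypothetical SKOS vocabulary

# Rank suggestions: preferred labels first, then alternate, then related.
query = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label ?rank WHERE {
  { ?concept skos:prefLabel ?label . BIND(1 AS ?rank) }
  UNION
  { ?concept skos:altLabel ?label . BIND(2 AS ?rank) }
  UNION
  { ?concept skos:related/skos:prefLabel ?label . BIND(3 AS ?rank) }
  FILTER(CONTAINS(LCASE(STR(?label)), LCASE("sensor")))
}
ORDER BY ?rank ?label
"""
for row in g.query(query):
    print(row.rank, row.label, row.concept)
```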
A sensitivity study was performed utilizing a three-dimensional finite element model to assess allowable cavern field sizes in strategic petroleum reserve salt domes. A potential exists for tensile fracturing and dilatancy damage to salt that can compromise the integrity of a cavern field in situations where high extraction ratios exist. The effects of salt creep rate, depth of salt dome top, dome size, caprock thickness, elastic moduli of caprock and surrounding rock, lateral stress ratio of surrounding rock, cavern size, depth of cavern, and number of caverns are examined numerically. As a result, a correlation table between the parameters and the impact on the performance of a storage field was established. In general, slower salt creep rates, a deeper salt dome top, larger elastic moduli of the caprock and surrounding rock, and a smaller cavern radius are better for the structural performance of the salt dome.
This paper introduces an effective-media toolset that can be used for the design of metamaterial structures based on metallic components such as split-ring resonators and dipoles, as well as dielectric spherical resonators. For demonstration purposes the toolset will be used to generate infrared metamaterial designs, and the predicted performances will be verified with full-wave numerical simulations.
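For background, effective-media descriptions of split-ring-resonator and wire/dipole arrays are conventionally built on Lorentz- and Drude-type closed forms (canonical expressions from the metamaterials literature, stated here for orientation rather than as the toolset's internal model):

\[ \mu_{\mathrm{eff}}(\omega) = 1 - \frac{F\,\omega^{2}}{\omega^{2} - \omega_{0}^{2} + i\Gamma\omega}, \qquad \varepsilon_{\mathrm{eff}}(\omega) = 1 - \frac{\omega_{p}^{2}}{\omega^{2} + i\gamma\omega}, \]

where F is the resonator filling factor, \(\omega_{0}\) the split-ring resonance, \(\Gamma\) and \(\gamma\) damping rates, and \(\omega_{p}\) the effective plasma frequency of the dipole array.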
Chemically reacting flow models generally involve inputs and parameters that are determined from empirical measurements, and therefore exhibit a certain degree of uncertainty. Estimating the propagation of this uncertainty into computational model output predictions is crucial for purposes of reacting flow model validation, model exploration, as well as design optimization. Recent years have seen great developments in probabilistic methods and tools for efficient uncertainty quantification (UQ) in computational models. These tools are grounded in the use of Polynomial Chaos (PC) expansions for representation of random variables. The utility and effectiveness of PC methods have been demonstrated in a range of physical models, including structural mechanics, transport in porous media, fluid dynamics, aeronautics, heat transfer, and chemically reacting flow. While high dimensionality remains an ongoing challenge, great strides have been made in dealing with moderate dimensionality along with nonlinearity and oscillatory dynamics. In this talk, I will give an overview of UQ in chemical systems. I will cover both: (1) the estimation of uncertain input parameters from empirical data, and (2) the forward propagation of parametric uncertainty to model outputs. I will cover the basics of forward PC UQ methods with examples of their use. I will also highlight the need for accurate estimation of the joint probability density over the uncertain parameters, in order to arrive at meaningful estimates of model output uncertainties. Finally, I will discuss recent developments on the inference of this density given partial information from legacy experiments, in the absence of raw data.
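To make the PC representation concrete, here is a minimal one-dimensional sketch of the forward-propagation step using non-intrusive spectral projection with Gauss-Hermite quadrature (illustrative only, not the toolchain discussed in the talk):

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_coefficients(f, order=8, nquad=40):
    """Project f(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials:
    f(xi) ~ sum_k c_k He_k(xi), c_k = E[f*He_k]/E[He_k^2], E[He_k^2] = k!."""
    x, w = hermegauss(nquad)               # nodes/weights for exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)           # normalize to the N(0,1) measure
    fx = f(x)
    coeffs = []
    for k in range(order + 1):
        ck = np.zeros(k + 1); ck[k] = 1.0  # coefficient vector selecting He_k
        coeffs.append(np.sum(w * fx * hermeval(x, ck)) / math.factorial(k))
    return np.array(coeffs)

# Propagate xi ~ N(0,1) through a nonlinear model and read off moments:
c = pce_coefficients(lambda xi: np.exp(0.3 * xi))
mean = c[0]
var = sum(math.factorial(k) * c[k]**2 for k in range(1, len(c)))
print(mean, var)   # compare with the exact lognormal moments
```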
SIRHEN (Sandia InfraRed HEtrodyne aNalysis; pronounced 'siren') is a program for reducing data from photonic Doppler velocimetry (PDV) measurements. SIRHEN uses the short-time Fourier transform method to extract velocity information. The program can be run in MATLAB (2008b or later) or as a Windows executable. This report describes SIRHEN, which was developed for efficient and robust analysis of PDV data and designed for easy use within Sandia's dynamic compression community.
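The velocity extraction reduces to following the beat-frequency peak of the short-time Fourier transform and applying v = lambda * f_beat / 2. A minimal sketch of that step (illustrative; SIRHEN itself provides many more analysis options):

```python
import numpy as np
from scipy.signal import stft

def pdv_velocity(sig, fs, wavelength=1550e-9, nperseg=1024):
    """Peak-following velocity extraction from a PDV beat signal:
    v = wavelength * f_beat / 2."""
    f, t, Z = stft(sig, fs=fs, nperseg=nperseg)
    f_beat = f[np.argmax(np.abs(Z), axis=0)]   # strongest tone per time bin
    return t, 0.5 * wavelength * f_beat

# Synthetic check: a 1000 m/s surface gives a ~1.29 GHz beat at 1550 nm
fs = 20e9
tt = np.arange(int(2e-6 * fs)) / fs
sig = np.cos(2 * np.pi * (2 * 1000.0 / 1550e-9) * tt)
t, v = pdv_velocity(sig, fs)
print(v.mean())   # ~1000 m/s, within the ~20 MHz frequency resolution
```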
To maintain effective containment surveillance (CS) system capabilities, the requirements for such systems must continue to evolve to outpace diversion capabilities, reduce costs, and meet the needs of the looming nuclear renaissance. What future sensor-based capabilities must be available to support growing CS requirements, and what technologies are needed to provide the underlying capabilities? This presentation discusses the present gaps in sensor-based containment and surveillance technologies and the future development trends that may address these gaps. Consumer-driven technology development will represent a component-rich source of technologies and devices that can find application in containment and surveillance tools, helping to minimize the technology gaps. Recognizing and utilizing these sources is paramount to cost-effective solutions. Where gaps cannot be addressed by consumer-based development, custom, CS-specific approaches are the only solution.
Discontinuity detection is an important component in many fields: image recognition, digital signal processing, and climate change research. Current methods have several shortcomings: they are restricted to one- or two-dimensional settings, they require uniformly spaced and/or dense input data, and they give deterministic answers without quantifying the uncertainty. Spectral methods for uncertainty quantification with global, smooth bases are challenged by discontinuities in model simulation results. Domain decomposition reduces the impact of nonlinearities and discontinuities. However, while gaining more smoothness in each subdomain, current domain refinement methods require prohibitively many simulations. Therefore, detecting discontinuities up front and refining accordingly provides a huge improvement over current methodologies.
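For contrast with the shortcomings listed above, the simplest one-dimensional indicator of this kind fits local polynomials on either side of a candidate point and flags large mismatches. A sketch of that deliberately crude baseline (not the method being developed here):

```python
import numpy as np

def jump_indicator(x, y, x0, h, deg=1):
    """Crude 1D discontinuity indicator: fit a polynomial on each side of
    x0 (window half-width h) and measure the mismatch of the two fits.
    Works on scattered, non-uniform samples."""
    left = (x > x0 - h) & (x < x0)
    right = (x >= x0) & (x < x0 + h)
    if left.sum() <= deg or right.sum() <= deg:
        return 0.0
    p_left = np.polyfit(x[left], y[left], deg)
    p_right = np.polyfit(x[right], y[right], deg)
    return abs(np.polyval(p_left, x0) - np.polyval(p_right, x0))

# A step at x = 0.4 stands out against smooth variation elsewhere:
rng = np.random.default_rng(0)
x = np.sort(rng.random(400))                     # non-uniform samples
y = np.sin(3 * x) + (x > 0.4) + 0.01 * rng.standard_normal(400)
grid = np.linspace(0.1, 0.9, 81)
scores = [jump_indicator(x, y, g, h=0.05) for g in grid]
print(grid[int(np.argmax(scores))])              # ~0.4
```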
It is known that, in general, the correlation structure in the joint distribution of model parameters is critical to the uncertainty analysis of that model. Very often, however, studies in the literature only report nominal values for parameters inferred from data, along with confidence intervals for these parameters, but no details on the correlation or full joint distribution of these parameters. When neither posterior nor data are available, but only summary statistics such as nominal values and confidence intervals, a joint PDF must be chosen. Given the summary statistics, it may be neither reasonable nor necessary to assume the parameters are independent random variables. We demonstrate, using a Bayesian inference procedure, how to construct a posterior density for the parameters exhibiting self-consistent correlations, in the absence of data, given (1) the fit-model, (2) nominal parameter values, (3) bounds on the parameters, and (4) a postulated statistical model, around the fit-model, for the missing data. Our approach ensures external Bayesian updating while marginalizing over possible data realizations. We then address the matching of given parameter bounds through the choice of hyperparameters, which are introduced in postulating the statistical model, but are not given nominal values. We discuss some possible approaches, including (1) inferring them in a separate Bayesian inference loop and (2) optimization. We also perform an empirical evaluation of the algorithm, showing that the posterior obtained with this data-free inference compares well with the true posterior obtained from inference against the full data set.
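A linear-Gaussian caricature shows why the fit-model plus a postulated statistical model pins down parameter correlations even without data (the paper's procedure is more general; the design points and noise scale below are hypothetical):

```python
import numpy as np

# Fit-model y = theta0 + theta1*x with a postulated design for the
# missing data and a postulated Gaussian noise scale.
x = np.linspace(0.0, 1.0, 20)
X = np.column_stack([np.ones_like(x), x])
sigma = 0.1

# With Gaussian noise and a flat prior, inference against any realization of
# the missing data yields posterior covariance sigma^2 * (X^T X)^{-1};
# marginalizing over realizations centered on the fit-model keeps the
# posterior mean at the nominal values while fixing the correlations.
cov = sigma**2 * np.linalg.inv(X.T @ X)
corr = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])
print(round(corr, 2))   # ~ -0.85: intercept and slope are strongly coupled
```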
The U.S. Department of Energy's (DOE) National Nuclear Security Administration (NNSA) established the Global Threat Reduction Initiative's (GTRI) mission to reduce and protect nuclear and radiological materials located at civilian sites worldwide. Internationally, over 80 countries are cooperating with GTRI to enhance security of facilities with these materials. In 2004, a GTRI delegation began working with the Tanzania Atomic Energy Commission (TAEC). The team conducted site assessments for the physical protection of radiological materials in Tanzania. Today, GTRI and the Government of Tanzania continue cooperative efforts to enhance physical security at several radiological sites, including a central sealed-source storage facility, and sites in the cities of Arusha, Dar Es Salaam, and Tanga. This paper describes the scope of physical protection work, lessons learned, and plans for future cooperation between the GTRI program and the TAEC. Additionally, the paper will review the cooperative efforts between the TAEC and the International Atomic Energy Agency (IAEA) with regard to a remote monitoring system at a storage facility and the repackaging of radioactive sources.
Automated processing, modeling, and analysis of unstructured text (news documents, web content, journal articles, etc.) is a key task in many data analysis and decision making applications. As data sizes grow, scalability is essential for deep analysis. In many cases, documents are modeled as term or feature vectors and latent semantic analysis (LSA) is used to model latent, or hidden, relationships between documents and terms appearing in those documents. LSA supplies conceptual organization and analysis of document collections by modeling high-dimensional feature vectors in many fewer dimensions. While past work on the scalability of LSA modeling has focused on the singular value decomposition (SVD), the goal of our work is to investigate the use of distributed memory architectures for the entire text analysis process, from data ingestion to semantic modeling and analysis. ParaText is a set of software components for distributed processing, modeling, and analysis of unstructured text. The ParaText source code is available under a BSD license, as an integral part of the Titan toolkit. ParaText components are chained together into data-parallel pipelines that are replicated across processes on distributed-memory architectures. Individual components can be replaced or rewired to explore different computational strategies and implement new functionality. ParaText functionality can be embedded in applications on any platform using the native C++ API, Python, or Java. The ParaText MPI Process provides a 'generic' text analysis pipeline in a command-line executable that can be used for many serial and parallel analysis tasks. ParaText can also be deployed as a web service accessible via a RESTful (HTTP) API. In the web service configuration, any client can access the functionality provided by ParaText using commodity protocols, from standard web browsers to custom clients written in any language.
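The serial core of such a pipeline is compact; here is a sketch of term-vector modeling plus LSA via truncated SVD (ParaText's contribution is the distributed-memory version of these steps):

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "graph analysis of social networks",
    "latent semantic analysis of text documents",
    "distributed memory parallel architectures",
    "parallel processing of text on distributed architectures",
]

# Model documents as tf-idf feature vectors, then project into a
# low-dimensional latent 'concept' space with a truncated SVD (LSA).
A = TfidfVectorizer().fit_transform(docs)
Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(A)

# Cosine similarities in concept space expose latent relationships.
Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)
print(np.round(Z @ Z.T, 2))
```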
The Integrated Modeling, Mapping, and Simulation (IMMS) program is designing and prototyping a simulation and collaboration environment for linking together existing and future modeling and simulation tools to enable analysts, emergency planners, and incident managers to more effectively, economically, and rapidly prepare for, analyze, train for, and respond to real or potential incidents. When complete, the IMMS program will demonstrate an integrated modeling and simulation capability that supports emergency managers and responders by (1) conducting 'what-if' analyses and exercises to address preparedness, analysis, training, operations, and lessons learned, and (2) effectively, economically, and rapidly verifying response tactics, plans, and procedures.
Researchers at Sandia National Laboratories in Livermore, California are creating what is in effect a vast digital petri dish able to hold one million operating systems at once in an effort to study the behavior of rogue programs known as botnets. Botnets are used extensively by malicious computer hackers to steal computing power from Internet-connected computers. The hackers harness the stolen resources into a scattered but powerful computer that can be used to send spam, execute phishing scams, or steal digital information. These remote-controlled 'distributed computers' are difficult to observe and track. Botnets may take over parts of tens of thousands or, in some cases, even millions of computers, making them among the world's most powerful computers for some applications.
While RAID is the prevailing method of creating reliable secondary storage infrastructure, many users desire more flexibility than offered by current implementations. To attain needed performance, customers have often sought hardware-based RAID solutions. This talk describes a RAID system that offloads erasure correction coding calculations to GPUs, allowing increased reliability by supporting new RAID levels while maintaining high performance.
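For orientation, the simplest erasure-correction computation a RAID system performs is single XOR parity (RAID 5); systems like the one described generalize this to multi-erasure Reed-Solomon codes computed on the GPU. A single-parity sketch:

```python
import numpy as np

def xor_parity(blocks):
    """RAID-5-style parity: bytewise XOR of all blocks in a stripe."""
    parity = np.zeros_like(blocks[0])
    for b in blocks:
        parity = parity ^ b
    return parity

data = [np.frombuffer(b, dtype=np.uint8) for b in
        (b"block-A1", b"block-B1", b"block-C1")]
parity = xor_parity(data)

# Lose any single block; XOR of the survivors reconstructs it.
recovered = xor_parity([data[0], data[2], parity])
print(bytes(recovered))   # b'block-B1'
```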
Materials with switchable states are desirable in many areas of science and technology. The ability to thermally transform a dielectric material to a conductive state should allow for the creation of electronics with built-in safety features. Specifically, the non-desirable build-up and discharge of electricity in the event of a fire or over-heating would be averted by utilizing thermo-switchable dielectrics in the capacitors of electrical devices (preventing the capacitors from charging at elevated temperatures). We have designed a series of polymers that effectively switch from a non-conductive to a conductive state. The thermal transition is governed by the stability of the leaving group after it leaves as a free entity. Here, we present the synthesis and characterization of a series of precursor polymers that eliminate to form poly(p-phenylene vinylene)s (PPVs).
A multi-laboratory ontology construction effort during the summer and fall of 2009 prototyped an ontology for counterfeit semiconductor manufacturing. This effort included an ontology development team and an ontology validation methods team. Here, the third team of the Ontology Project, the Data Analysis (DA) team, reports on its approaches, the tools it used, and its results from mining literature for terminology pertinent to counterfeit semiconductor manufacturing. A discussion of the value of ontology-based analysis is presented, with insights drawn from other ontology-based methods regularly used in the analysis of genomic experiments. Finally, suggestions for future work are offered.
There is an increasing awareness that efficient and effective nuclear facility design is best achieved when requirements from the 3S disciplines - Safety, Safeguards, and Security - are balanced and intrinsic to the facility design. This can be achieved when policy, processes, methods, and technologies are understood and applied in these areas during all phases of the design process. For the purposes of this paper, security-by-design will be defined as the system-level incorporation of the physical protection system (PPS) into a new or retrofitted nuclear power plant (NPP) or nuclear facility (NF), resulting in intrinsic security. Security-by-design can also be viewed as a framework to achieve robust and durable security systems. This paper reports on work performed to date to create a Security-by-Design Handbook, under a bilateral agreement between the United States and Japan; specifically, a review of physical protection principles and best practices, and a review of the facility lifecycle, from planning through decommissioning, to better understand where these principles and practices can be applied. This paper describes physical protection principles and best practices to achieve security-by-design that were gathered from international, Japanese, and U.S. sources. Principles are included for achieving security early in the design process, where security requirements are typically less costly and easier to incorporate. The paper then describes a generic design process that covers the entire facility lifecycle, from scoping and planning of the project to decommissioning and decontamination. Early design process phases, such as conceptual design, offer opportunities to add security features intrinsic to the facility design itself. Later phases, including design engineering and construction, are important for properly integrating security features into a coherent design and for planning for and assuring the proper performance of the security system during the operation and decommissioning of the facility. The paper also describes some future activities on this bilateral project to create a Security-by-Design Handbook. When completed, the Handbook is intended to assist countries with less experience in nuclear power programs to apply principles and best practices in an effective and efficient manner as early in the design as possible to achieve robust and durable security.
The objective is to find the optimal fuel inventory management strategy roadmap for each supplier along the fuel delivery supply chain. SoSAT (System of Systems Analysis Toolset) Enterprise is a suite of software tools: a state model tool, a stochastic simulation tool, advanced data visualization tools, and optimization tools. It was initially designed to provide the DoD and supporting organizations the capability to analyze a system of systems (SoS) and its various platforms. Applications include: (1) supporting multiple US Army Program Executive Office Integration (PEO-I) trade studies; (2) supporting the US Army Program Executive Office of Ground Combat Systems (PEO GCS) Fleet Management and Modernization Planning initiative; and (3) participating in a formal Verification, Validation & Accreditation effort with Army organizations (AMSAA and ATEC).
The Department of Defense's (DoD) Energy Posture identified the US Military's dependence on fossil fuel energy as a key issue facing the military. Inefficient energy consumption leads to increased costs and affects operational performance and warfighter protection through large and vulnerable logistics support infrastructures. The military's use of energy is a critical national security problem. DoD's proposed metrics, the Fully Burdened Cost of Fuel and the Energy Efficiency Key Performance Parameter (FBCF and Energy KPP), are a positive step to force energy use accountability onto military programs. The ability to measure the impacts of sustainment is required to fully measure the Energy KPP. Sandia's work with the Army demonstrates the capability to measure performance in a way that includes energy constraints.
After more than 50 years of molecular simulations, accurate empirical models are still the bottleneck to the wide adoption of simulation techniques. Addressing this issue with a fresh paradigm is the need of the day. In this study, we outline a new genetic-programming-based method to develop empirical models for a system purely from its energies and/or forces. While the approach was initially developed to derive classical force fields from ab initio calculations, we also discuss its application to the molecular coarse-graining of methanol. Two models, one representing methanol by a single site and the other via two sites, will be developed using this method. They will be validated against existing coarse-grained potentials for methanol by comparing thermophysical properties.
In the past decade, organic optoelectronic devices have made such advances that they have become viable technologies. These devices involve the integration of organics with highly dissimilar materials, e.g., metals, semiconductors, and oxides, with critical device actions taking place at the organic-inorganic interfaces. For example, in organic photovoltaics, exciton dissociation and carrier separation occur at the donor-acceptor heterojunctions; careful design of junction area and band alignment is critical for optimizing device performance. In this talk, I will show two examples of modifying organic-inorganic interfaces with alkanethiol self-assembled monolayers (SAMs) to improve device performance. Alkanethiols are large-band-gap molecules that are expected to act as a transport barrier. When the Au cathode in polyfluorene OLEDs is modified with alkanethiol SAMs, the current is found to be lower while the output luminescent intensity is higher, leading to higher external quantum efficiency at a given current density. In ZnO-polythiophene hybrid solar cells, increasing the alkanethiol SAM length surprisingly leads to higher photocurrent, even though the SAM reduces electron transfer. I will discuss the mechanisms behind these unexpected improvements.
This document is the final report for the Sandia National Laboratories-funded Student Fellowship position at New Mexico State University (NMSU) from 2008 to 2010. Ivan Mecimore, a PhD student in Electrical Engineering at NMSU, was conducting research into image and video processing techniques to identify features and correlations within images without requiring decoding of the compressed data. Such an analysis technique would operate on the encoded bit stream, potentially saving considerable processing time when operating on a platform that has limited computational resources. Unfortunately, the student elected in mid-year not to continue with his research or the fellowship position. The student is unavailable to provide any details of his research for inclusion in this final report. As such, this final report serves solely to document the information provided in the previous end-of-year summary.
Structural phase transformations between ferroelectric (FE), antiferroelectric (AFE), and paraelectric (PE) phases are frequently observed in the zirconium-rich region of the lead zirconate-titanate (PZT) phase diagram. Since the free energy difference among these phases is small, phase transformation can be easily induced by temperature, pressure, and electric field. These induced transformation characteristics have been used for many practical applications. This study focuses on a hydrostatic pressure induced FE-to-AFE phase transformation in a tin modified PZT ceramic (PSZT). The relative phase stability between the FE and AFE phases is determined by dielectric permittivity measurement as a function of temperature from -60 °C to 125 °C. A pressure-temperature phase diagram for the PSZT system will be presented.
Material control and accounting (MC&A) safeguards operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC&A activities have many similar characteristics to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC&A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC&A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness for attacks by both outside and inside adversaries.
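The arithmetic by which MC&A activities enter the path analysis can be illustrated simply: each protection element along an adversary timeline contributes a detection probability, and the elements combine as complements. The numbers below are illustrative only, and the full methodology also weighs timeliness of detection against response time:

```python
def cumulative_detection(p_detect):
    """Probability of at least one detection along an ordered sequence of
    protection elements, assuming independent detection opportunities."""
    p_miss = 1.0
    for p in p_detect:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

sensors_only = [0.30, 0.20]               # e.g., portal monitor, CCTV review
with_mca = sensors_only + [0.50, 0.40]    # + item inventory, mass balance
print(cumulative_detection(sensors_only))  # 0.44
print(cumulative_detection(with_mca))      # 0.832
```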
Historically, U.S. arms control policy and the U.S. nuclear weapons enterprise have been reactive to each other, rather than interdependent and mutually reinforcing. One element of the divergence has been the long timescale necessary to plan and create substantive changes in the infrastructure vs. the inherent unpredictability of arms control outcomes. We explore several examples that illustrate this tension and some of the costs and implications associated with this reactive paradigm, and we illustrate that, while the nuclear weapons enterprise has long considered the implications of arms control in sizing the capacity of its missions, it has not substantively considered arms control in constructing requirements for capabilities and products. Since previous arms control agreements have limited numbers and types of deployed systems, with delivery systems as the object of verification, this disconnect has not been at the forefront. However, as future agreements unfold, the warhead itself may become the treaty-limited item and the object of verification. Such a scenario might offer both the need and the opportunity to integrate nuclear weapons and arms control requirements in unprecedented ways. This paper seeks to inspire new thinking on how such integration could be fostered and the extent to which it can facilitate significant reduction in nuclear stockpiles.
We present a new fabrication technique called Membrane Projection Lithography for the production of three-dimensional metamaterials at infrared wavelengths. Using this technique, multilayer infrared metamaterials that include both in-plane and out-of-plane resonators can be fabricated.
With increasing concern over the ability to detect and characterize special nuclear materials, computer codes that can successfully predict the response of detector systems to various measurement scenarios are extremely important. These computer algorithms need to be benchmarked against a variety of experimental configurations to ensure their accuracy and to understand their limitations. The Monte Carlo code MCNP-PoliMi is a modified version of the MCNP-4c code. Recently these modifications have been ported into the new MCNPX 2.6.0 code, which gives the new MCNPX-PoliMi a wider variety of options and abilities, taking advantage of the improvements made to MCNPX. To verify the ability of the MCNPX-PoliMi code to simulate the response of a neutron multiplicity detector, simulated results were compared to experimental data. The experiment consisted of a 4.5-kg sphere of alpha-phase plutonium that was moderated with various thicknesses of polyethylene. The results showed that our code system can simulate the multiplicity distributions with relatively good agreement with measured data. The enhancements made to MCNP since the release of MCNP-4c have had little to no effect on the ability of MCNP-PoliMi to resolve the discrepancies observed in the simulated neutron multiplicity distributions when compared with experimental data.
A two-dimensional phononic crystal (PnC) that can operate in the GHz range is created in a freestanding silicon substrate using NanoFIBrication (using a focused ion beam (FIB) to fabricate nanostructures). First, a simple cubic 6.75 x 6.75 μm array of vias with 150 nm spacing is generated. After patterning the vias, they are backfilled with void-free tungsten scatterers. Each via has a diameter of 48 nm. Numerical calculations predict this 2D PnC will generate a band gap near 22 GHz. A protective layer of chromium on top of the thin (100 nm) silicon membrane confines the surface damage to the chromium, which can be removed at a later time. Inspection of the underside of the membrane shows the vias flaring out at the exit, which we are dubbing the 'trumpet effect'. The trumpet effect is explained by modeling the lateral damage in a freestanding membrane.
This brief paper explores the development of scalable, nonlinear, fully-implicit solution methods for a stabilized unstructured finite element (FE) discretization of the 2D incompressible (reduced) resistive MHD system. The discussion considers the stabilized FE formulation in the context of a fully-implicit time integration and direct-to-steady-state solution capability. The nonlinear solver strategy employs Newton-Krylov methods, which are preconditioned using fully-coupled algebraic multilevel (AMG) techniques and a new approximate block factorization (ABF) preconditioner. The intent of these preconditioners is to enable robust, scalable and efficient solution approaches for the large-scale sparse linear systems generated by the Newton linearization. We present results for the fully-coupled AMG preconditioner for two prototype problems: a low Lundquist number MHD Faraday conduction pump and a moderately-high Lundquist number incompressible magnetic island coalescence problem. For the MHD pump results we explore the scaling of the fully-coupled AMG preconditioner for up to 4096 processors for problems with up to 64M unknowns on a Cray XT3/4. Using the island coalescence problem we explore the weak scaling of the AMG preconditioner and the influence of the Lundquist number on the iteration count. Finally, we present some very recent results for the algorithmic scaling of the ABF preconditioner.
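The solver pattern described, an outer Newton iteration with preconditioned Krylov inner solves, can be sketched on a small stand-in nonlinear system; production MHD codes replace the default preconditioning below with the fully-coupled AMG or ABF preconditioners discussed:

```python
import numpy as np
from scipy.optimize import newton_krylov

# 1D Bratu problem u'' + lam*exp(u) = 0, u(0) = u(1) = 0: a small stand-in
# for the nonlinear systems produced by fully-implicit discretizations.
n, lam = 200, 1.0
h = 1.0 / (n + 1)

def residual(u):
    upad = np.concatenate(([0.0], u, [0.0]))
    return (upad[:-2] - 2.0 * upad[1:-1] + upad[2:]) / h**2 + lam * np.exp(u)

u = newton_krylov(residual, np.zeros(n), method="lgmres", f_tol=1e-10)
print(u.max())   # peak of the lower solution branch, ~0.14 for lam = 1
```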
Phase transformation between the ferroelectric (FE) and the antiferroelectric (AFE) phases in tin modified lead zirconate titanate (PSZT) ceramics can be influenced by pressure and electric field. Increasing the pressure tends to favor the AFE phase, while electric field favors the FE phase. In this study, these phase transformations are studied as functions of external pressure, temperature, and dc bias. The shifting of the transformation temperature and the relative phase stability between the FE and AFE phases with respect to these external parameters will be presented. Results will be compared to the pressure-induced depoling behavior (or FE-to-AFE phase transformation) of the PSZT ceramic. Fundamental issues related to the relative phase stability will be discussed from the perspective of lattice dynamics theory.
Grid-tied PV energy smoothing was implemented by using a valve-regulated lead-acid (VRLA) battery as a temporary energy storage device to both charge and discharge as required to smooth the inverter energy output from the PV array. Inverter output was controlled by the average solar irradiance over the previous 1-h time interval. On a clear day the inverter output power curve is offset from the solar irradiance power curve by about 1 h, while on a variably cloudy day the inverter output power curve is smoothed based on the average solar irradiance. Test results demonstrate that this smoothing algorithm works very well. Battery state of charge was more difficult to manage because of the variable system inefficiencies. Testing continued for 30 days and established consistent operational performance for extended periods of time under a wide variety of resource conditions. Both battery technologies, from Exide (Absolyte) and East Penn (Advanced Valve-Regulated Lead-Acid), proved to cycle well at a partial state of charge over the time interval tested.
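The dispatch logic is a trailing average: the inverter setpoint follows the mean PV power over the preceding hour and the battery absorbs or supplies the difference. A minimal sketch (illustrative; the fielded controller must also manage battery state of charge, as noted above):

```python
import numpy as np

def smooth_dispatch(pv_power, window=60):
    """Inverter setpoint = trailing mean of PV power over `window` minutes;
    battery power (+discharge / -charge) covers the difference."""
    setpoint = np.array([pv_power[max(0, k - window):k + 1].mean()
                         for k in range(len(pv_power))])
    return setpoint, setpoint - pv_power

# Toy day: clear-sky curve with a 15-minute cloud every 90 minutes
t = np.arange(12 * 60)                                   # minutes
pv = np.clip(np.sin(np.pi * t / (12 * 60)), 0, None) * (1 - 0.5 * ((t % 90) < 15))
inverter, battery = smooth_dispatch(pv)
print(np.abs(np.diff(pv)).max(), np.abs(np.diff(inverter)).max())  # ramps shrink
```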
Dielectric resonators are an effective means to realize isotropic, low-loss optical metamaterials. As proof of this concept, a cubic resonator is analytically designed and then tested in the long-wave infrared.
Dynamical systems theory provides a powerful framework for understanding the behavior of complex evolving systems. However, applying these ideas to large-scale dynamical systems such as discretizations of multi-dimensional PDEs is challenging. Such systems can easily give rise to problems with billions of dynamical variables, requiring specialized numerical algorithms implemented on high performance computing architectures with thousands of processors. This talk will describe LOCA, the Library of Continuation Algorithms, a suite of scalable continuation and bifurcation tools optimized for these types of systems that is part of the Trilinos software collection. In particular, we will describe continuation and bifurcation analysis techniques designed for large-scale dynamical systems that are based on specialized parallel linear algebra methods for solving augmented linear systems. We will also discuss several other Trilinos tools providing nonlinear solvers (NOX), eigensolvers (Anasazi), iterative linear solvers (AztecOO and Belos), preconditioners (Ifpack, ML), direct solvers (Amesos), and parallel linear algebra data structures (Epetra and Tpetra) that LOCA can leverage for efficient and scalable analysis of large-scale dynamical systems.
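The heart of such continuation tools is pseudo-arclength continuation: augmenting the residual with an arclength constraint so the branch can be followed smoothly through turning points. A scalar sketch of the algorithm (LOCA's value lies in doing this with parallel linear algebra at scale):

```python
import numpy as np

def f(u, lam):     return u**3 - u + lam
def f_u(u, lam):   return 3.0 * u**2 - 1.0
def f_lam(u, lam): return 1.0

# Pseudo-arclength continuation on a scalar S-curve: augment f = 0 with an
# arclength constraint so Newton stays well-posed through turning points.
u, lam, ds = -1.5, 1.875, 0.05       # starting point on the branch
du, dlam = 0.0, -1.0                 # initial unit tangent guess
branch = [(lam, u)]
for _ in range(120):
    up, lp = u + ds * du, lam + ds * dlam          # predictor
    for _ in range(20):                            # Newton corrector
        r = np.array([f(up, lp),
                      du * (up - u) + dlam * (lp - lam) - ds])
        J = np.array([[f_u(up, lp), f_lam(up, lp)], [du, dlam]])
        step = np.linalg.solve(J, -r)
        up += step[0]; lp += step[1]
        if np.abs(r).sum() < 1e-12:
            break
    # new unit tangent; solving against the old tangent keeps orientation
    t = np.linalg.solve(np.array([[f_u(up, lp), f_lam(up, lp)], [du, dlam]]),
                        np.array([0.0, 1.0]))
    du, dlam = t / np.linalg.norm(t)
    u, lam = up, lp
    branch.append((lam, u))
print(branch[-1])   # far side of the S-curve: both folds were traversed
```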
Near Real Time Accountability (NRTA) of actinides at high precision in reprocessing plants has been a long-sought goal in the safeguards community. Achieving this goal is hampered by the difficulty of making precision measurements in the reprocessing environment, equipment cost, and impact on plant operations. Thus the design of future reprocessing plants requires an optimization of different approaches. The Separations and Safeguards Performance Model, developed at Sandia National Laboratories, was used to evaluate a number of NRTA strategies in a UREX+ reprocessing plant. Strategies examined include the incorporation of additional actinide measurements of internal plant vessels, more use of process monitoring data, and the option of periodically draining inventory to key tanks. Preliminary results show that the addition of measurement technologies can increase the overall measurement uncertainty due to additional error propagation, so care must be taken when designing an advanced system. Initial results also show that relying on a combination of different NRTA techniques will likely be the best option. The model provides a platform for integrating all the data. The modeling results for the different NRTA options under various material loss conditions will be presented.
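The error-propagation effect noted above follows directly from how material-balance uncertainty accumulates: with independent errors, each measured term adds in quadrature. A toy illustration (hypothetical masses and uncertainties, not plant data):

```python
import numpy as np

# Material balance: MB = inputs - outputs - (ending - beginning inventory).
# With independent measurement errors, the terms add in quadrature, so each
# added measurement point contributes to sigma_MB.
def sigma_mb(terms):
    """terms: list of (mass, relative_uncertainty) pairs in the balance."""
    return np.sqrt(sum((m * r) ** 2 for m, r in terms))

baseline = [(100.0, 0.005), (99.5, 0.005), (50.0, 0.01), (50.0, 0.01)]
extra = baseline + [(20.0, 0.02), (20.0, 0.02)]   # added in-process vessels
print(sigma_mb(baseline), sigma_mb(extra))        # uncertainty grows
```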
Simulating the detailed evolution of microstructure at the mesoscale is increasingly being addressed by a number of methods. Discrete element modeling and Potts kinetic Monte Carlo (kMC) have each achieved success in capturing different aspects of sintering. Discrete element modeling cannot treat the details of neck formation and other shape evolution, especially when considering particles of arbitrary shapes. Potts kMC treats the microstructural evolution very well but cannot incorporate the complex stress states that form, especially during differential sintering. A model that is capable of simulating microstructural evolution during sintering at the mesoscale and can incorporate differential stresses is being developed. This multi-physics model, which can treat both interfacial energies and the inter-particle stresses, will be introduced. It will be applied to simulate microstructural evolution while resolving individual particles and the stresses that develop between them due to local shrinkage. Results will be presented, and the future development of this model will be discussed.
The National Nuclear Security Administration (NNSA) has submitted an application to the Nuclear Regulatory Commission (NRC) for the air shipment of plutonium metal within the Plutonium Air Transportable (PAT-1) packaging. The PAT-1 packaging is currently authorized for the air transport of plutonium oxide in solid form only. The INMM presentation will provide a limited overview of the scope of the plutonium metal initiative and provide a status of the NNSA application to the NRC.
The Standing Committee on International Security of Radioactive and Nuclear Materials in the Nonproliferation and Arms Control Division conducted its fourth annual workshop, on Reducing the Risk from Radioactive and Nuclear Materials, in February 2010. This workshop examined new technologies in real-time tracking of radioactive materials, new risks and policy issues in transportation security, the best practices and challenges found in addressing illicit radioactive materials trafficking, industry leadership in reducing proliferation risk, and verification of Article VI of the Nuclear Nonproliferation Treaty. Technology gaps, policy gaps, and prioritization for addressing the identified gaps were discussed. Participants included academics, policy makers, radioactive materials users, physical security and safeguards specialists, and vendors of radioactive sources and transportation services. This paper summarizes the results of this workshop with the recommendations and calls to action for the Institute of Nuclear Materials Management (INMM) membership community.
The data in many disciplines such as social networks, web analysis, etc. is link-based, and the link structure can be exploited for many different data mining tasks. In this paper, we consider the problem of temporal link prediction: Given link data for time periods 1 through T, can we predict the links in time period T + 1? Specifically, we look at bipartite graphs changing over time and consider matrix- and tensor-based methods for predicting links. We present a weight-based method for collapsing multi-year data into a single matrix. We show how the well-known Katz method for link prediction can be extended to bipartite graphs and, moreover, approximated in a scalable way using a truncated singular value decomposition. Using a CANDECOMP/PARAFAC tensor decomposition of the data, we illustrate the usefulness of exploiting the natural three-dimensional structure of temporal link data. Through several numerical experiments, we demonstrate that both matrix- and tensor-based techniques are effective for temporal link prediction despite the inherent difficulty of the problem.
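A compact sketch of two of these ingredients follows: exponential down-weighting of older slices, and a truncated-SVD approximation of bipartite Katz scores (toy random data; for the bipartite extension, summing the Katz series over odd walk lengths gives U diag(beta*s / (1 - beta^2 s^2)) V^T, valid when beta*s_max < 1):

```python
import numpy as np

def collapse(slices, theta=0.2):
    """Exponentially down-weight older time slices into one matrix."""
    T = len(slices)
    return sum((1 - theta) ** (T - t - 1) * Z for t, Z in enumerate(slices))

def tkatz_scores(X, beta=0.01, rank=10):
    """Bipartite Katz link-prediction scores via a rank-'rank' truncated SVD:
    odd walk lengths give U * diag(beta*s / (1 - beta^2 s^2)) * V^T."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    U, s, Vt = U[:, :rank], s[:rank], Vt[:rank]
    return U @ np.diag(beta * s / (1 - beta**2 * s**2)) @ Vt

# Toy data: 3 yearly author-conference matrices; score links for year T+1.
rng = np.random.default_rng(0)
slices = [(rng.random((40, 15)) < 0.1).astype(float) for _ in range(3)]
scores = tkatz_scores(collapse(slices))
print(np.unravel_index(np.argmax(scores), scores.shape))  # top predicted link
```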
We present the bandwidth enhancement of an electroabsorption modulator (EAM) monolithically integrated with two mutually injection-locked lasers. Improvements in the modulation efficiency and bandwidth are shown with mutual injection locking.
Doppler radars can distinguish targets from clutter if the target's velocity along the radar line of sight is beyond that of the clutter. Some targets of interest, however, may have a Doppler shift similar to that of clutter. The nature of sea clutter differs between the endo-clutter and exo-clutter regions. This behavior requires special consideration of where a radar can expect to find sea-clutter returns in Doppler space and of what detection algorithms are most appropriate to help mitigate false alarms and increase the probability of detection of a target. This paper surveys the existing state of the art in the understanding of the Doppler characteristics of sea clutter and of scattering from the ocean, to better inform the design and performance choices of a radar in differentiating targets from clutter under prevailing sea conditions.
Unbalanced double null ELMy H-mode configurations in DIII-D are used to simulate the situation in ITER high triangularity, burning plasma magnetic equilibria, where the second X-point lies close to the top of the vacuum vessel, creating a secondary divertor region at the upper blanket modules. The measured plasma conditions in the outer secondary divertor closely duplicated those projected for ITER. ¹³CH₄ was injected into the secondary outer divertor to simulate sputtering there. The majority of the ¹³C found was in the secondary outer divertor. This material migration pattern is radically different than that observed for main wall ¹³CH₄ injections into single null configurations, where the deposition is primarily at the inner divertor. The implications for tritium codeposition resulting from sputtering at the secondary divertor in ITER are significant, since release of tritium from Be co-deposits at the main wall bake temperature for ITER, 240 °C, is incomplete. The principal features of the measured ¹³C deposition pattern have been replicated by the OEDGE interpretive code.
Motomesh is a Motorola product that performs mesh networking at both the client and access point levels and allows broadband mobile data connections with or between clients moving at vehicular speeds. Sandia National Laboratories has extensive experience with this product and its predecessors in infrastructure-less mobile environments. This report documents experiments that characterize certain aspects of how the Motomesh network performs when mobile units are added to a fixed network infrastructure.
When a fluid jet impinges on a solid substrate, a variety of behaviors may occur around the impact region. One example is mounding, where the fluid enters the impact region faster than it can flow away, forming a mound of fluid above the main surface. For some operating conditions, this mound can destabilize and buckle, entraining air in the mound. Other behaviors include submerging flow, where the jet impinges into an otherwise steady pool of liquid, entraining a thin air layer as it enters the pool. This impact region is one of very high shear rates and as such, complex fluids behave very differently than do Newtonian fluids. In this work, we attempt to characterize this range of behavior for Newtonian and non-Newtonian fluids using dimensionless parameters. We model the fluid as a modified Bingham-Carreau-Yasuda fluid, which exhibits the full range of pseudoplastic flow properties throughout the impact region. Additionally, we study viscoelastic effects through the use of the Giesekus model. Both 2-D and 3-D numerical simulations are performed using a variety of finite element method techniques for tracking the jet interface, including Arbitrary Lagrangian Eulerian (ALE), diffuse level sets, and a conformal decomposition finite element method (CDFEM). The presence of shear-thinning characteristics drastically reduces unstable mounding behavior, yet can lead to air entrainment through the submerging flow regime. We construct an operating map to understand for what flow parameters mounding and submerging flows will occur, and how the fluid rheology affects these behaviors. This study has many implications in high-speed industrial bottle filling applications.
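For reference, the Carreau-Yasuda core of the modified Bingham-Carreau-Yasuda model expresses the shear-rate-dependent viscosity as

\[ \eta(\dot\gamma) = \eta_\infty + (\eta_0 - \eta_\infty)\left[1 + (\lambda\dot\gamma)^{a}\right]^{(n-1)/a}, \]

where \(\eta_0\) and \(\eta_\infty\) are the zero- and infinite-shear-rate viscosities, \(\lambda\) is a characteristic time, a sets the sharpness of the transition, and n is the power-law index; the Bingham modification adds a yield-stress contribution. This is the standard form from the rheology literature; the exact parameterization used in these simulations is not restated here.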
To test the hypothesis that high quality 3D Earth models will produce seismic event locations that are more accurate and more precise, we are developing a global 3D P wave velocity model of the Earth's crust and mantle using seismic tomography. In this paper, we present the most recent version of our model, SALSA3D version 1.5, and demonstrate its ability to reduce mislocations for a large set of realizations derived from a carefully chosen set of globally-distributed ground truth events. Our model is derived from the latest version of the Ground Truth (GT) catalog of P and Pn travel time picks assembled by Los Alamos National Laboratory. To prevent over-weighting due to ray path redundancy and to reduce the computational burden, we cluster rays to produce representative rays. The reduction in the total number of ray paths is approximately 50%. The model is represented using the triangular tessellation system described by Ballard et al. (2009), which incorporates variable resolution in both the geographic and radial dimensions. For our starting model, we use a simplified two-layer crustal model derived from the Crust 2.0 model over a uniform AK135 mantle. Sufficient damping is used to reduce velocity adjustments so that ray path changes between iterations are small. We obtain proper model smoothness by using progressive grid refinement, refining the grid only around areas with significant velocity changes from the starting model. At each grid refinement level except the last one, we limit the number of iterations to prevent convergence, thereby preserving aspects of broad features resolved at coarser resolutions. Our approach produces a smooth, multi-resolution model with node density appropriate to both the ray coverage and the velocity gradients required by the data. This scheme is computationally expensive, so we use a distributed computing framework based on the Java Parallel Processing Framework, providing us with approximately 400 processors. Resolution of our model is assessed using a variation of the standard checkerboard method. We compare the travel-time prediction and location capabilities of SALSA3D to standard 1D models via location tests on a global event set with GT accuracy of 5 km or better. These events generally possess hundreds of Pn and P picks, from which we generate different realizations of station distributions, yielding a range of azimuthal coverage and ratios of teleseismic to regional arrivals, with which we test the robustness and quality of relocation. The SALSA3D model reduces mislocation relative to the standard 1D AK135 model regardless of the Pn to P ratio, with the improvement being most pronounced at higher azimuthal gaps.
Sandia National Laboratories develops technologies to: (1) sustain, modernize, and protect our nuclear arsenal; (2) prevent the spread of weapons of mass destruction; (3) provide new capabilities to our armed forces; (4) protect our national infrastructure; (5) ensure the stability of our nation's energy and water supplies; and (6) defend our nation against terrorist threats. We identified the need for a single overarching Integrated Workplace Management System (IWMS) that would enable us to focus on customer missions and improve FMOC processes. Our team selected highly configurable commercial-off-the-shelf (COTS) software with out-of-the-box workflow processes that integrate strategic planning, project management, facility assessments, and space management, and that can interface with existing systems such as Oracle, PeopleSoft, Maximo, Bentley, and FileNet: the IWMS from Tririga, Inc. The benefits of the resulting Facility Management System (FMS) are to: (1) create a single reliable source for facility data; (2) improve transparency with oversight organizations; (3) streamline FMOC business processes with a single, integrated facility-management tool; (4) give customers simple tools and real-time information; (5) reduce indirect costs; (6) replace approximately 30 FMOC systems and 60 homegrown tools (such as Microsoft Access databases); and (7) integrate with FIMS.
Modeling the interaction of dislocations with internal boundaries and free surfaces is essential to understanding the effect of material microstructure on dislocation motion. However, discrete dislocation dynamics methods rely on infinite-domain solutions of dislocation fields, which makes modeling of heterogeneous materials difficult. A finite-domain dislocation dynamics capability is under development that resolves both the dislocation array and the polycrystalline structure in a compatible manner so that free surfaces and material interfaces are easily treated. In this approach the polycrystalline structure is accommodated using the generalized finite element method (GFEM), and the displacement due to the dislocation array is added to the displacement approximation. Shown in figure 1 are representative results from simulations of randomly placed and oriented dislocation sources in a cubic nickel polycrystal. Each grain has a randomly assigned (unique) material basis, and available glide planes are chosen accordingly. The change in basis between neighboring grains has an important effect on the motion of dislocations, since the resolved shear on available glide planes can change dramatically. Dislocation transmission through high-angle grain boundaries is assumed to occur by absorption into the boundary and subsequent nucleation in the neighboring grain. Such behavior is illustrated in figure 1d. Nucleation from the vertically oriented source in the bottom right grain is due to local stresses from dislocation pile-up in the neighboring grain. In this talk, the method and implementation are presented, as well as some representative results from large-scale (i.e., massively parallel) simulations of dislocation motion in cubic nano-domain nickel alloy. Particular attention will be paid to the effect of grain size on polycrystalline strength.