Composite materials are now used extensively in military and industrial applications; for example, the Boeing 787 is roughly 50% composite (mostly carbon-fiber-reinforced plastic) by weight. However, the weak delamination strength of fiber-reinforced composites under external impact, such as ballistic impact, has long posed a serious threat to passenger safety. Dynamic fracture toughness is a critical indicator of delamination resistance in such impact events. Quasi-static experimental techniques for fracture toughness are well developed; for example, the end-notched flexure (ENF) technique, illustrated in Fig. 1, has become a standard method for determining the mode-II fracture toughness of composites under quasi-static loading. Dynamic fracture characterization of composites, however, remains challenging, which has led to conflicting and confusing conclusions regarding strain-rate effects on the fracture toughness of composites.
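For reference, a common beam-theory data-reduction expression for the mode-II energy release rate of the ENF specimen is sketched below (one of several accepted schemes; the symbols are ours, not defined in this abstract). With applied load P, crack length a, specimen width b, arm half-thickness h, half-span L, and flexural modulus E,

$$G_{II} = \frac{9\,P^{2}a^{2}}{16\,E\,b^{2}h^{3}},$$

and the critical value of $G_{II}$ at crack advance is taken as the mode-II fracture toughness $G_{IIc}$.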
Liquid foams are viscoelastic liquids, exhibiting a fast relaxation attributed to local bubble motions and a slow response due to structural evolution of the intrinsically unstable system. In this work, these processes are examined in unique organic foams that differ from the typically investigated aqueous systems in two major ways: the organic foams (1) possess a much higher continuous-phase viscosity and (2) exhibit a coarsening response that involves coalescence of cells. The transient and dynamic relaxation responses of the organic foams are evaluated and discussed in relation to the response of aqueous foams. The change in the foam response with increasing gas fraction, from that of a Newtonian liquid to one that is strongly viscoelastic, is also presented. In addition, the temporal dependencies of the linear viscoelastic response are assessed in the context of the foam structural evolution. These foams and characterization techniques provide a basis for testing stabilization mechanisms in epoxy-based foams for encapsulation applications.
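As a point of reference only (an idealization we introduce, not a model from this work), two well-separated relaxation processes are often represented by a two-mode Maxwell fluid, whose dynamic moduli are

$$G'(\omega)=\sum_{i=1}^{2} G_i\,\frac{(\omega\tau_i)^2}{1+(\omega\tau_i)^2},\qquad G''(\omega)=\sum_{i=1}^{2} G_i\,\frac{\omega\tau_i}{1+(\omega\tau_i)^2},$$

with the short time $\tau_1$ associated with local bubble motions and the long time $\tau_2$ with structural evolution.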
Alkali nitrate eutectic mixtures are finding application as industrial heat transfer fluids in concentrated solar power generation systems. An important property for such applications is the melting point, or phase coexistence temperature. We have computed melting points for lithium, sodium, and potassium nitrate from molecular dynamics simulations using a recently developed method, which uses thermodynamic integration to compute the free energy difference between the solid and liquid phases. The computed melting point for NaNO3 was within 15 K of its experimental value, while for LiNO3 and KNO3 the computed melting points were within 100 K of the experimental values [4]. We are currently extending the approach to calculate melting temperatures for binary mixtures of lithium and sodium nitrate.
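The underlying relation is standard thermodynamic integration, stated here in generic form (the specific integration path of the cited method is not given in this abstract): the free-energy difference between two systems coupled through a parameter $\lambda$ is

$$\Delta A = \int_0^1 \left\langle \frac{\partial U(\lambda)}{\partial \lambda} \right\rangle_{\lambda} d\lambda,$$

and the melting point $T_m$ is located as the temperature at which the solid and liquid free energies cross, $\Delta G_{s\rightarrow l}(T_m)=0$.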
The annual program report provides detailed information about all aspects of the SNL/CA Pollution Prevention Program for a given calendar year. It functions as supporting documentation to the SNL/CA Environmental Management System Program Manual. The program report describes the activities undertaken during the past year, and the activities planned for future years, to implement the Pollution Prevention Program, one of six programs that support environmental management at SNL/CA.
Development of silicon enhancement-mode nanostructures for solid-state quantum computing will be described. A primary motivation of this research is the recent unprecedented manipulation of single electron spins in GaAs quantum dots, which has been used to demonstrate a quantum bit. Long spin decoherence times are predicted to be possible in silicon qubits. This talk will focus on silicon enhancement-mode quantum dot structures that emulate the GaAs lateral quantum dot qubit but use an enhancement-mode field-effect transistor (FET) structure. One critical concern for silicon quantum dots that use oxides as insulators in the FET structure is that defects in the metal-oxide-semiconductor (MOS) stack can produce both detrimental electrostatic and paramagnetic effects on the qubit. Understanding the implications of defects in the Si MOS system is also relevant for other qubit architectures that have nearby dielectric-passivated surfaces. Stable, lithographically defined, single-period Coulomb blockade and single-electron charge sensing in a quantum dot nanostructure using a MOS stack will be presented. A combination of defect characterization, modeling, and consideration of modified approaches that incorporate SiGe or donors provides guidance about the enhancement-mode MOS approach for future qubits and quantum-circuit microarchitecture.
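For context (textbook relations we add here, not results of this work), single-period Coulomb blockade reflects the dot's charging energy and gate capacitance, which set the energy cost of adding one electron and the gate-voltage period of the conductance oscillations:

$$E_C = \frac{e^{2}}{2C_{\Sigma}}, \qquad \Delta V_g = \frac{e}{C_g},$$

where $C_{\Sigma}$ is the total dot capacitance and $C_g$ the dot-gate capacitance.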
The microscopic Polymer Reference Interaction Site Model theory has been applied to spherical and rodlike fillers dissolved in three types of chemically heterogeneous polymer melts: an alternating AB copolymer, random AB copolymers, and an equimolar blend of two homopolymers. In each case, one monomer species adsorbs more strongly on the filler, mimicking a specific attraction, while all inter-monomer potentials are hard core, which precludes macrophase or microphase separation. Qualitative differences in the filler potential of mean force are predicted relative to the homopolymer case. The adsorbed bound layer for alternating copolymers exhibits a spatial modulation, or layering, effect but is otherwise similar to that of the homopolymer system. Random copolymers and the polymer blend mediate a novel strong, long-range bridging interaction between fillers at moderate to high adsorption strengths. The bridging strength is a non-monotonic function of random copolymer composition, reflecting subtle competing enthalpic and entropic considerations.
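For reference, the filler potential of mean force discussed here is related in the standard way to the filler-filler pair correlation function $g(r)$ (a textbook identity; the notation is ours):

$$W(r) = -k_B T \ln g(r).$$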
The low-energy properties of the Anderson model for a single impurity coupled to two leads are studied using the GW approximation. We find that quantities such as the spectral function at zero temperature, the linear-response conductance as a function of temperature, and the differential conductance as a function of bias voltage exhibit universal scaling behavior in the Kondo regime. We show how the form of the GW scaling functions relates to the form of the scaling functions obtained from the exact solution at equilibrium. We also compare the energy scale that enters the GW scaling functions with the exact Kondo temperature, over a broad range of Coulomb interaction strengths in the asymptotic regime. This analysis helps clarify an open question in the literature, namely whether or not the GW solution captures the Kondo resonance.
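For completeness, the single-impurity Anderson model referred to here has the standard form (our notation; the two leads are labeled $\alpha = L, R$):

$$H = \sum_{k\alpha\sigma} \epsilon_{k\alpha}\, c^{\dagger}_{k\alpha\sigma} c_{k\alpha\sigma} + \epsilon_d \sum_{\sigma} d^{\dagger}_{\sigma} d_{\sigma} + U\, n_{d\uparrow} n_{d\downarrow} + \sum_{k\alpha\sigma} \bigl( V_{k\alpha}\, c^{\dagger}_{k\alpha\sigma} d_{\sigma} + \mathrm{h.c.} \bigr),$$

with impurity level $\epsilon_d$, on-site Coulomb repulsion $U$, and impurity-lead hybridization $V_{k\alpha}$.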
FRMAC was born out of circumstances 25 years ago, when 17 federal agencies descended on the states with good intentions during the Three Mile Island nuclear power plant incident. It quickly became evident that a better way was needed to support state and local governments in their time of emergency and through the recovery process. FRMAC's single voice of Federal support coordinates the multiple agencies that respond to a radiological event. Over the years, FRMAC has exercised, evaluated, and honed its ability to respond quickly to the needs of our communities. As the times have changed, FRMAC has expanded its focus from nuclear power plant incidents to threats of a terrorist radiological dispersal device (RDD), and to the unthinkable - an improvised nuclear device (IND). Just as having the right tools is part of any trade, FRMAC's tool set has evolved and continues to evolve to meet contemporary challenges - not just to reduce the time it takes to collect data and assess the situation, but to provide a comprehensive, high-quality product that supports a stressed decision maker responsible for the protection of the public. Innovations in the movement of data and information have changed our everyday lives, and FRMAC is likewise capitalizing on industry innovations to improve the flow of information: from the early predictive models, to streamlining the process of getting data out of the field, to reducing the time it takes to get assessed products into the hands of decision makers. FRMAC is focusing on the future through the digital age of electronic data processing. Public protective action and dose avoidance is the challenge.
In many industrial processes, gaseous moisture is undesirable because it can lead to metal corrosion, polymer degradation, and other materials aging processes. However, generating and measuring precise moisture concentrations is challenging due to the need to cover a broad concentration range (parts per billion to percent) and the affinity of moisture for a wide range of surfaces and materials. This document discusses the techniques employed by the Mass Spectrometry Laboratory of the Materials Reliability Department at Sandia National Laboratories to generate and measure known gaseous moisture concentrations. It highlights the use of a chilled mirror and a primary-standard humidity generator for the characterization of aluminum oxide moisture sensors. The data presented show excellent agreement in the frost points measured by the two instruments, and thus establish an accurate and reliable platform for characterizing moisture sensors and performing other moisture-related experiments.
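To illustrate the kind of conversion such measurements rest on (our sketch, not the laboratory's procedure), a frost-point reading can be turned into a water-vapor mole fraction using a Magnus-type saturation-vapor-pressure fit. The coefficients below are the common Alduchov-Eskridge values over ice, and the total pressure is an assumed parameter.

```python
import math

def frost_point_to_ppmv(t_frost_c: float, p_total_hpa: float = 1013.25) -> float:
    """Estimate water-vapor concentration (ppmv) from a frost point in deg C.

    Uses a Magnus-type fit for saturation vapor pressure over ice
    (Alduchov-Eskridge coefficients); accuracy degrades at very low frost points.
    """
    # Saturation vapor pressure over ice at the frost point, in hPa.
    e_ice = 6.1115 * math.exp(22.452 * t_frost_c / (272.55 + t_frost_c))
    # Mole fraction of water vapor, expressed in parts per million by volume.
    return 1e6 * e_ice / p_total_hpa

# Example: a -60 C frost point at 1 atm corresponds to roughly 10 ppmv.
print(round(frost_point_to_ppmv(-60.0), 1))
```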
The goal of z-pinch inertial fusion energy (IFE) is to extend the single-shot z-pinch inertial confinement fusion (ICF) results on Z to a repetitive-shot z-pinch power plant concept for the economical production of electricity. Z produces up to 1.8 MJ of x-rays at powers as high as 230 TW. Recent target experiments on Z have demonstrated capsule implosion convergence ratios of 14-21 with a double-pinch-driven target, and DD neutron yields up to 8 x 10^10 with a dynamic hohlraum target. For z-pinch IFE, a power plant concept is discussed that uses high-yield IFE targets (3 GJ) at a low repetition rate per chamber (0.1 Hz). The concept includes a repetitive driver operating at 0.1 Hz, a Recyclable Transmission Line (RTL) to connect the driver to the target, high-yield targets, and a thick-liquid-wall chamber. Recent funding of $4M for FY04 from a U.S. Congressional initiative is supporting research on RTLs, repetitive pulsed-power drivers, shock mitigation, planned full-RTL-cycle experiments, high-yield IFE targets, and z-pinch power plant technologies. Recent results of research in all of these areas are discussed, and a Road Map for Z-Pinch IFE is presented.
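As a simple consistency check (our arithmetic, not a figure quoted in the abstract), the stated yield and repetition rate imply an average fusion power per chamber of

$$P_{\text{avg}} = Y f = 3\ \text{GJ} \times 0.1\ \text{Hz} = 300\ \text{MW}.$$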
This paper discusses the implications and appropriate treatment of systematic uncertainty in experiments and modeling. Systematic uncertainty exists when experimental conditions, measurement bias errors, and/or bias contributed by post-processing the data are constant over the set of experiments, but the particular values of the conditions and/or biases are unknown to within some specified uncertainty. Unlike random uncertainty, which is revealed when multiple experiments are performed, systematic uncertainty does not automatically show up in the output data. Therefore, the output data must be properly 'conditioned' to reflect important sources of systematic uncertainty in the experiments. In industrial-scale experiments, the systematic uncertainty in experimental conditions (especially boundary conditions) is often large enough that the inference error on how the experimental system maps inputs to outputs is quite substantial. Any such inference error, and the uncertainty thereof, also has implications for model validation and calibration/conditioning; ignoring systematic uncertainty in experiments can lead to 'Type X' error in these procedures. Apart from any considerations of modeling and simulation, reporting of uncertainty associated with experimental results should include the effects of any significant systematic uncertainties in the experiments. This paper describes and illustrates the treatment of multivariate systematic uncertainties of interval and/or probabilistic nature, and combined cases. The paper also outlines a practical and versatile 'real-space' framework and methodology within which experimental and modeling uncertainties (correlated and uncorrelated, systematic and random, aleatory and epistemic) are treated to mitigate risk in model validation, calibration/conditioning, hierarchical modeling, and extrapolative prediction.
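A toy numerical illustration of the central point (ours, not from the paper): replicate experiments reveal random scatter but are silent about a bias shared by every experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

true_value = 10.0
bias = 0.5   # systematic offset shared by every experiment (unknown in practice)
sigma = 0.2  # random, per-experiment measurement noise

# Ten replicate experiments: each sees the same bias plus fresh random noise.
measurements = true_value + bias + rng.normal(0.0, sigma, size=10)

print(f"sample mean:   {measurements.mean():.3f}")       # ~10.5, off by the bias
print(f"sample stddev: {measurements.std(ddof=1):.3f}")  # ~0.2, reflects only the noise
```

The scatter correctly estimates the random component, while the systematic offset is invisible to replication, which is why the output data must be separately 'conditioned' for it.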
The goals of this project are to understand the fundamental principles that govern the formation and function of novel nanoscale and nanocomposite materials. Specific scientific issues being addressed include: design and synthesis of complex molecular precursors with controlled architectures; controlled synthesis of nanoclusters and nanoparticles; development of robust, two- or three-dimensionally ordered nanocomposite materials with integrated functionalities that can respond to internal or external stimuli through specific molecular interactions or phase transitions; fundamental understanding of molecular self-assembly mechanisms on multiple length scales; and fundamental understanding of the transport, electronic, optical, magnetic, catalytic, and photocatalytic properties that derive from nanoscale phenomena and unique surface and interfacial chemistry, in support of DOE's energy mission.
With the Lemnos framework, interoperability of control-system security equipment is straightforward. To obtain interoperability between proprietary security appliances today, one or both vendors must write cumbersome 'translation code,' and if one party changes something, the translation code breaks. The Lemnos project is developing and testing a framework that uses widely available open-source security functions and protocols: IPsec, to form a secure communications channel, and Syslog, to exchange security log messages. Using this model, security appliances from two or more vendors can clearly and securely exchange information, helping to better protect the total system. The framework can also simplify regulatory compliance in a complicated security environment. As an electric utility struggling to implement the NERC CIP standards and other regulations, are you weighing the misery of multiple management interfaces against committing to a ubiquitous single-vendor solution? When vendors build their security appliances to interoperate using the Lemnos framework, it becomes practical to match best-of-breed offerings from an assortment of vendors to your specific control systems needs.
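A minimal sketch of the Syslog half of this exchange (our illustration, not Lemnos reference code; it assumes an IPsec tunnel is already in place between the appliances, and the hostname and port are placeholders), using the Python standard library:

```python
import logging
import logging.handlers

# Assume an IPsec tunnel already secures the path to the peer appliance;
# Syslog then carries the security event messages over that channel
# (UDP port 514 is the traditional transport -- placeholder address below).
handler = logging.handlers.SysLogHandler(address=("peer-appliance.example", 514))
log = logging.getLogger("lemnos-demo")
log.addHandler(handler)
log.setLevel(logging.INFO)

# A security event that any interoperating appliance could parse from the channel.
log.info("firewall: dropped inbound connection from 192.0.2.10 to tcp/502")
```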
The internal structure of stars depends on the radiative opacity of the stellar matter. However, opacity models have never been experimentally tested at the conditions that exist inside stars. Experiments at the Sandia Z facility are underway to measure the x-ray transmission of iron, an important stellar constituent, at temperatures and densities high enough to evaluate the physical underpinnings of stellar opacity models. Initial experiments provided information on the charge-state distribution and the energy-level structure of the iron ions that exist at the solar radiation/convection boundary. Data analysis and new experiments at higher densities and temperatures will be described.
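The measured transmission connects to the opacity through the usual attenuation relation (standard radiative-transfer notation, added here for clarity):

$$T_{\nu} = e^{-\kappa_{\nu}\,\rho\,L},$$

where $\kappa_{\nu}$ is the spectral opacity (cm^2/g), $\rho$ the mass density, and $L$ the sample thickness.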
Changing paradigms from paper laboratory notebooks to electronic ones creates challenges. Meeting regulatory requirements in an R&D environment drives thorough documentation. Creating complete experimental records is easier using electronic laboratory notebooks (ELNs). Supporting investigations through re-creating experimental conditions is greatly facilitated using an ELN.
Pandemic influenza has become a serious global health concern; in response, governments around the world have allocated increasing funds to containment of public health threats from this disease. Pandemic influenza is also recognized to have serious economic implications: the illness causes absenteeism and reduced worker productivity, and therefore reduced economic output, while the associated mortality robs nations of their most valuable asset, human resources. This paper reports two studies that investigate both the short- and long-term economic implications of a pandemic influenza outbreak; the resulting policy implications are also discussed. Policy makers can use the growing number of economic impact estimates to decide how much to spend to combat pandemic influenza outbreaks. The research uses the Regional Economic Models, Inc. (REMI) Policy Insight + model, which provides a dynamic, regional, North American Industry Classification System (NAICS) industry-structured framework for forecasting. It is supported by a population dynamics model that is well adapted to investigating the macroeconomic implications of pandemic influenza, including possible demand-side effects. The studies reported in this paper exercise all of these capabilities.
This document provides common best practices for the efficient utilization of parallel file systems by analysts and application developers. A multi-program, parallel supercomputer provides effective compute power by aggregating a host of lower-power processors over a network. In general, one either constructs the application to distribute parts of the work to the different nodes and processors available and then collects the result (a parallel application), or one launches a large number of small jobs, each doing similar work on different subsets (a campaign). The I/O system on these machines is usually implemented as a tightly coupled parallel application itself, one that presents the host applications with the concept of a 'file': an addressable store of bytes whose address space is global in nature. In essence, it provides a global address space. Beyond the simple reality that the I/O system is normally composed of a small, less capable collection of hardware, that global address space will cause problems if not very carefully utilized. How severe the problems are and how they manifest will differ, but that the approach is problem-prone has been well established. Worse, the file system is a shared resource on the machine - a system service. What an application does when it uses the file system impacts all users. No portion of the available resource is reserved for a given application; instead, the I/O system responds to requests by scheduling and queuing based on instantaneous demand. Using the system well contributes to the overall throughput of the machine, and from a purely self-centered perspective, it reduces the time that the application or campaign is subject to impact by others. The developer's goal should be to accomplish I/O in a way that minimizes interaction with the I/O system, maximizes the amount of data moved per call, and provides the I/O system the most information about the transfer per request, as sketched below.
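A minimal sketch of that closing guideline (our illustration; the record and batch sizes are arbitrary): aggregate many small records in memory and move them with one large call, rather than issuing one file-system request per record.

```python
import io

RECORD = b"x" * 512   # a small record; illustrative size
N_RECORDS = 4096

# Anti-pattern: one file-system request per small record.
with open("slow.dat", "wb", buffering=0) as f:
    for _ in range(N_RECORDS):
        f.write(RECORD)          # 4096 tiny writes hit the shared I/O system

# Preferred: aggregate in memory, then move the data in one large call.
buf = io.BytesIO()
for _ in range(N_RECORDS):
    buf.write(RECORD)            # buffering costs memory, not I/O requests
with open("fast.dat", "wb") as f:
    f.write(buf.getvalue())      # a single 2 MiB write
```

The same principle motivates collective and buffered I/O layers in parallel applications: fewer, larger, better-described requests give the shared I/O system the information it needs to schedule efficiently.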