Low-temperature co-fired ceramic (LTCC) materials technology offers a cost-effective and versatile approach to designing and manufacturing high-performance, reliable advanced microelectronic packages (e.g., for wireless communications). A critical issue in manufacturing LTCC microelectronics is the need to control shrinkage during sintering precisely and reproducibly. Master Sintering Curve (MSC) theory has been evaluated and successfully applied as a tool to predict and control LTCC sintering. Dilatometer sintering experiments were designed and completed to characterize the anisotropic sintering behavior of green LTCC materials formed by tape casting. The resulting master sintering curve generated from these data provides a means to predict density as a function of sintering time and temperature. The application of MSC theory to DuPont 951 Green Tape™ is demonstrated.
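As context for how an MSC condenses an arbitrary time-temperature history into a single variable, a minimal sketch of the work-of-sintering integral Θ(t, T(t)) is shown below; the activation energy and heating schedule are placeholder assumptions, not DuPont 951 values.

```python
import numpy as np

# Work-of-sintering integral used in MSC theory:
#   Theta(t, T) = integral_0^t (1/T) * exp(-Q / (R*T)) dt'
R = 8.314   # gas constant, J/(mol*K)
Q = 400e3   # apparent activation energy, J/mol (placeholder, not a DuPont 951 value)

def theta(times_s, temps_K, Q=Q):
    """Trapezoidal evaluation of the MSC work-of-sintering integral."""
    integrand = np.exp(-Q / (R * temps_K)) / temps_K
    return np.trapz(integrand, times_s)

# Hypothetical schedule: constant 5 K/min ramp from 300 K to 1123 K
rate = 5.0 / 60.0                                    # K/s
t = np.linspace(0.0, (1123.0 - 300.0) / rate, 2000)  # s
T = 300.0 + rate * t
print(f"Theta = {theta(t, T):.3e}")
```

Density is then read from an empirical fit of density versus log Θ constructed from the dilatometer runs, so any other firing schedule maps onto the same curve.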
In the past decade, advanced composite repair technology has made great strides in commercial aviation use. Extensive testing and analysis, through joint programs between the Sandia Labs FAA Airworthiness Assurance Center and the aviation industry, have proven that composite materials can be used to repair damaged aluminum structure. Successful pilot programs have produced a flight performance history that establishes the viability and durability of bonded composite patches as a permanent repair on commercial aircraft structures. With this foundation in place, efforts are underway to adapt bonded composite repair technology to civil structures. This paper presents a study of the application of composite patches on large trucks and hydraulic shovels typically used in mining operations. Extreme fatigue loading, temperature swings, erosion, and corrosion induce an array of equipment damage. The current weld repair techniques for these structures provide a fatigue life that is inferior to that of the original plate, so subsequent cracking must be repaired on a regular basis. It is believed that the use of composite doublers, which do not have the brittle fracture problems inherent in welds, will help extend the structure's fatigue life and reduce equipment downtime. Two of the main issues in adapting aircraft composite repairs to civil applications are developing an installation technique for carbon steel structure and accommodating large repairs on extremely thick structures. This paper focuses on the first phase of this study, which evaluated the performance of different mechanical and chemical surface preparation techniques. The factors influencing the durability of composite patches in severe field environments are discussed along with related laminate design and installation issues.
It has recently been shown that local values of the conventional exchange energy per particle cannot be described by an analytic expansion in the density variation. Yet, it is known that the total exchange-correlation (XC) energy per particle does not show any corresponding nonanalyticity. Indeed, the nonanalyticity is here shown to be an effect of the separation into conventional exchange and correlation. We construct an alternative separation in which the exchange part is made well behaved by screening its long-ranged contributions, and the correlation part is adjusted accordingly. This alternative separation is as valid as the conventional one, and introduces no new approximations to the total XC energy. We demonstrate functional development based on this approach by creating and deploying a local-density-approximation-type XC functional. Hence, this work includes both the theory and the practical calculations needed to provide a starting point for an alternative approach towards improved approximations of the total XC energy.
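One common way to realize such a screened exchange, shown here purely as an illustration and not necessarily the exact construction used in this work, is the error-function partition of the Coulomb kernel:

```latex
% Split of the Coulomb interaction into a screened short-range piece and a
% long-range remainder; mu is a screening (range-separation) parameter.
\frac{1}{r} \;=\; \underbrace{\frac{\operatorname{erfc}(\mu r)}{r}}_{\text{short range}}
            \;+\; \underbrace{\frac{\operatorname{erf}(\mu r)}{r}}_{\text{long range}},
\qquad
E_{xc} \;=\; E_{x}^{\mathrm{sr}}[\mu] \;+\; \tilde{E}_{c}[\mu].
```

Whatever the precise screening chosen, the correlation part is redefined so that the sum reproduces the total XC energy exactly, which is the sense in which the alternative separation introduces no new approximation.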
A 1:4-scale model of a prestressed concrete containment vessel (PCCV), representative of a pressurized water reactor (PWR) plant in Japan, was constructed by NUPEC at Sandia National Laboratories from January 1997 through June 2000. Concurrently, Sandia instrumented the model with nearly 1500 transducers to measure strain, displacement, and forces in the model from prestressing through pressure testing. The limit state test of the PCCV model, culminating in functional failure (i.e., leakage through cracking and liner tearing), was conducted in September 2000 at Sandia National Laboratories. Inspection of the model and the data after the limit state test made clear that, other than the liner tearing and leakage, structural damage was limited to concrete cracking, and the overall structural response (displacements, rebar and tendon strains, etc.) was only slightly beyond yield. (Global hoop strains at the mid-height of the cylinder reached only 0.4%, approximately twice the yield strain of the steel.) To provide additional structural response data for comparison with inelastic response conditions, the PCCV model was filled nearly full with water and pressurized to 3.6 times the design pressure, at which point a catastrophic rupture occurred, preceded only briefly by the successive tensile failure of several hoop tendons. This paper summarizes the results of these tests.
The U.S. Department of Energy recently announced the first five grants for the Genomes to Life (GTL) Program. The goal of this program is to "achieve the most far-reaching of all biological goals: a fundamental, comprehensive, and systematic understanding of life." While more information about the program can be found at the GTL website (www.doegenomestolife.org), this paper provides an overview of one of the five GTL projects funded, "Carbon Sequestration in Synechococcus Sp.: From Molecular Machines to Hierarchical Modeling." This project is a combined experimental and computational effort that emphasizes developing, prototyping, and applying new computational tools and methods to elucidate the biochemical mechanisms of carbon sequestration in Synechococcus sp., an abundant marine cyanobacterium known to play an important role in the global carbon cycle. Understanding, predicting, and perhaps manipulating carbon fixation in the oceans has long been a major focus of biological oceanography and has more recently been of interest to a broader audience of scientists and policy makers. It is clear that the oceanic sinks and sources of CO2 are important terms in the global environmental response to anthropogenic atmospheric inputs of CO2 and that oceanic microorganisms play a key role in this response. However, the relationship between this global phenomenon and the biochemical mechanisms of carbon fixation in these microorganisms is poorly understood. The project includes five subprojects: an experimental investigation, three computational biology efforts, and a fifth that addresses computational infrastructure challenges relevant to this project and to the Genomes to Life program as a whole. Our experimental effort is designed to provide the biology and data to drive the computational efforts and includes significant investment in developing new experimental methods for uncovering protein partners, characterizing protein complexes, and identifying new binding domains. We will also develop and apply new data measurement and statistical methods for analyzing microarray experiments. Our computational efforts include coupling molecular simulation methods with knowledge discovery from diverse biological data sets for high-throughput discovery and characterization of protein-protein complexes, and developing a set of novel capabilities for inferring regulatory pathways in microbial genomes across multiple sources of information through the integration of computational and experimental technologies. These capabilities will be applied to Synechococcus regulatory pathways to characterize their interaction map and identify the component proteins in these pathways. We will also investigate methods for combining experimental and computational results with visualization and natural language tools to accelerate the discovery of regulatory pathways. Furthermore, given that the ultimate goal of this effort is to develop a systems-level understanding of how the Synechococcus genome affects carbon fixation at the global scale, we will develop and apply a set of tools for capturing the carbon fixation behavior of Synechococcus at different levels of resolution.
Finally, because the explosion of data produced by high-throughput experiments demands data analyses and models that are more computationally complex, more heterogeneous, and coupled to ever-increasing amounts of experimentally obtained data in varying formats, we have also established a companion computational infrastructure to support this effort as well as the Genomes to Life program as a whole.
The Big Hill salt dome, located in southeastern Texas, is home to one of four underground oil-storage facilities managed by the U.S. Department of Energy Strategic Petroleum Reserve (SPR) Program. Sandia National Laboratories, as the geotechnical advisor to the SPR, conducts site-characterization investigations and other longer-term geotechnical and engineering studies in support of the program. This report describes the conversion of two-dimensional geologic interpretations of the Big Hill site into three-dimensional geologic models. The new models include the geometry of the salt dome, the surrounding sedimentary units, mapped faults, and the 14 oil storage caverns at the site. This work provides a realistic and internally consistent geologic model of the Big Hill site that can be used in support of future work.
Developers of computer codes, analysts who use the codes, and decision makers who rely on the results of the analyses face a critical question: How should confidence in modeling and simulation be critically assessed? Verification and validation (V&V) of computational simulations are the primary methods for building and quantifying this confidence. Briefly, verification is the assessment of the accuracy of the solution to a computational model. Validation is the assessment of the accuracy of a computational simulation by comparison with experimental data. In verification, the relationship of the simulation to the real world is not an issue. In validation, the relationship between computation and the real world, i.e., experimental data, is the issue.
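As a generic illustration of one verification activity (not tied to any particular code), the observed order of convergence of a discretization can be checked against its formal order using solutions on three systematically refined grids; the values below are hypothetical.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from three grid levels with a constant
    refinement ratio r (Richardson-extrapolation-style check)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Hypothetical grid-convergence data for some scalar quantity of interest
p = observed_order(f_coarse=1.0480, f_medium=1.0120, f_fine=1.0030, r=2.0)
print(f"observed order ~ {p:.2f}")  # compare with the scheme's formal order
```

Validation, by contrast, would compare the converged prediction against experimental data, with the measurement uncertainty stated alongside the comparison.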
The Cretaceous strata that fill the San Juan Basin of northwestern New Mexico and southwestern Colorado were shortened in a generally north-south to north-northeast/south-southwest direction during the Laramide orogeny. This shortening was the result of compression of the strata between the southward indentation of the San Juan uplift at the north edge of the basin and the northward to northeastward indentation of the Zuni uplift from the south. At the same time, right-lateral strike-slip motion was concentrated at the eastern and western margins of the basin, forming the Hogback monocline and the Nacimiento uplift. Small amounts of shear may also have occurred along pre-existing basement faults within the basin. Vertical extension fractures, striking north-south to north-northeast/south-southwest (parallel to the Laramide maximum horizontal compressive stress) with local variations, formed in both the Mesaverde and Dakota sandstones under this system and are found in outcrops and in the subsurface. The less-mature Mesaverde sandstones typically contain relatively long and irregular vertical extension fractures, whereas the underlying quartzitic Dakota sandstones contain more numerous, shorter, sub-parallel, closely spaced extension fractures. Conjugate shear fractures in several orientations are also present locally in Dakota strata.
Laboratory measurements provide benchmark data for wavelength-dependent plasma opacities to assist inertial confinement fusion, astrophysics, and atomic physics research. There are several potential benefits to using z-pinch radiation for opacity measurements, including relatively large cm-scale lateral sample sizes and relatively long 3-5 ns experiment durations. These features enhance sample uniformity. The spectrally resolved transmission through a CH-tamped NaBr foil was measured. The z-pinch produced the X-rays for both the heating source and the backlight source. The (50±4) eV foil electron temperature and (3±1) × 10²¹ cm⁻³ foil electron density were determined by analysis of the Na absorption features. LTE and NLTE opacity model calculations of the n = 2 to 3, 4 transitions in bromine ionized into the M-shell are in reasonably good agreement with the data.
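For reference, the measured spectrally resolved transmission connects to the opacity through the usual attenuation law, with the areal density ρL of the NaBr layer taken as a known target parameter:

```latex
% Transmission T(nu) and its relation to the opacity kappa(nu)
% for a sample of areal density rho*L:
T(\nu) \;=\; \exp\!\left[-\kappa(\nu)\,\rho L\right]
\quad\Longrightarrow\quad
\kappa(\nu) \;=\; -\,\frac{\ln T(\nu)}{\rho L}.
```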
The properties of solid foams depend on their structure, which usually evolves in the fluid state as gas bubbles expand to form polyhedral cells. The characteristic feature of foam structure, randomly packed cells of different sizes and shapes, is examined in this article by considering soap froth. This material can be modeled as a network of minimal surfaces that divide space into polyhedral cells. The cell-level geometry of random soap froth is calculated with Brakke's Surface Evolver software. The distribution of cell volumes ranges from monodisperse to highly polydisperse. Topological and geometric properties, such as surface area and edge length, of the entire foam and of individual cells are discussed. The shape of struts in solid foams is related to Plateau borders in liquid foams and is calculated for different volume fractions of material. The models of soap froth are used as templates to produce finite element models of open-cell foams. Three-dimensional images of open-cell foams obtained with x-ray microtomography allow virtual reconstruction of skeletal structures that compare well with the Surface Evolver simulations of soap-froth geometry.
Because the entire flowfield is generally illuminated in microscopic particle image velocimetry (microPIV), determining the depth over which particles will contribute to the measured velocity is more difficult than in traditional, light-sheet PIV. This paper experimentally and computationally measures the influence that volume illumination, optical parameters, and particle size have on the depth of correlation for typical microPIV systems. First, it is demonstrated mathematically that the relative contribution to the measured velocity at a given distance from the object plane is proportional to the curvature of the local cross-correlation function at that distance. The depth of correlation is then determined in both the physical experiments and in computational simulations by directly measuring the relative contribution to the correlation function of particles located at a known separation from the object plane. These results are then compared with a previously derived analytical model that predicts the depth of correlation from the basic properties of the imaging system and seed particles used for the microPIV measurements. Excellent agreement was obtained between the analytical model and both computational and physical experiments, verifying the accuracy of the previously derived analytical model.
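For orientation, a widely cited analytical estimate of the depth of correlation, of the Olsen-and-Adrian type referenced in the comparison above, can be evaluated as in the sketch below; the optical parameters are illustrative, not those of the experiments reported here.

```python
import math

def depth_of_correlation(d_p, f_num, M, lam, eps=0.01):
    """Approximate depth of correlation (Olsen & Adrian-type estimate).
    d_p: particle diameter [m], f_num: f-number, M: magnification,
    lam: illumination wavelength [m], eps: relative contribution cutoff."""
    a = (1.0 - math.sqrt(eps)) / math.sqrt(eps)
    b = f_num**2 * d_p**2 + 5.95 * (M + 1.0)**2 * lam**2 * f_num**4 / M**2
    return math.sqrt(a * b)

# Illustrative values: 1 um particles, f/2 optics, 40x magnification, 532 nm light
z_corr = depth_of_correlation(d_p=1e-6, f_num=2.0, M=40.0, lam=532e-9)
print(f"total correlation depth 2*z_corr ~ {2 * z_corr * 1e6:.1f} um")
```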
A probabilistic, transient, three-phase model of chemical transport through human skin has been developed to assess the relative importance of uncertain parameters and processes during chemical exposure assessments and transdermal drug delivery. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes for a hypothetical scenario of chemical transport through the skin. At early times (60 seconds), the sweat ducts provided a significant amount of simulated mass flux into the bloodstream. At longer times (1 hour), diffusion through the stratum corneum became important because of its relatively large surface area. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times.
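The sampling-and-regression workflow can be sketched generically as below; the parameter names, distributions, and flux surrogate are placeholders rather than the study's actual three-phase transport model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Placeholder uncertain inputs (distributions are illustrative only)
D_sc   = 10**rng.uniform(-10, -8, n)   # stratum corneum diffusivity [m^2/s]
D_duct = 10**rng.uniform(-9, -7, n)    # sweat-duct (aqueous) diffusivity [m^2/s]
L      = rng.uniform(10e-6, 40e-6, n)  # stratum corneum thickness [m]

# Toy steady-flux surrogate standing in for the transient multiphase model
flux = D_sc / L + 1e-4 * D_duct

# Rank inputs by standardized regression coefficients, a simple stand-in
# for the stepwise linear regression sensitivity analysis
X = np.column_stack([np.log10(D_sc), np.log10(D_duct), L])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (flux - flux.mean()) / flux.std()
coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in zip(["log D_sc", "log D_duct", "L"], coef):
    print(f"{name}: SRC = {c:+.2f}")
```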
Thermally induced natural convection heat transfer in the annulus between horizontal concentric cylinders has been studied using the commercial code Fluent. The boundary layers are meshed all the way to the wall because forced-convection wall functions are not appropriate. Various one- and two-equation turbulence models have been considered. Overall and local heat transfer rates are compared with existing experimental data.
A series of experiments has been performed in the Sandia National Laboratories FLAME facility with a 2-meter-diameter JP-8 fuel pool fire. Sandia heat flux gages were employed to measure the incident flux at 8 locations outside the flame. Experiments were repeated to generate sufficient data for accurate confidence interval analysis. Additional sources of error are quantified and presented together with the data. The goal of this paper is to present these results in a way that is useful for validation of computer models that are capable of predicting heat flux from large fires. We anticipate using these data for comparison to validate models within the Advanced Simulation and Computing (ASC, formerly ASCI) codes FUEGO and SYRINX that predict fire dynamics and radiative transport through participating media. We present preliminary comparisons between existing models and experimental results.
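A minimal sketch of the small-sample confidence-interval calculation applied to repeated measurements at a single gage location is given below; the flux values are made up for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical repeated measurements of incident heat flux [kW/m^2]
# at one gage location (values are illustrative only).
q = np.array([52.1, 48.7, 55.3, 50.9, 53.6])

mean = q.mean()
sem = q.std(ddof=1) / np.sqrt(q.size)          # standard error of the mean
t_crit = stats.t.ppf(0.975, df=q.size - 1)     # 95% two-sided t critical value
print(f"mean = {mean:.1f} kW/m^2, 95% CI = +/- {t_crit * sem:.1f} kW/m^2")
```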
An improved model for the gas damping of out-of-plane motion of a microbeam is developed based on the Reynolds equation (RE). A boundary condition for the RE is developed that relates the pressure at the beam perimeter to the beam motion. The two coefficients in this boundary condition are determined from Navier-Stokes (NS) simulations with the slip boundary condition for small slip lengths (relative to the gap height) and from Direct Simulation Monte Carlo (DSMC) molecular gas dynamics simulations for larger slip lengths. This boundary condition significantly improves the accuracy of the RE for cases where the beam width is only slightly greater than the gap height.
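For context, one common statement of the isothermal compressible squeeze-film Reynolds equation that such damping models start from is given below; the improved perimeter boundary condition described above then replaces the usual ambient-pressure condition at the beam edge.

```latex
% Isothermal compressible squeeze-film Reynolds equation for gap height h(x,y,t),
% pressure p(x,y,t), and gas viscosity mu:
\nabla\!\cdot\!\left(\frac{p\,h^{3}}{12\,\mu}\,\nabla p\right)
  \;=\; \frac{\partial\,(p\,h)}{\partial t}.
```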
RMPP (Reliable Message Passing Protocol) is a lightweight transport protocol designed for clusters that provides end-to-end flow control and fault tolerance. This article compares RMPP to TCP, UDP, and "Utopia" on four benchmarks: bandwidth, latency, all-to-all, and communication-computation overlap. The results show that message-based protocols like RMPP have several advantages over TCP, including ease of implementation, support for computation/communication overlap, and low CPU overhead.
The general problem considered is an optimization problem involving product design where some initial data are available and computer simulation is to be used to obtain more information. Resources and system complexity together restrict the number of simulations that can be performed in search of optimal settings for the product parameters. Consequently, the levels of these parameters used in the simulations (the experimental design) must be selected in an efficient way. We describe an algorithmic 'response-modeling' approach for performing this selection. The algorithm is illustrated using a rolamite design application. We provide, as examples, optimal one-, two-, and three-point experimental designs for the rolamite computational analyses.
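The response-modeling algorithm itself is specific to this paper; as a generic stand-in, the sketch below picks a small maximin (space-filling) set of simulation settings from a candidate pool, simply to illustrate the kind of selection problem involved. It is not the authors' algorithm, and the parameter ranges are hypothetical.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

# Candidate settings for two normalized product parameters on [0, 1]^2
candidates = rng.uniform(size=(200, 2))

def maximin_design(cands, k, tries=2000):
    """Pick k candidate points whose minimum pairwise distance is largest
    (random-restart brute force; illustrative only)."""
    best, best_score = None, -1.0
    for _ in range(tries):
        pts = cands[rng.choice(len(cands), size=k, replace=False)]
        score = min(np.linalg.norm(a - b) for a, b in combinations(pts, 2))
        if score > best_score:
            best, best_score = pts, score
    return best

print(maximin_design(candidates, k=3))  # a 3-point space-filling design
```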