Publications

A non-local, ordinary-state-based viscoelasticity model for peridynamics

Mitchell, John A.

A non-local, ordinary-state-based, peridynamics viscoelasticity model is developed. In this model, viscous effects are added to deviatoric deformations and the bulk response remains elastic. The model uses internal state variables and is conceptually similar to linearized isotropic viscoelasticity in the local theory. The modulus state, which is used to form the Jacobian matrix in Newton-Raphson algorithms, is presented. The model is shown to satisfy the second law of thermodynamics and is applicable to problems in solid continuum mechanics where fracture and rate effects are important; it inherits all the advantages for modeling fracture associated with peridynamics. By combining this work with the previously published ordinary-state-based plasticity model, the model may be amenable to viscoplasticity problems where plasticity and rate effects are simultaneously important. The model may also be extended to include viscous effects for spherical deformations. The latter two extensions are not presented here and may be the subject of future work.
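
For orientation, the local-theory analogue mentioned above can be written in the standard internal-state-variable form of linear isotropic viscoelasticity; the sketch below is illustrative only and is not the peridynamic force-state formulation developed in the report:

    % Illustrative local-theory analogue: elastic bulk response plus a
    % Prony-series relaxation of the deviatoric stress s(t), with internal
    % state variables q_i tracking each relaxation mechanism.
    \[
      \sigma_{kk} = 3K\,\epsilon_{kk}, \qquad
      s(t) = 2G_\infty\, e(t) + \sum_{i=1}^{N} q_i(t), \qquad
      \dot{q}_i + \frac{q_i}{\tau_i} = 2G_i\,\dot{e}(t),
    \]

where K is the bulk modulus, e the deviatoric strain, and G_i, tau_i the shear moduli and relaxation times of the Prony series; the viscous contribution appears only in the deviatoric response, matching the split described above.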

Nanomanufacturing: nano-structured materials made layer-by-layer

Schunk, Randy; Grest, Gary S.; Chandross, M.; Reedy, Earl D.; Cox, James C.; Fan, Hongyou F.; Roberts, Scott A.

Large-scale, high-throughput production of nano-structured materials (i.e., nanomanufacturing) is a strategic area in manufacturing, with markets projected to exceed $1T by 2015. Nanomanufacturing is still in its infancy; process/product developments are costly and only touch on potential opportunities enabled by growing nanoscience discoveries. The greatest promise for high-volume manufacturing lies in age-old coating and imprinting operations. For materials with tailored nm-scale structure, imprinting/embossing must be achieved at high speeds (roll-to-roll) and/or over large areas (batch operation) with feature sizes less than 100 nm. Dispersion coatings with nanoparticles can also tailor structure through self- or directed-assembly. Layered films structured with these processes have tremendous potential for efficient manufacturing of microelectronics, photovoltaics, and other topical nano-structured devices. This project is designed to perform the requisite R&D to bring Sandia's technology base in computational mechanics to bear on this scale-up problem. Project focus is enforced by addressing a promising imprinting process currently being commercialized.

Comparison of binary collision approximation and molecular dynamics for displacement cascades in GaAs

Foiles, Stephen M.

The predictions of binary collision approximation (BCA) and molecular dynamics (MD) simulations of displacement cascades in GaAs are compared. There are three issues addressed in this work. The first is the optimal choice of the effective displacement threshold to use in the BCA calculations to obtain the best agreement with MD results. Second, the spatial correlations of point defects are compared. This is related to the level of clustering that occurs for different types of radiation. Finally, the size and structure of amorphous zones seen in the MD simulations are summarized. BCA simulations are not able to predict the formation of amorphous material.

Nanoparticle modifications of photodefined nanostructures for energy applications

Burckel, David B.; Wheeler, David R.; Washburn, Cody M.; Brozik, Susan M.

The advancement of materials technology towards the development of novel 3D nanostructures for energy applications has been a long-standing challenge. The purpose of this project was to explore photolithographically definable pyrolyzed photoresist carbon films for possible energy applications. The key attributes that we explored were as follows: (1) photo-interferometric fabrication methods to produce highly porous (meso, micro, and nano) 3-D electrode structures, and (2) conducting polymer and nanoparticle-modification strategies on these structures to provide enhanced catalytic capabilities and increase conductivity. The resulting electrodes were then explored for specific applications towards possible use in battery and energy platforms.

Progress toward bridging from atomistic to continuum modeling to predict nuclear waste glass dissolution

Schultz, Peter A.

This report summarizes research performed for the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Subcontinuum and Upscaling Task. The work conducted focused on developing a roadmap to include molecular scale, mechanistic information in continuum-scale models of nuclear waste glass dissolution. This information is derived from molecular-scale modeling efforts that are validated through comparison with experimental data. In addition to developing a master plan to incorporate a subcontinuum mechanistic understanding of glass dissolution into continuum models, methods were developed to generate constitutive dissolution rate expressions from quantum calculations, force field models were selected to generate multicomponent glass structures and gel layers, classical molecular modeling was used to study diffusion through nanopores analogous to those in the interfacial gel layer, and a micro-continuum model (KµC) was developed to study coupled diffusion and reaction at the glass-gel-solution interface.

Understanding the function and performance of carbon-enhanced lead-acid batteries: milestone report for the DOE Energy Storage Systems Program (FY11 Quarter 4: July through September 2011)

Enos, David E.; Ferreira, Summer R.

This report describes the status of research being performed under CRADA No. SC10/01771.00 (Lead/Carbon Functionality in VRLA Batteries) between Sandia National Laboratories and East Penn Manufacturing, conducted for the U.S. Department of Energy's Energy Storage Systems Program. The Quarter 4 Milestone was completed on time. The milestone entails the initiation of high-rate, partial-state-of-charge (HRPSoC) cycling of the carbon-enhanced batteries. The morphology, porosity, and porosity distribution within the plates after 1k and 10k cycles were documented, illustrating the changes that take place in the early life of the carbon-containing batteries and, for the control battery, as it approaches failure due to hard sulfation. Longer-term cycling on a subset of the received East Penn cells containing different carbons (and a control) continues and will progress into FY12. Carbon has been explored as an addition to lead-acid battery electrodes in a number of ways. Perhaps the most notable to date has been the hybrid 'Ultrabattery' developed by CSIRO, in which an asymmetric carbon-based electrochemical capacitor is combined with a lead-acid battery into a single cell, dramatically improving high-rate partial-state-of-charge (HRPSoC) operation. The 'Ultrabattery' is a hybrid device constructed using a traditional lead-acid battery positive plate (i.e., PbO2) and a negative electrode consisting of a carbon electrode in parallel with a lead-acid negative plate. This device exhibits a dramatically improved cycle life over traditional VRLA batteries, as well as increased charge power and charge acceptance. The 'Ultrabattery' has been produced successfully by both The Furukawa Battery Co. and East Penn Manufacturing. In addition to the aforementioned hybrid device, carbon has also been added directly to traditional VRLA batteries as an admixture in both the positive and negative plates, the latter of which has been found to result in similar improvements to battery performance under high-rate partial-state-of-charge (HRPSoC) operation. It is this latter construction, where carbon is added directly to the negative active material (NAM), that is the specific incarnation being evaluated through this program. Thus, the carbon-modified (or Pb-C) battery (termed the 'Advanced' VRLA battery by East Penn Manufacturing) is a traditional VRLA battery where an additional component has been added to the negative electrode during production of the negative plate. The addition of select carbon materials to the NAM of VRLA batteries has been demonstrated to increase cycle life by an order of magnitude or more under HRPSoC operation. Additionally, battery capacity increases on cycling and, in fact, exceeds the performance of the batteries when new.

Component evaluation testing and analysis algorithms

Merchant, Bion J.; Hart, Darren H.

The Ground-Based Monitoring R&E Component Evaluation project performs testing on the hardware components that make up Seismic and Infrasound monitoring systems. The majority of the testing is focused on the Digital Waveform Recorder (DWR), Seismic Sensor, and Infrasound Sensor. In order to guarantee consistency, traceability, and visibility into the results of the testing process, it is necessary to document the test and analysis procedures that are in place. Other reports document the testing procedures that are in place (Kromer, 2007). This document serves to provide a comprehensive overview of the analysis and the algorithms that are applied to the Component Evaluation testing. A brief summary of each test is included to provide the context for the analysis that is to be performed.

Robust automated knowledge capture

Trumbo, Michael C.; Haass, Michael J.; Adams, Susan S.; Hendrickson, Stacey M.; Abbott, Robert G.

This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications and, in particular, the individual characteristics that underlie adaptive thinking.

Fast neutron environments

Hattar, Khalid M.; Puskar, J.D.; Doyle, Barney L.; Boyce, Brad B.; Buchheit, Thomas E.; Foiles, Stephen M.; Lu, Ping L.; Clark, Blythe C.; Kotula, Paul G.; Goods, Steven H.

The goal of this LDRD project is to develop a rapid, first-order experimental procedure for testing advanced cladding materials that may be considered for Generation IV nuclear reactors. To investigate this, a technique was developed to expose coupons of potential materials to high displacement damage at elevated temperatures to simulate the neutron environment expected in Generation IV reactors. This was accomplished through high-temperature, high-energy heavy-ion implantation. The mechanical properties of the ion-irradiated region were tested by either micropillar compression or nanoindentation to determine the local properties as a function of implantation dose and exposure temperature. In order to directly compare the microstructural evolution and property degradation from the accelerated testing and classical neutron testing, 316L, 409, and 420 stainless steels were tested. In addition, two sets of diffusion couples of 316L and HT9 stainless steels with various refractory metals were examined. This study has shown that if the ion irradiation size scale is taken into consideration when developing and analyzing the mechanical property data, significant insight into the structural properties of the potential cladding materials can be gained in about a week.

Initial operating experience of the 1.2-MW La Ola photovoltaic system

Johnson, Jay; Schenkman, Benjamin L.; Ellis, Abraham E.; Quiroz, Jimmy E.

The 1.2-MW La Ola photovoltaic (PV) power plant in Lanai, Hawaii, has been in operation since December 2009. The host system is a small island microgrid with peak load of 5 MW. Simulations conducted as part of the interconnection study concluded that unmitigated PV output ramps had the potential to negatively affect system frequency. Based on that study, the PV system was initially allowed to operate with output power limited to 50% of nameplate to reduce the potential for frequency instability due to PV variability. Based on the analysis of historical voltage, frequency, and power output data at 50% output level, the PV system has not significantly affected grid performance. However, it should be noted that the impact of PV variability on active and reactive power output of the nearby diesel generators was not evaluated. In summer 2011, an energy storage system was installed to counteract high ramp rates and allow the PV system to operate at rated output. The energy storage system was not fully operational at the time this report was written; therefore, analysis results do not address system performance with the battery system in place.

Using triggered operations to offload rendezvous messages

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Barrett, Brian B.; Brightwell, Ronald B.; Hemmert, Karl S.; Wheeler, Kyle B.; Underwood, Keith D.

Historically, MPI implementations have had to choose between eager messaging protocols that require buffering and rendezvous protocols that sacrifice overlap and strong independent progress in some scenarios. The typical choice is to use an eager protocol for short messages and switch to a rendezvous protocol for long messages. If overlap and progress are desired, some implementations offer the option of using a thread. We propose an approach that leverages triggered operations to implement a long message rendezvous protocol that provides strong progress guarantees. The results indicate that a triggered operation based rendezvous can achieve better overlap than a traditional rendezvous implementation and less wasted bandwidth than an eager long protocol. © 2011 Springer-Verlag Berlin Heidelberg.
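
For contrast with the offloaded protocol proposed above, the sketch below shows a conventional host-driven rendezvous handshake written with mpi4py; it is an illustration of the baseline request-to-send / clear-to-send exchange only, not the triggered-operation implementation described in the paper, and the tag values and eager threshold are arbitrary choices for the example.

    # Minimal host-driven rendezvous sketch (illustration of the baseline
    # protocol only). Short messages are sent eagerly; long messages use a
    # request-to-send / clear-to-send handshake before the payload moves.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    EAGER_LIMIT = 4096                # bytes; arbitrary threshold for the example
    TAG_RTS, TAG_CTS, TAG_DATA = 10, 11, 12

    def send(buf, dest):
        if buf.nbytes <= EAGER_LIMIT:
            comm.Send(buf, dest=dest, tag=TAG_DATA)        # eager path
        else:
            comm.send(buf.nbytes, dest=dest, tag=TAG_RTS)  # request to send
            comm.recv(source=dest, tag=TAG_CTS)            # wait for clear to send
            comm.Send(buf, dest=dest, tag=TAG_DATA)        # payload

    def recv(nbytes, source):
        buf = np.empty(nbytes, dtype=np.uint8)
        if nbytes > EAGER_LIMIT:
            comm.recv(source=source, tag=TAG_RTS)          # sender announces size
            comm.send(None, dest=source, tag=TAG_CTS)      # receiver is ready
        comm.Recv(buf, source=source, tag=TAG_DATA)
        return buf

    if rank == 0:
        send(np.zeros(1 << 20, dtype=np.uint8), dest=1)
    elif rank == 1:
        recv(1 << 20, source=0)

Because every phase above is driven by the host, overlap requires polling or a progress thread; the triggered-operation approach described in the paper is aimed at letting the network make this progress without host involvement.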

Libhashckpt: Hash-based incremental checkpointing using GPUs

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Ferreira, Kurt; Riesen, Rolf; Brightwell, Ronald B.; Bridges, Patrick; Arnold, Dorian

Concern is beginning to grow in the high-performance computing (HPC) community regarding the reliability guarantees of future large-scale systems. Disk-based coordinated checkpoint/restart has been the dominant fault tolerance mechanism in HPC systems for the last 30 years. Checkpoint performance is so fundamental to scalability that nearly all capability applications have custom checkpoint strategies to minimize state and reduce checkpoint time. One well-known optimization to traditional checkpoint/restart is incremental checkpointing, which has a number of known limitations. To address these limitations, we introduce libhashckpt, a hybrid incremental checkpointing solution that uses both page protection and hashing on GPUs to determine changes in application data with very low overhead. Using real capability workloads, we show the merit of this technique for a certain class of HPC applications. © 2011 Springer-Verlag Berlin Heidelberg.
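
To make the change-detection idea concrete, here is a CPU-only sketch in Python; the block size and file layout are arbitrary, and the library itself additionally uses page protection and performs the hashing on a GPU, which is not reproduced here.

    # Hash-based incremental checkpoint sketch (CPU-only illustration of the
    # change-detection idea behind libhashckpt).
    import hashlib

    BLOCK = 4096  # bytes per hashed block; arbitrary for the example

    def block_hashes(data: bytes):
        return [hashlib.sha1(data[i:i + BLOCK]).digest()
                for i in range(0, len(data), BLOCK)]

    def incremental_checkpoint(data: bytes, prev_hashes, out):
        """Write only the blocks whose hash changed since the last checkpoint."""
        cur = block_hashes(data)
        for idx, h in enumerate(cur):
            if idx >= len(prev_hashes) or h != prev_hashes[idx]:
                out.write(idx.to_bytes(8, "little"))       # block index header
                out.write(data[idx * BLOCK:(idx + 1) * BLOCK])
        return cur  # becomes prev_hashes for the next checkpoint

    # usage sketch:
    #   hashes = []
    #   with open("ckpt.0", "wb") as f:
    #       hashes = incremental_checkpoint(application_state, hashes, f)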

Proactive defense for evolving cyber threats

Proceedings of 2011 IEEE International Conference on Intelligence and Security Informatics, ISI 2011

Colbaugh, Richard; Glass, Kristin

There is significant interest to develop proactive approaches to cyber defense, in which future attack strategies are anticipated and these insights are incorporated into defense designs. This paper considers the problem of protecting computer networks against intrusions and other attacks, and leverages the coevolutionary relationship between attackers and defenders to derive two new methods for proactive network defense. The first method is a bipartite graph-based machine learning algorithm which enables information concerning previous attacks to be "transferred" for application against novel attacks, thereby substantially increasing the rate with which defense systems can successfully respond to new attacks. The second approach involves exploiting basic threat information (e.g., from cyber security analysts) to generate "synthetic" attack data for use in training defense systems, resulting in network defenses that are effective against both current and (near) future attacks. The utility of the proposed methods is demonstrated by showing that they outperform standard techniques for the task of detecting malicious network activity in two publicly-available cyber datasets. © 2011 IEEE.

Electronic properties of vinylene-linked heterocyclic conducting polymers: Predictive design and rational guidance from DFT calculations

Journal of Physical Chemistry C

Wong, Bryan M.; Cordaro, Joseph G.

The band structure and electronic properties in a series of vinylene-linked heterocyclic conducting polymers are investigated using density functional theory (DFT). In order to accurately calculate electronic band gaps, we utilize hybrid functionals with fully periodic boundary conditions to understand the effect of chemical functionalization on the electronic structure of these materials. The use of predictive first-principles calculations coupled with simple chemical arguments highlights the critical role that aromaticity plays in obtaining a low band gap polymer. Contrary to some approaches which erroneously attempt to lower the band gap by increasing the aromaticity of the polymer backbone, we show that being aromatic (or quinoidal) in itself does not ensure a low band gap. Rather, an iterative approach which destabilizes the ground state of the parent polymer toward the aromatic ↔ quinoidal level crossing on the potential energy surface is a more effective way of lowering the band gap in these conjugated systems. Our results highlight the use of predictive calculations guided by rational chemical intuition for designing low band gap polymers in photovoltaic materials. © 2011 American Chemical Society.

Results from the First Set of Criticals in the Seven Percent Critical Experiment [Slides]

Harms, Gary A.; Ford, John T.

This presentation discusses the recent Sandia critical experiments, specifically the Seven Percent Critical Experiment (7uPCX), which is a Nuclear Energy Research Initiative (NERI) project. It also discusses why 7uPCX was used and how 7uPCX is operated, and presents some 7uPCX results. The presentation concludes by discussing future plans for the critical experiments.

Multi-dimensional optical and laser-based diagnostics of low-temperature ionized plasma discharges

Plasma Sources Science and Technology

Barnat, Edward V.

In this paper, a review of work centered on the utilization of multi-dimensional optical diagnostics to study phenomena arising in radiofrequency plasma discharges is given. The diagnostics range from passive techniques such as optical emission to more active techniques utilizing nanosecond lasers capable of both high temporal and spatial resolution. In this review, emphasis is placed on observations that would have been more difficult, if not impossible, to make without the use of such diagnostic techniques. Examples include the sheath structure around an electrode consisting of two different metals, double layers that arise in magnetized hydrogen discharges, or a large region of depleted argon 1s4 levels around a biased probe in an rf discharge.

Carotenoid distribution in living cells of Haematococcus pluvialis (Chlorophyceae)

PLoS ONE

Collins, Aaron M.; Jones, Howland D.; Han, Danxiang; Hu, Qiang; Beechem, Thomas E.; Timlin, Jerilyn A.

Haematococcus pluvialis is a freshwater unicellular green microalga belonging to the class Chlorophyceae and is of commercial interest for its ability to accumulate massive amounts of the red ketocarotenoid astaxanthin (3,3′-dihydroxy-β,β-carotene-4,4′-dione). Using confocal Raman microscopy and multivariate analysis, we demonstrate the ability to spectrally resolve resonance-enhanced Raman signatures associated with astaxanthin and β-carotene along with chlorophyll fluorescence. By mathematically isolating these spectral signatures, in turn, it is possible to locate these species independent of each other in living cells of H. pluvialis in various stages of the life cycle. Chlorophyll emission was found only in the chloroplast whereas astaxanthin was identified within globular and punctate regions of the cytoplasmic space. Moreover, we found evidence for β-carotene to be co-located with both the chloroplast and astaxanthin in the cytosol. These observations imply that β-carotene is a precursor for astaxanthin and the synthesis of astaxanthin occurs outside the chloroplast. Our work demonstrates the broad utility of confocal Raman microscopy to resolve spectral signatures of highly similar chromophores in living cells. © 2011 Collins et al.

Statistical mechanical foundation of the peridynamic nonlocal continuum theory: Energy and momentum conservation laws

Physical Review E - Statistical, Nonlinear, and Soft Matter Physics

Lehoucq, Richard B.; Sears, Mark P.

The purpose of this paper is to derive the energy and momentum conservation laws of the peridynamic nonlocal continuum theory using the principles of classical statistical mechanics. The peridynamic laws allow the consideration of discontinuous motion, or deformation, by relying on integral operators. These operators sum forces and power expenditures separated by a finite distance and so represent nonlocal interaction. The integral operators replace the differential divergence operators conventionally used, thereby obviating special treatment at points of discontinuity. The derivation presented employs a general multibody interatomic potential, avoiding the standard assumption of a pairwise decomposition. The integral operators are also expressed in terms of a stress tensor and heat flux vector under the assumption that these fields are differentiable, demonstrating that the classical continuum energy and momentum conservation laws are consequences of the more general peridynamic laws. An important conclusion is that nonlocal interaction is intrinsic to continuum conservation laws when derived using the principles of statistical mechanics. © 2011 American Physical Society.
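
For reference, the nonlocal balance of linear momentum that these derivations target is commonly written in the state-based form below (an illustrative statement of the peridynamic balance, not a reproduction of the paper's statistical-mechanics derivation):

    % State-based peridynamic balance of linear momentum: forces are summed
    % over a finite neighborhood H_x instead of appearing through a stress
    % divergence, so discontinuous deformations pose no special difficulty.
    \[
      \rho(\mathbf{x})\,\ddot{\mathbf{u}}(\mathbf{x},t)
        = \int_{\mathcal{H}_{\mathbf{x}}}
            \Big\{ \underline{\mathbf{T}}[\mathbf{x},t]\langle \mathbf{x}'-\mathbf{x}\rangle
                 - \underline{\mathbf{T}}[\mathbf{x}',t]\langle \mathbf{x}-\mathbf{x}'\rangle \Big\}\,
            dV_{\mathbf{x}'}
        + \mathbf{b}(\mathbf{x},t),
    \]

where T is the force state, H_x the finite neighborhood (horizon) of x, and b a body force density; the integral operator plays the role of the stress divergence in the classical local balance.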

Cyber Security Indications and Warning System (SV): CRADA 1573.94 Project Accomplishments Summary

Hu, Tan C.

As the national focus on cyber security increases, there is an evolving need for a capability to provide for high-speed sensing of events, correlation of events, and decision-making based on the adverse events seen across multiple independent large-scale network environments. The purpose of this Shared Vision project, Cyber Security Indications and Warning System, was to combine both Sandia's and LMC's expertise to discover new solutions to the challenge of protecting our nation's infrastructure assets. The objectives and scope of the proposal were limited to algorithm and High Performance Computing (HPC) model assessment in the unclassified environment within funding and schedule constraints. The interest is in the identification, scalability assessment, and applicability of currently utilized cyber security algorithms as applied in an HPC environment.

Backfilling with guarantees granted upon job submission

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Lindsay, Alexander M.; Galloway-Carson, Maxwell; Johnson, Christopher R.; Bunde, David P.; Leung, Vitus J.

In this paper, we present scheduling algorithms that simultaneously support guaranteed starting times and favor jobs with system-desired traits. To achieve the first of these goals, our algorithms keep a profile with potential starting times for every unfinished job and never move these starting times later, just as in Conservative Backfilling. To achieve the second, they exploit previously unrecognized flexibility in the handling of holes opened in this profile when jobs finish early. We find that, with one choice of job selection function, our algorithms can consistently yield a lower average waiting time than Conservative Backfilling while still providing a guaranteed start time to each job as it arrives. In fact, in most cases, the algorithms give a lower average waiting time than the more aggressive EASY backfilling algorithm, which does not provide guaranteed start times. Alternately, with a different choice of job selection function, our algorithms can focus the benefit on the widest submitted jobs, the reason for the existence of parallel systems. In this case, these jobs experience significantly lower waiting time than Conservative Backfilling with minimal impact on other jobs. © 2011 Springer-Verlag.
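
A minimal sketch of the guaranteed-start-time profile is given below. It is illustrative only: the reservation bookkeeping is simplified, and the paper's main flexibility, choosing which jobs to consider when an early finish opens a hole, is reduced to a pluggable select_key function.

    # Simplified conservative-backfilling profile: every job receives a
    # reserved start time at submission, and that reservation never moves
    # later. select_key orders the jobs reconsidered when a hole opens.
    class Profile:
        def __init__(self, total_procs):
            self.P = total_procs
            self.res = []                      # list of (start, end, procs, job_id)

        def _used(self, t):
            return sum(p for s, e, p, _ in self.res if s <= t < e)

        def _fits(self, t, dur, procs):
            # check capacity at t and at every reservation start inside [t, t+dur)
            points = [t] + [s for s, _, _, _ in self.res if t < s < t + dur]
            return all(self._used(x) + procs <= self.P for x in points)

        def reserve(self, job_id, dur, procs, now=0):
            # earliest candidate starts: now and the end of each reservation
            for t in sorted({now} | {e for _, e, _, _ in self.res if e > now}):
                if self._fits(t, dur, procs):
                    self.res.append((t, t + dur, procs, job_id))
                    return t                   # guaranteed start, never delayed

        def finish_early(self, job_id, now, select_key):
            # a job finished before its reserved end: release its slot, then
            # try to pull future reservations earlier (never later), visiting
            # them in select_key order; this is where selection policies act
            self.res = [r for r in self.res if r[3] != job_id]
            for old in sorted([r for r in self.res if r[0] >= now], key=select_key):
                s, e, p, j = old
                dur = e - s
                self.res.remove(old)
                t = self.reserve(j, dur, p, now=now)
                if t > s:                      # no improvement: keep the old slot
                    self.res.remove((t, t + dur, p, j))
                    self.res.append(old)

    profile = Profile(total_procs=64)
    print(profile.reserve("A", dur=10, procs=48))   # 0
    print(profile.reserve("B", dur=5, procs=32))    # 10 (does not fit beside A)
    print(profile.reserve("C", dur=3, procs=16))    # 0  (backfills beside A)
    # profile.finish_early("A", now=4, select_key=lambda r: -r[2])  # favor wide jobs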

Power handling and intermodulation distortion of contour-mode AlN MEMS resonators and filters

IEEE MTT-S International Microwave Symposium Digest

Nordquist, Christopher N.; Olsson, Roy H.

We report measurements of the power handling and intermodulation distortion of piezoelectric contour mode resonators and filters operating near 500 MHz. The output power capability scales as the inverse of the motional impedance squared, and the power handling of resonator filter circuits scales with the number of resonators combined in series and parallel. Also, the third-order intercept depends on the measurement tone spacing. Individual AlN resonators with 50 Ω motional impedance demonstrate output power capability of +10 dBm and OIP3 > +20 dBm, while an eight-resonator filter demonstrates output power handling of +14 dBm and an OIP3 > +32 dBm. © 2011 IEEE.

Influence of anisotropy on thermal boundary conductance at solid interfaces

Physical Review B - Condensed Matter and Materials Physics

Hopkins, Patrick E.; Beechem, Thomas E.; Duda, John C.; Hattar, Khalid M.; Ihlefeld, Jon I.; Rodriguez, M.A.; Piekos, Edward S.

We investigate the role of anisotropy on interfacial transport across solid interfaces by measuring the thermal boundary conductance from 100 to 500 K across Al/Si and Al/sapphire interfaces with different substrate orientations. The measured thermal boundary conductances show a dependency on substrate crystallographic orientation in the sapphire samples (trigonal conventional cell) but not in the silicon samples (diamond cubic conventional cell). The change in interface conductance in the sapphire samples is ascribed to anisotropy in the Brillouin zone along the principal directions defining the conventional cell. This leads to resultant phonon velocities in the direction of thermal transport that vary nearly 40% based on crystallographic direction. © 2011 American Physical Society.

An empirical relationship for extrapolating sparse experimental lap joint data

Journal of Applied Mechanics, Transactions ASME

Starr, Michael J.; Segalman, Daniel J.

Correctly incorporating the influence of mechanical joints in built-up mechanical systems is a critical element for model development for structural dynamics predictions. Quality experimental data are often difficult to obtain and are rarely sufficient to fully determine the parameters of relevant mathematical models. On the other hand, fine-mesh finite element (FMFE) modeling facilitates innumerable numerical experiments at modest cost. Detailed FMFE analysis of built-up structures with frictional interfaces reproduces trends among problem parameters found experimentally, but there are qualitative differences. Those differences are currently ascribed to the very approximate nature of the friction model available in most finite element codes. Though numerical simulations are insufficient to produce qualitatively correct behavior of joints, some relations, developed here through observations of a multitude of numerical experiments, suggest interesting relationships among joint properties measured under different loading conditions. These relationships can be generalized into forms consistent with data from physical experiments. One such relationship, developed here, expresses the rate of energy dissipation per cycle within the joint under various combinations of extensional and clamping load in terms of dissipation under other load conditions. The use of this relationship - though not exact - is demonstrated for the purpose of extrapolating a representative set of experimental data to span the range of variability observed from real data. © 2011 American Society of Mechanical Engineers.

A resilience assessment framework for infrastructure and economic systems: Quantitative and qualitative resilience analysis of petrochemical supply chains to a hurricane

Process Safety Progress

Vugrin, Eric D.; Warren, Drake E.; Ehlen, Mark E.

In recent years, the nation has recognized that critical infrastructure protection should consider not only the prevention of disruptive events but also the processes that infrastructure systems undergo to maintain functionality following disruptions. This more comprehensive approach has been termed critical infrastructure resilience. Given the occurrence of a particular disruptive event, the resilience of a system to that event is the system's ability to reduce efficiently both the magnitude and duration of the deviation from targeted system performance levels. Under the direction of the U. S. Department of Homeland Security's Science and Technology Directorate, Sandia National Laboratories has developed a comprehensive resilience assessment framework for evaluating the resilience of infrastructure and economic systems. The framework includes a quantitative methodology that measures resilience costs that result from a disruption to infrastructure function. The framework also includes a qualitative analysis methodology that assesses system characteristics affecting resilience to provide insight and direction for potential improvements. This article describes the resilience assessment framework and demonstrates the utility of the assessment framework through application to two hypothetical scenarios involving the disruption of a petrochemical supply chain by hurricanes. © 2011 American Institute of Chemical Engineers (AIChE).

Quantification of margins and uncertainties of complex systems in the presence of aleatoric and epistemic uncertainty

Reliability Engineering and System Safety

Urbina, Angel; Mahadevan, Sankaran; Paez, Thomas L.

Performance assessment of complex systems is ideally done through full system-level testing, which is seldom available for high consequence systems. Further, a reality of engineering practice is that some features of system behavior are not known from experimental data, but from expert assessment only. On the other hand, individual component data, which are part of the full system, are more readily available. The lack of system-level data and the complexity of the system lead to a need to build computational models of a system in a hierarchical or building-block approach (from simple components to the full system). The models are then used for performance prediction in lieu of experiments, to estimate the confidence in the performance of these systems. Central to this are the need to quantify the uncertainties present in the system and to compare the system response to an expected performance measure. This is the basic idea behind Quantification of Margins and Uncertainties (QMU). QMU is applied in decision making where there are many uncertainties caused by inherent variability (aleatoric) in materials, configurations, environments, etc., and by lack of information (epistemic) in models for deterministic and random variables that influence system behavior and performance. This paper proposes a methodology to quantify margins and uncertainty in the presence of both aleatoric and epistemic uncertainty. It presents a framework based on Bayes networks to use available data at multiple levels of complexity (i.e., components, subsystems, etc.) and demonstrates a method to incorporate epistemic uncertainty given in terms of intervals on a model parameter. © 2011 Elsevier Ltd. All rights reserved.
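
As a small illustration of the aleatory/epistemic separation the paper formalizes (not its Bayes-network machinery), the sketch below nests an aleatory sampling loop inside an outer loop over an epistemic interval on a model parameter and reports the resulting range of margin-to-uncertainty ratios. The response model, interval, requirement, and normalization are invented for the example.

    # Double-loop sketch: the outer loop samples an epistemically uncertain
    # parameter from its interval, the inner loop propagates aleatory
    # variability. All numbers below are hypothetical.
    import random, statistics

    REQUIREMENT = 10.0             # performance threshold (hypothetical)
    THETA_INTERVAL = (0.8, 1.2)    # epistemic interval on model parameter theta

    def response(theta, x):
        return theta * (6.0 + x)   # hypothetical system response model

    def margin_over_uncertainty(theta, n=2000):
        samples = [response(theta, random.gauss(0.0, 1.0)) for _ in range(n)]
        mean, sd = statistics.fmean(samples), statistics.stdev(samples)
        margin = REQUIREMENT - mean            # distance to the requirement
        return margin / (3.0 * sd)             # one common M/U normalization

    ratios = [margin_over_uncertainty(random.uniform(*THETA_INTERVAL))
              for _ in range(200)]
    print(f"M/U ranges over [{min(ratios):.2f}, {max(ratios):.2f}] "
          "across the epistemic interval")

Reporting the full range of ratios over the interval, rather than a single number, is what keeps the epistemic contribution visible in the result.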

Optimization subject to hidden constraints via statistical emulation

Pacific Journal of Optimization

Lee, Herbert K.H.; Gramacy, Robert B.; Linkletter, Crystal; Gray, Genetha A.

We present new methodology for constrained optimization based on building a combination of models, one for the objective function and one for the constraint region. We use a treed Gaussian process as a statistical emulator for the complex objective function, and a random forest to model the probability of meeting the constraints. By combining these models, we can guide the optimization search to promising areas in terms of both the objective function and the constraint. This approach avoids the problem of becoming stuck in a local mode, as well as being able to deal with unconnected viable regions. We demonstrate our methodology on a simulated problem and an example from hydrology. © 2011 Yokohama Publishers.
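
A compact version of the idea can be sketched with off-the-shelf components, as below. A plain Gaussian process stands in for the treed Gaussian process of the paper, the acquisition rule is a simple feasibility-weighted expected improvement, and the toy objective and hidden constraint are invented for the illustration.

    # Emulator-guided constrained search sketch: a GP models the objective,
    # a random forest models the probability that the hidden constraint is
    # met, and candidates are ranked by expected improvement times that
    # probability. (A plain GP is substituted for the paper's treed GP.)
    import numpy as np
    from scipy.stats import norm
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(0)
    f = lambda x: (x[:, 0] - 0.3) ** 2 + (x[:, 1] - 0.7) ** 2   # toy objective
    feasible = lambda x: x[:, 0] + x[:, 1] < 1.2                # hidden constraint

    X = rng.uniform(0, 1, (20, 2))                              # initial design
    ok, y = feasible(X), f(X)

    for _ in range(30):
        gp = GaussianProcessRegressor(normalize_y=True).fit(X[ok], y[ok])
        rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, ok)
        cand = rng.uniform(0, 1, (500, 2))
        mu, sd = gp.predict(cand, return_std=True)
        best = y[ok].min()
        z = (best - mu) / np.maximum(sd, 1e-9)
        ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)       # expected improvement
        p_ok = rf.predict_proba(cand)[:, list(rf.classes_).index(True)]
        x_new = cand[np.argmax(ei * p_ok)][None, :]
        X = np.vstack([X, x_new])
        ok = np.append(ok, feasible(x_new)[0])
        y = np.append(y, f(x_new)[0])

    print("best feasible point:", X[ok][np.argmin(y[ok])], "value:", y[ok].min())

Weighting the expected improvement by the predicted feasibility probability is what steers the search away from hidden-constraint violations while still allowing exploration of disconnected feasible regions.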

Derivative-free optimization via evolutionary algorithms guiding local search (EAGLS) for MINLP

Pacific Journal of Optimization

Griffin, J.D.; Fowler, K.R.; Gray, G.A.; Hemker, T.; Parno, M.D.

Derivative-free optimization approaches are commonly used for simulation-based design problems when objective function and possibly constraint evaluations have a black-box formulation. A variety of algorithms have been developed over the last several decades to address the inherent challenges such as computationally expensive function evaluations, low amplitude noise, nonsmoothness, nonconvexity, and disconnected feasible regions. Hybrid methods are emerging within the direct search community as new tools to overcome weaknesses while exploiting strengths of several methods working together. In this work, we extend the capabilities of a parallel implementation of the generating set search (GSS) method, which is a fast local derivative-free approach, to handle integer variables. This is achieved with a hybrid approach that uses a genetic algorithm (GA) to handle the integer variables. Promising points are selected as starting points for the GSS local search with the integer variables held fixed before being passed back to the GA for the standard selection, mutation and crossover operations for the next iteration. We provide promising numerical results on three mixed-integer problems: one based on the design of a compression spring, a simulation-based problem from hydrology, and a standard problem taken from the literature. © 2011 Yokohama Publishers.
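
The hybrid structure can be summarized in a short sketch: a toy genetic algorithm evolves the integer variables while a simple pattern search, standing in for the parallel GSS solver of the paper, refines the continuous variables with the integers held fixed. The test problem, bounds, and GA parameters are arbitrary.

    # Hybrid GA + local-search sketch (pattern search substitutes for GSS).
    import random

    def local_search(f, z, x, bounds, step=0.5, tol=1e-3):
        # coordinate pattern search on the continuous variables, z held fixed
        fx = f(z, x)
        while step > tol:
            improved = False
            for i in range(len(x)):
                for d in (step, -step):
                    y = list(x)
                    y[i] = min(max(y[i] + d, bounds[i][0]), bounds[i][1])
                    fy = f(z, y)
                    if fy < fx:
                        x, fx, improved = y, fy, True
            if not improved:
                step /= 2
        return x, fx

    def hybrid(f, int_bounds, cont_bounds, pop=8, gens=15):
        P = [[random.randint(lo, hi) for lo, hi in int_bounds] for _ in range(pop)]
        best = (float("inf"), None, None)
        for _ in range(gens):
            scored = []
            for z in P:
                x0 = [random.uniform(lo, hi) for lo, hi in cont_bounds]
                x, fx = local_search(f, z, x0, cont_bounds)   # refine promising point
                scored.append((fx, z))
                best = min(best, (fx, z, x))
            parents = [z for _, z in sorted(scored)[:pop // 2]]
            P = []
            while len(P) < pop:                 # crossover + integer mutation
                a, b = random.sample(parents, 2)
                child = [random.choice(g) for g in zip(a, b)]
                i = random.randrange(len(child))
                lo, hi = int_bounds[i]
                child[i] = min(max(child[i] + random.choice((-1, 1)), lo), hi)
                P.append(child)
        return best

    # toy mixed-integer problem: minimum near z0 = 3, x = (0.5, -1)
    obj = lambda z, x: (z[0] - 3) ** 2 + (x[0] - 0.5) ** 2 + (x[1] + 1) ** 2
    print(hybrid(obj, int_bounds=[(0, 10)], cont_bounds=[(-2, 2), (-2, 2)]))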

MapReduce in MPI for large-scale graph algorithms

Parallel Computing

Plimpton, Steven J.; Devine, Karen D.

We describe a parallel library written with message-passing (MPI) calls that allows algorithms to be expressed in the MapReduce paradigm. This means the calling program does not need to include explicit parallel code, but instead provides "map" and "reduce" functions that operate independently on elements of a data set distributed across processors. The library performs needed data movement between processors. We describe how typical MapReduce functionality can be implemented in an MPI context, and also in an out-of-core manner for data sets that do not fit within the aggregate memory of a parallel machine. Our motivation for creating this library was to enable graph algorithms to be written as MapReduce operations, allowing processing of terabyte-scale data sets on traditional MPI-based clusters. We outline MapReduce versions of several such algorithms: vertex ranking via PageRank, triangle finding, connected component identification, Luby's algorithm for maximally independent sets, and single-source shortest-path calculation. To test the algorithms on arbitrarily large artificial graphs we generate randomized R-MAT matrices in parallel; a MapReduce version of this operation is also described. Performance and scalability results for the various algorithms are presented for varying size graphs on a distributed-memory cluster. For some cases, we compare the results with non-MapReduce algorithms, different machines, and different MapReduce software, namely Hadoop. Our open-source library is written in C++, is callable from C++, C, Fortran, or scripting languages such as Python, and can run on any parallel platform that supports MPI. © 2011 Elsevier B.V. All rights reserved.
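
The calling convention described above, user-supplied map and reduce functions with the library handling data movement, can be illustrated with mpi4py. The sketch below is not the MapReduce-MPI library's actual API; it is a minimal shuffle built on an all-to-all exchange, shown counting vertex out-degrees of a distributed edge list.

    # Minimal MapReduce-over-MPI sketch (illustration only, not the library's
    # API). Each rank maps its local items to key/value pairs, an alltoall
    # exchange shuffles pairs to owner ranks by hashed key, and each rank
    # reduces the values it received.
    from collections import defaultdict
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    def mapreduce(local_items, map_fn, reduce_fn):
        outgoing = [[] for _ in range(size)]
        for item in local_items:                       # map phase
            for key, value in map_fn(item):
                outgoing[hash(key) % size].append((key, value))
        received = comm.alltoall(outgoing)             # shuffle phase
        groups = defaultdict(list)
        for chunk in received:
            for key, value in chunk:
                groups[key].append(value)
        return {k: reduce_fn(k, v) for k, v in groups.items()}   # reduce phase

    # example: out-degree of each vertex in a distributed edge list
    local_edges = [(rank, (rank + 1) % size), (rank, (rank + 2) % size)]
    degrees = mapreduce(local_edges,
                        map_fn=lambda edge: [(edge[0], 1)],
                        reduce_fn=lambda key, values: sum(values))
    print(rank, degrees)

An out-of-core variant, as the library supports, would spill the outgoing and received lists to files and exchange them in pages rather than holding everything in memory.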

Interaction between metamaterial resonators and inter-subband transitions in quantum wells

2011 Conference on Lasers and Electro-Optics: Laser Science to Photonic Applications, CLEO 2011

Gabbay, Alon; Reno, J.L.; Wendt, J.R.; Gin, Aaron G.; Wanke, Michael C.; Sinclair, Michael B.; Shaner, Eric A.; Brener, Igal

Interaction between metamaterial elements and intersubband transitions in GaAs/AlGaAs quantum wells is observed in the mid-infrared. Transmission measurements were performed through metamaterial arrays, each having a different resonance frequency. © 2011 OSA.

Barrier/bonding layers on bismuth telluride (Bi2Te3) for high temperature thermoelectric modules

Journal of Materials Science: Materials in Electronics

Lin, Wen P.; Wesolowski, Daniel E.; Lee, Chin C.

In this research, a fundamental study is conducted to identify the materials and develop the processes for producing a barrier/bonding composite on Bi2Te3 for high temperature thermoelectric applications. The composite must meet four basic requirements: (a) prevent interdiffusion between the electrode material, for our design silver (Ag), and Bi2Te3, (b) bond well to Bi2Te3, (c) bond well to the Ag electrode, and (d) not itself diffuse into Bi2Te3. The composites investigated include palladium (Pd), nickel/gold (Ni/Au), Ag, and titanium/gold (Ti/Au). After annealing at 250 °C for 200 h, only the Ti/Au design meets all four requirements. The thickness of Ti and Au, respectively, is only 100 nm. Beyond meeting these four requirements, the Ti/Au layers exhibit excellent step coverage on the rough Bi2Te3 surface even after the annealing process. © The Author(s) 2011.

The structure and energetics of, and the plasticity caused by, Eshelby dislocations

International Journal of Plasticity

Weinberger, Christopher R.

The structure of coaxial, or Eshelby, dislocations is computed using isotropic elasticity for arrays of up to 500 dislocations. The energies of these arrays are determined in order to predict the lowest energy configuration, and multiple meta-stable configurations are often found. The energy from these elasticity predictions shows good agreement with molecular statics simulations of aluminum. From these simulations, the torque-twist curves are predicted and compared with molecular dynamics simulations. © 2011 Elsevier Ltd. All rights reserved.

Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

Reliability Engineering and System Safety

Helton, Jon C.; Johnson, Jay D.

In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, Quantification of Margins and Uncertainties: Conceptual and Computational Basis, describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses. © 2011 Elsevier Ltd. All rights reserved.

Quantification of margins and uncertainties: Conceptual and computational basis

Reliability Engineering and System Safety

Helton, Jon C.

In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. The following topics are considered: (i) the role of aleatory and epistemic uncertainty in QMU, (ii) the representation of uncertainty with probability, (iii) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, and (iv) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty. © 2011 Elsevier Ltd. All rights reserved.

Quantification of margins and uncertainties: Example analyses from reactor safety and radioactive waste disposal involving the separation of aleatory and epistemic uncertainty

Reliability Engineering and System Safety

Helton, Jon C.; Johnson, Jay D.; Sallaberry, Cédric J.

In 2001, the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE) in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories) initiated development of a process designated quantification of margins and uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, Quantification of Margins and Uncertainties: Conceptual and Computational Basis, describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples. The basic ideas and challenges that underlie NNSA's mandate for QMU are present, and have been successfully addressed, in a number of past analyses for complex systems. To provide perspective on the implementation of a requirement for QMU in the analysis of a complex system, three past analyses are presented as examples: (i) the probabilistic risk assessment carried out for the Surry Nuclear Power Station as part of the U.S. Nuclear Regulatory Commission's (NRC's) reassessment of the risk from commercial nuclear power in the United States (i.e., the NUREG-1150 study), (ii) the performance assessment for the Waste Isolation Pilot Plant carried out by the DOE in support of a successful compliance certification application to the U.S. Environmental Protection Agency, and (iii) the performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada, carried out by the DOE in support of a license application to the NRC. Each of the preceding analyses involved a detailed treatment of uncertainty and produced results used to establish compliance with specific numerical requirements on the performance of the system under study. As a result, these studies illustrate the determination of both margins and the uncertainty in margins in real analyses. © 2011 Elsevier Ltd. All rights reserved.

Methanol production from CO2 using solar-thermal energy: Process development and techno-economic analysis

Energy and Environmental Science

Kim, Jiyong; Henao, Carlos A.; Johnson, Terry A.; Dedrick, Daniel E.; Miller, James E.; Stechel-Speicher, Ellen B.; Maravelias, Christos T.

We describe a novel solar-based process for the production of methanol from carbon dioxide and water. The system utilizes concentrated solar energy in a thermochemical reactor to re-energize CO2 into CO, followed by the water gas shift (WGS) reaction to produce syngas (a mixture of CO and H2) to feed a methanol synthesis reactor. Aside from the thermochemical reactor, which is currently under development, the full system is based on well-established industrial processes and component designs. This work presents an initial assessment of the energy efficiency and economic feasibility of this baseline configuration for an industrial-scale methanol plant. Using detailed sensitivity calculations, we determined that the break-even price of the methanol produced using this approach would be 1.22 USD/kg, which, while higher than current market prices, is comparable to other renewable-resource-based alternatives. We also determined that if solar power is the sole primary energy source, an overall process energy efficiency (solar-to-fuel) of 7.1% could be achieved, assuming the solar collector/solar thermochemical reactor sub-system operates at 20% sunlight-to-chemical energy efficiency. This 7.1% system efficiency is significantly higher than can currently be achieved with photosynthesis-based processes, and illustrates the potential for solar thermochemical strategies to overcome the resource limitations that arise for low-efficiency approaches. Importantly, the analysis identifies the primary economic drivers as the high capital investment associated with the solar concentrator/reactor sub-system and the high utility consumption for CO/CO2 separation. The solar concentrator/reactor sub-system accounts for more than 90% of the capital expenditure. A life cycle assessment verifies the opportunity for significant improvements over the conventional process for manufacturing methanol from natural gas in global warming potential, acidification potential, and non-renewable primary energy requirement, provided the balance-of-plant utilities for the solar thermal process are also from renewable (solar) resources. The analysis indicates that a solar-thermochemical pathway to fuels has significant potential, and points toward future research opportunities to increase efficiency, reduce balance-of-plant utilities, and reduce cost from this baseline. In particular, there is much room for improvement in the development of a less expensive solar concentrator/reactor sub-system, an opportunity that will benefit from the increasing deployment of concentrated solar power (electricity). In addition, significant advances are achievable through improved separations, combined CO2 and H2O splitting, different end products, and greater process integration and distribution. The baseline investigation here establishes a methodology for identifying opportunities, comparison, and assessment of impact on efficiency, lifecycle impact, and economics for advanced system designs. © 2011 The Royal Society of Chemistry.

Capacitive frequency tuning of AlN micromechanical resonators

2011 16th International Solid-State Sensors, Actuators and Microsystems Conference, TRANSDUCERS'11

Kim, Bongsang K.; Olsson, Roy H.; Wojciechowski, Kenneth W.

Frequency tuning of aluminum nitride (AlN) micromechanical resonators has been demonstrated by reactance manipulation via termination with variable capacitors. Shunting one electrode with a variable capacitor in a 13 MHz fourth-overtone length-extensional mode resonator stiffened the resonator to yield a ∼600 ppm frequency shift. Tunability could be further increased by dedicating two electrodes to tuning, doubling the frequency tuning range to ∼1500 ppm. A tunable-bandwidth balun filter has been constructed by parallel coupling of independently tunable resonators, demonstrating an almost three-fold increase in bandwidth from 12 kHz to 33 kHz. A voltage-controlled frequency-tuning printed circuit board (PCB) was also implemented. © 2011 IEEE.
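
The tuning mechanism can be related to the standard Butterworth-Van Dyke picture of a piezoelectric resonator: terminating the device with a load capacitance C_L pulls the loaded resonance away from the series resonance roughly as shown below. This is an illustrative textbook relation for a series load capacitance, not the paper's measured shunt-electrode model.

    % Approximate frequency pulling of a piezoelectric resonator terminated
    % with a load capacitance C_L (Butterworth-Van Dyke parameters: motional
    % capacitance C_m, static capacitance C_0, series resonance f_s).
    \[
      f_L \approx f_s \left( 1 + \frac{C_m}{2\,(C_0 + C_L)} \right),
      \qquad
      \frac{\Delta f}{f_s} \approx \frac{C_m}{2\,(C_0 + C_L)} .
    \]

Varying C_L therefore shifts the effective resonance, which is the reactance-manipulation effect exploited above; a smaller total capacitance C_0 + C_L gives a larger available tuning range.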

Packaging strategies for printed circuit board components. Volume I, materials & thermal stresses

Spangler, Scott W.; Austin, Kevin N.; Neidigk, Matthew N.; Neilsen, Michael K.; Chambers, Robert S.

Decisions on material selection for electronics packaging can be complicated by the need to balance competing criteria: the package must withstand severe impacts yet survive deep thermal cycles intact. Many times, material choices are based on historical precedent, perhaps ignorant of whether those initial choices were carefully investigated or whether the requirements on the new component match those of previous units. The goal of this program focuses on developing both increased intuition for generic packaging guidelines and computational methodologies for optimizing packaging in specific components. Initial efforts centered on characterization of classes of materials common to packaging strategies and computational analyses of stresses generated during thermal cycling to identify strengths and weaknesses of various material choices. Future studies will analyze the same example problems, incorporating the effects of curing stresses as needed and analyzing dynamic loadings to compare trends with the quasi-static conclusions.

Potential hazards of compressed air energy storage in depleted natural gas reservoirs

Bauer, Stephen J.; Grubelich, Mark C.

This report is a preliminary assessment of the ignition and explosion potential in a depleted hydrocarbon reservoir from air cycling associated with compressed air energy storage (CAES) in geologic media. The study identifies issues associated with this phenomenon as well as possible mitigating measures that should be considered. Compressed air energy storage (CAES) in geologic media has been proposed to help supplement renewable energy sources (e.g., wind and solar) by providing a means to store energy when excess energy is available, and to provide an energy source during non-productive or low productivity renewable energy time periods. Presently, salt caverns represent the only proven underground storage used for CAES. Depleted natural gas reservoirs represent another potential underground storage vessel for CAES because they have demonstrated their container function and may have the requisite porosity and permeability; however, reservoirs have yet to be demonstrated as a functional/operational storage media for compressed air. Specifically, air introduced into a depleted natural gas reservoir presents a situation where an ignition and explosion potential may exist. This report presents the results of an initial study identifying issues associated with this phenomenon as well as possible mitigating measures that should be considered.

An overview of component qualification using Bayesian statistics and energy methods

Dohner, Jeffrey L.

This overview is designed to give the reader a limited understanding of Bayesian and Maximum Likelihood (MLE) estimation; a basic understanding of some of the mathematical tools used to evaluate the quality of an estimate; an introduction to energy methods; and a limited discussion of damage potential. The discussion then goes on to present a limited treatment of how energy methods and Bayesian estimation are used together to qualify components. Example problems with solutions have been supplied as a learning aid. Bold letters are used to represent random variables; un-bolded letters represent deterministic values. A concluding section presents a discussion of attributes and concerns.
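
For readers new to the two estimators the overview compares, the defining relations are the standard ones below, included here for reference only:

    % Maximum likelihood picks the parameter value that maximizes the
    % likelihood of the observed data D; Bayesian estimation updates a
    % prior p(theta) into a posterior via Bayes' rule.
    \[
      \hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta}\, p(D \mid \theta),
      \qquad
      p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}
                              {\int p(D \mid \theta')\, p(\theta')\, d\theta'} .
    \]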

Understanding the function and performance of carbon-enhanced lead-acid batteries: milestone report for the DOE Energy Storage Systems Program (FY11 Quarter 3: April through June 2011)

Enos, David E.; Ferreira, Summer R.

This report describes the status of research being performed under CRADA No. SC10/01771.00 (Lead/Carbon Functionality in VRLA Batteries) between Sandia National Laboratories and East Penn Manufacturing, conducted for the U.S. Department of Energy's Energy Storage Systems Program. The Quarter 3 Milestone was completed on time. The milestone entails an ex situ analysis of a control plate as well as three carbon-containing negative plates, both in the raw, as-cast form and after formation. The morphology, porosity, and porosity distribution within each plate were evaluated. In addition, baseline electrochemical measurements were performed on each battery to establish its initial performance. These measurements included capacity, internal resistance, and float current. The results obtained from the electrochemical testing were in agreement with previous evaluations performed at East Penn Manufacturing. Cycling on a subset of the received East Penn cells containing different carbons (and a control) has been initiated.

Finite element analysis of multilayer coextrusion

Rao, Rekha R.; Mondy, L.A.; Schunk, Randy; Hopkins, Matthew M.

Multilayer coextrusion has become a popular commercial process for producing complex polymeric products from soda bottles to reflective coatings. A numerical model of a multilayer coextrusion process is developed based on a finite element discretization and two different free-surface methods, an arbitrary-Lagrangian-Eulerian (ALE) moving mesh implementation and an Eulerian level set method, to understand the moving boundary problem associated with the polymer-polymer interface. The goal of this work is to have a numerical capability suitable for optimizing and troubleshooting the coextrusion process, circumventing flow instabilities such as ribbing and barring, and reducing variability in layer thickness. Though these instabilities can be both viscous and elastic in nature, for this work a generalized Newtonian description of the fluid is used. Models of varying degrees of complexity are investigated including stability analysis and direct three-dimensional finite element free surface approaches. The results of this work show how critical modeling can be to reduce build test cycles, improve material choices, and guide mold design.

Thin magnetic conductor substrate for placement-immune, electrically-small antennas

Eubanks, Travis W.; Loui, Hung L.; McDonald, Jacob J.

An antenna is considered to be placement-immune when the antenna operates effectively regardless of where it is placed. By building antennas on magnetic conductor materials, the radiated fields will be positively reinforced in the desired radiation direction instead of being negatively affected by the environment. Although this idea has been discussed thoroughly in theoretical research, the difficulty in building thin magnetic conductor materials necessary for in-phase field reflections prevents this technology from becoming more widespread. This project's purpose is to build and measure an electrically-small antenna on a new type of non-metallic, thin magnetic conductor. This problem has not been previously addressed because non-metallic, thin magnetic conductor materials have not yet been discovered. This work proposed the creation of an artificial magnetic conductor (AMC) with in-phase field reflections without using internal electric conductors, the placement of an electrically-small antenna on this magnetic conductor, and the development of a transmit-receive system that utilizes the substrate and electrically-small antenna. By not using internal electric conductors to create the AMC, the substrate thickness can be minimized. The electrically-small antenna will demonstrate the substrate's ability to make an antenna placement immune, and the transmit-receive system combines both the antenna and the substrate while adding a third layer of system complexity to demonstrate the complete idea.

More Details

A life cycle cost analysis framework for geologic storage of hydrogen : a user's tool

Lord, Anna S.; Kobos, Peter H.; Klise, Geoffrey T.; Borns, David J.

The U.S. Department of Energy (DOE) has an interest in large-scale hydrogen geostorage, which could offer substantial buffer capacity to meet possible disruptions in supply or changing seasonal demands. The geostorage site options being considered are salt caverns, depleted oil/gas reservoirs, aquifers, and hard rock caverns. The DOE has an interest in assessing the geological, geomechanical, and economic viability of these types of geologic hydrogen storage options. This study has developed an economic analysis methodology and an accompanying spreadsheet tool to address the costs entailed in developing and operating an underground geologic storage facility. This year the tool was updated specifically to (1) incorporate more site-specific model input assumptions for the wells and storage site modules, (2) develop a version that matches the general format of the HDSAM model developed and maintained by Argonne National Laboratory (ANL), and (3) incorporate specific demand scenarios illustrating the model's capability. Four general types of underground storage were analyzed: salt caverns, depleted oil/gas reservoirs, aquifers, and hard rock caverns/other custom sites. Because substantial lessons have already been learned from the geologic storage of natural gas, these formations represent a potentially sizable storage option. Understanding and including these various geologic storage types in the physical and economic framework of the analysis will help identify which geologic option is best suited for the storage of hydrogen. It is important to note, however, that existing natural gas practice may not translate directly to a hydrogen system, where substantial engineering obstacles may be encountered. There are only three locations worldwide that currently store hydrogen underground, and all are in salt caverns. Two are in the U.S. (Texas) and are managed by ConocoPhillips and Praxair (Leighty, 2007). The third is in Teesside, U.K., managed by Sabic Petrochemicals (Crotogino et al., 2008; Panfilov et al., 2006). These existing H2 facilities are quite small by natural gas storage standards. The second stage of the analysis involved providing ANL with estimated geostorage costs of hydrogen within salt caverns for various market penetrations for four representative cities (Houston, Detroit, Pittsburgh, and Los Angeles). Using these demand levels, the scale and cost of hydrogen storage necessary to meet 10%, 25%, and 100% of summer vehicle demand were calculated.
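
For readers unfamiliar with this kind of cost framework, the sketch below shows a generic levelized-cost calculation of the sort such a spreadsheet tool performs: annualize capital with a capital recovery factor, add annual operating cost, and divide by annual hydrogen throughput. Every number and parameter here is hypothetical and is not drawn from the Sandia/ANL analysis described above.

    # Illustrative only: generic levelized cost of geologic hydrogen storage.
    # Capital, O&M, throughput, discount rate, and lifetime are assumptions.

    def capital_recovery_factor(discount_rate, years):
        """Annualize an up-front capital cost over the facility lifetime."""
        r = discount_rate
        return r * (1 + r) ** years / ((1 + r) ** years - 1)

    def levelized_storage_cost(capital_usd, annual_om_usd, annual_kg_h2,
                               discount_rate=0.08, years=30):
        """Levelized cost in dollars per kg of hydrogen cycled per year."""
        annualized = capital_usd * capital_recovery_factor(discount_rate, years)
        return (annualized + annual_om_usd) / annual_kg_h2

    # Hypothetical salt-cavern case: $30M capital, $1.5M/yr O&M, 5,000 t/yr
    # of hydrogen cycled; prints roughly $0.83 per kg with these assumptions.
    print(f"${levelized_storage_cost(30e6, 1.5e6, 5.0e6):.2f} per kg")

Comparing such a figure across salt caverns, depleted reservoirs, aquifers, and hard rock caverns, each with its own capital and operating assumptions, is the kind of screening the framework is intended to support.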

More Details

Repository performance confirmation

Hansen, Francis D.

Repository performance confirmation links the technical bases of repository science and societal acceptance. This paper explores the many aspects of what has been labeled performance confirmation in U.S. programs, in which monitoring is a collection of distinct activities combining technical and social significance in radioactive waste management. The paper is divided into four parts: (1) a distinction is drawn between performance confirmation monitoring and other testing and monitoring objectives; (2) a case study illustrates confirmation activities integrated within a long-term testing and monitoring strategy for Yucca Mountain; (3) a case study reviews the compliance monitoring developed and implemented for the Waste Isolation Pilot Plant; and (4) an approach for developing, evaluating, and implementing the next generation of performance confirmation monitoring is presented. International interest in repository monitoring is exhibited by the European Commission Seventh Framework Programme 'Monitoring Developments for Safe Repository Operation and Staged Closure' (MoDeRn) Project. The MoDeRn partners are considering the role of monitoring in a phased approach to the geological disposal of radioactive waste. As repository plans advance in different countries, the need to consider monitoring strategies within a controlled framework has become more apparent. The MoDeRn project brings together technical and societal experts to assemble a common understanding of a process that could be followed to develop a monitoring program. A fundamental consideration is differentiating confirmation monitoring from the many other testing and monitoring activities. Recently, the license application for Yucca Mountain provided a case study that includes a technical process for meeting regulatory requirements to confirm repository performance as well as considerations related to the preservation of retrievability. The performance confirmation plan developed as part of the Yucca Mountain license application identified a broad suite of monitoring activities; a revision of the plan was expected to winnow the activities down to a manageable number. As a result, an objective process for the next stage of performance confirmation planning was developed as an integral part of an overarching long-term testing and monitoring strategy. The Waste Isolation Pilot Plant compliance monitoring program both reflects its importance to stakeholders and demonstrates adequate understanding of the relevant monitoring parameters. The compliance criteria were stated by regulation and are currently monitored as part of the regulatory rule for disposal. At the outset, the screening practice and parameter selection were not predicated on a direct or indirect correlation to system performance metrics, as was the case for Yucca Mountain; the correlation to performance was established later, and the Waste Isolation Pilot Plant continues to monitor the ten parameters originally identified in the compliance certification documentation. The monitoring program has proven effective both for its technical intent and for societal and public assurance. The experience with performance confirmation in the license application process for Yucca Mountain helped identify an objective, quantitative methodology for this purpose. Revision of the existing plan would be based on findings of the total system performance assessment, and identification and prioritization of confirmation activities would then derive from performance metrics associated with performance assessment. Given this understanding of repository performance confirmation, it is evident that the performance confirmation program for the Yucca Mountain project could be readily re-engaged if licensing activities resumed.

More Details

Develop feedback system for intelligent dynamic resource allocation to improve application performance

Brandt, James M.; Gentile, Ann C.; Thompson, David C.

This report documents the completion of the Sandia Level II milestone 'Develop feedback system for intelligent dynamic resource allocation to improve application performance'. The milestone demonstrates the use of a scalable data collection, analysis, and feedback system that provides lightweight insight into how an application is utilizing the hardware resources of a high-performance computing (HPC) platform. Further, we demonstrate that the same mechanisms used to transport data for remote analysis and visualization can provide low-latency run-time feedback to applications. The ultimate goal of this body of work is performance optimization in the face of the ever-increasing size and complexity of HPC systems.
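
As a deliberately minimal illustration of the pattern described above, the Python sketch below polls one Linux resource counter in a lightweight loop and invokes a callback when a threshold is crossed, standing in for run-time feedback to an application. The metric, threshold, and callback are hypothetical; the milestone system itself is a scalable, distributed collection, transport, and analysis infrastructure, not this single-node toy.

    import time

    # Toy sketch: sample an OS resource counter and feed a signal back to the
    # application so it can adapt. Assumes Linux with /proc/meminfo available.

    def read_meminfo_kb(field="MemAvailable"):
        """Read one field (in kB) from /proc/meminfo."""
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith(field + ":"):
                    return int(line.split()[1])
        raise KeyError(field)

    def monitor(callback, low_mem_kb=1_000_000, period_s=1.0, samples=5):
        """Poll available memory; invoke the feedback callback when it is low."""
        for _ in range(samples):
            avail = read_meminfo_kb()
            if avail < low_mem_kb:
                callback(avail)          # run-time feedback to the application
            time.sleep(period_s)

    if __name__ == "__main__":
        monitor(lambda kb: print(f"feedback: {kb} kB available, throttle I/O"))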

More Details

Measurements of Magneto-Rayleigh-Taylor instability growth in initially solid liners on the Z facility

Sinars, Daniel S.; Edens, Aaron E.; Lopez, Mike R.; Smith, Ian C.; Slutz, Stephen A.; Shores, Jonathon S.; Bennett, Guy R.; Atherton, B.W.; Savage, Mark E.; Stygar, William A.; Leifeste, Gordon T.; Herrmann, Mark H.; Cuneo, M.E.; Peterson, Kyle J.; McBride, Ryan D.; Jennings, Christopher A.; Vesey, Roger A.; Nakhleh, Charles N.

Abstract not provided.

Results 65201–65400 of 96,771