Publications

Results 8351–8375 of 9,998

Search results


Peridynamic theory of solid mechanics

Proposed for publication in Advances in Applied Mechanics.

Silling, Stewart A.; Lehoucq, Richard B.

The peridynamic theory of mechanics attempts to unite the mathematical modeling of continuous media, cracks, and particles within a single framework. It does this by replacing the partial differential equations of the classical theory of solid mechanics with integral or integro-differential equations. These equations are based on a model of internal forces within a body in which material points interact with each other directly over finite distances. The classical theory of solid mechanics is based on the assumption of a continuous distribution of mass within a body. It further assumes that all internal forces are contact forces that act across zero distance. The mathematical description of a solid that follows from these assumptions relies on partial differential equations that additionally assume sufficient smoothness of the deformation for the PDEs to make sense in either their strong or weak forms. The classical theory has been demonstrated to provide a good approximation to the response of real materials down to small length scales, particularly in single crystals, provided these assumptions are met. Nevertheless, technology increasingly involves the design and fabrication of devices at smaller and smaller length scales, even interatomic dimensions. Therefore, it is worthwhile to investigate whether the classical theory can be extended to permit relaxed assumptions of continuity, to include the modeling of discrete particles such as atoms, and to allow the explicit modeling of nonlocal forces that are known to strongly influence the behavior of real materials.
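The replacement of spatial derivatives by an integral operator can be made concrete with the bond-based peridynamic equation of motion, quoted here in its standard form from the peridynamics literature (not from the abstract itself):

```latex
% Bond-based peridynamic balance of linear momentum: the acceleration of the
% material point at x is driven by pairwise forces f integrated over a finite
% neighborhood (the horizon H_x), plus a body force density b.
\rho(\mathbf{x})\,\ddot{\mathbf{u}}(\mathbf{x},t)
  = \int_{\mathcal{H}_{\mathbf{x}}}
      \mathbf{f}\bigl(\mathbf{u}(\mathbf{x}',t)-\mathbf{u}(\mathbf{x},t),\,
                      \mathbf{x}'-\mathbf{x}\bigr)\, dV_{\mathbf{x}'}
  + \mathbf{b}(\mathbf{x},t)
```

Because the right-hand side involves no spatial derivatives of the displacement field, the equation remains meaningful across cracks and other discontinuities.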


Elastic wave propagation in variable media using a discontinuous Galerkin method

Society of Exploration Geophysicists International Exposition and 80th Annual Meeting 2010, SEG 2010

Smith, Thomas M.; Collis, Samuel S.; Ober, Curtis C.; Overfelt, James R.; Schwaiger, Hans F.

Motivated by the needs of seismic inversion and building on our prior experience with fluid-dynamics systems, we present a high-order discontinuous Galerkin (DG) Runge-Kutta method applied to isotropic, linearized elastodynamics. Unlike other DG methods recently presented in the literature, our method allows for inhomogeneous material variations within each element, which enables representation of realistic earth models — a feature critical for future use in seismic inversion. Likewise, our method supports curved elements and hybrid meshes that include both simplicial and nonsimplicial elements. We demonstrate the capabilities of this method through a series of numerical experiments including hybrid mesh discretizations of the Marmousi2 model as well as a modified Marmousi2 model with an oscillatory ocean bottom that is exactly captured by our discretization.


A cognitive-consistency based model of population wide attitude change

AAAI Fall Symposium - Technical Report

Lakkaraju, Kiran L.; Speed, Ann S.

Attitudes play a significant role in determining how individuals process information and behave. In this paper we develop a new computational model of population-wide attitude change that captures both the social level (how individuals interact and communicate information) and the cognitive level (how attitudes and concepts interact with each other). The model captures the cognitive aspect by representing each individual as a parallel constraint satisfaction network. The dynamics of this model are explored through a simple attitude change experiment in which we vary the social network and the distribution of attitudes in a population. Copyright © 2010, Association for the Advancement of Artificial Intelligence. All rights reserved.
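A minimal sketch of the parallel-constraint-satisfaction idea for a single individual. The three-node network, the weights, and the particular activation-update rule below are illustrative assumptions drawn from the constraint-satisfaction literature, not taken from the paper:

```python
# Each node is a concept or attitude with activation in [-1, 1]; positive
# links reward agreement between nodes, negative links reward disagreement.
# All nodes update in parallel until the network settles into a mutually
# consistent state.
def step(acts, links, decay=0.05):
    new = {}
    for node, a in acts.items():
        # Net input from neighbors, weighted by link sign and strength.
        net = sum(w * acts[other] for other, w in links.get(node, []))
        if net > 0:
            a_new = a * (1 - decay) + net * (1 - a)   # push toward +1
        else:
            a_new = a * (1 - decay) + net * (a + 1)   # push toward -1
        new[node] = max(-1.0, min(1.0, a_new))
    return new

# Hypothetical network: "A" is an anchored belief; "B" is consistent with A,
# while "C" conflicts with A.
links = {
    "A": [("B", 0.4), ("C", -0.4)],
    "B": [("A", 0.4)],
    "C": [("A", -0.4)],
}
acts = {"A": 0.5, "B": 0.0, "C": 0.0}
for _ in range(50):
    acts = step(acts, links)
# The network settles with B strongly positive and C strongly negative.
```

Running the loop drives the activations to a consistent pattern: the concept linked positively to the anchored belief becomes strongly endorsed, and the conflicting one strongly rejected.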


Simulation of dynamic fracture using peridynamics, finite element modeling, and contact

ASME International Mechanical Engineering Congress and Exposition, Proceedings (IMECE)

Littlewood, David J.

Peridynamics is a nonlocal extension of classical solid mechanics that allows for the modeling of bodies in which discontinuities occur spontaneously. Because the peridynamic expression for the balance of linear momentum does not contain spatial derivatives and is instead based on an integral equation, it is well suited for modeling phenomena involving spatial discontinuities such as crack formation and fracture. In this study, both peridynamics and classical finite element analysis are applied to simulate material response under dynamic blast loading conditions. A combined approach is utilized in which the portion of the simulation modeled with peridynamics interacts with the finite element portion of the model via a contact algorithm. The peridynamic portion of the analysis utilizes an elastic-plastic constitutive model with linear hardening. The peridynamic interface to the constitutive model is based on the calculation of an approximate deformation gradient, requiring the suppression of possible zero-energy modes. The classical finite element portion of the model utilizes a Johnson-Cook constitutive model. Simulation results are validated by direct comparison to expanding tube experiments. The coupled modeling approach successfully captures material response at the surface of the tube and the emerging fracture pattern. Copyright © 2010 by ASME.


Formulation and optimization of robust sensor placement problems for drinking water contamination warning systems

Journal of Infrastructure Systems

Watson, Jean P.; Murray, Regan; Hart, William E.

The sensor placement problem in contamination warning system design for municipal water distribution networks involves maximizing the protection afforded by a limited number of sensors, typically quantified as the expected impact of a contamination event; the issue of how to mitigate high-consequence events is either handled implicitly or ignored entirely. Consequently, expected-case sensor placements run the risk of failing to protect against high-consequence 9/11-style attacks. In contrast, robust sensor placements address this concern by focusing strictly on high-consequence events and placing sensors to minimize their impact. We introduce several robust variations of the sensor placement problem, distinguished by how they quantify the potential damage due to high-consequence events. We explore the nature of robust versus expected-case sensor placements on three real-world, large-scale distribution networks. We find that robust sensor placements can yield large reductions in the number and magnitude of high-consequence events, with only modest increases in expected impact. The ability to trade off between robust and expected-case impacts is a key unexplored dimension in contamination warning system design. © 2009 ASCE.
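The expected-case versus robust distinction can be illustrated with a toy sketch. This is not the paper's formulation; the impact numbers and the brute-force enumeration are invented for this example:

```python
# Toy sensor placement: impact[e][s] is the damage from contamination event e
# when candidate sensor s is the first to detect it (all numbers invented).
# An expected-case placement minimizes the mean impact over events; a robust
# placement minimizes the worst-case (maximum) impact.
from itertools import combinations
from statistics import mean

impact = [
    [10, 35, 50],   # event 1
    [10, 35, 50],   # event 2
    [10, 35, 50],   # event 3
    [100, 40, 50],  # event 4: high-consequence, poorly covered by sensor 0
]

def placement_impacts(selected):
    # Each event is charged the impact of the first-detecting
    # (lowest-impact) sensor among those selected.
    return [min(row[s] for s in selected) for row in impact]

def best_placement(p, aggregate):
    # Brute-force search over all p-sensor subsets for the one that
    # minimizes the chosen aggregate statistic.
    subsets = combinations(range(len(impact[0])), p)
    return min(subsets, key=lambda sel: aggregate(placement_impacts(sel)))

expected_case = best_placement(1, mean)  # sensor 0: mean 32.5, but max 100
robust = best_placement(1, max)          # sensor 1: max 40, mean 36.25
```

The two objectives pick different sensors: the expected-case optimum leaves the high-consequence event at impact 100, while the robust optimum caps it at 40 for a modest increase in mean impact, which is the trade-off the abstract describes.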


Probabilistic methods in model validation

Conference Proceedings of the Society for Experimental Mechanics Series

Paez, Thomas L.; Swiler, Laura P.

Extensive experimentation over the past decade has shown that fabricated physical systems that are intended to be identical, and are nominally identical, in fact differ from one another, sometimes substantially. This fact makes it difficult to validate a mathematical model for any system and requires that physical system behavior be characterized using the tools of uncertainty quantification. Further, because of the existence of system, component, and material uncertainty, the mathematical models of these elements sometimes seek to reflect that uncertainty. This presentation introduces some of the methods of probability and statistics and shows how they can be applied in engineering modeling and data analysis. The ideas of randomness and some basic means of measuring and modeling it are presented. The ideas of random experiment, random variable, mean, variance and standard deviation, and probability distribution are introduced in the framework of a practical, yet simple, example; measured data are included. This presentation is the third in a sequence of tutorial discussions on mathematical model validation. The example introduced here is also used in later presentations. © 2009 Society for Experimental Mechanics Inc.
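A minimal sketch of the basic statistics the tutorial introduces, applied to a made-up set of repeated measurements on nominally identical units; the numbers are illustrative, not data from the presentation:

```python
# Sample mean, variance, and standard deviation of repeated measurements
# of nominally identical units (invented values for illustration).
from statistics import mean, variance, stdev

measurements = [9.8, 10.4, 10.1, 9.5, 10.9, 10.2, 9.9, 10.6]

m = mean(measurements)      # sample mean
v = variance(measurements)  # unbiased sample variance (n - 1 divisor)
s = stdev(measurements)     # sample standard deviation, sqrt(v)

print(f"mean = {m:.3f}, variance = {v:.3f}, std dev = {s:.3f}")
```

Even this tiny sample shows the unit-to-unit scatter the abstract describes: no two "identical" units produce the same value, and the spread is what a probabilistic validation framework must model.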


Density functional theory (DFT) simulations of shocked liquid xenon

AIP Conference Proceedings

Mattsson, Thomas M.; Magyar, Rudolph J.

Xenon is not only a technologically important element used in laser technologies and jet propulsion, but also one of the most accessible materials in which to study the metal-insulator transition with increasing pressure. Because of its closed-shell electronic configuration, xenon is often assumed to be chemically inert, interacting almost entirely through the van der Waals interaction, and at liquid density it is typically modeled well using Lennard-Jones potentials. However, such modeling has a limited range of validity, as xenon is known to form compounds under normal conditions and likely exhibits considerably more chemistry at higher densities, when hybridization of occupied orbitals becomes significant. We present DFT-MD simulations of shocked liquid xenon with the goal of developing an improved equation of state. The calculated Hugoniot to 2 MPa compares well with available experimental shock data. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. © 2009 American Institute of Physics.
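For reference, the Hugoniot computed here is the locus of end states reachable by a single shock; it is defined by the Rankine-Hugoniot energy jump condition (standard shock physics, not specific to this paper), relating the shocked state (E, P, V) to the initial state (E_0, P_0, V_0):

```latex
% Rankine-Hugoniot energy jump condition across a steady shock front.
E - E_0 = \tfrac{1}{2}\,\left(P + P_0\right)\left(V_0 - V\right)
```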


Label-invariant mesh quality metrics

Proceedings of the 18th International Meshing Roundtable, IMR 2009

Knupp, Patrick K.

Mappings from a master element to the physical mesh element, in conjunction with local metrics such as those appearing in the Target-matrix paradigm, are used to measure quality at points within an element. The approach is applied to both linear and quadratic triangular elements; this enables, for example, one to measure quality within a quadratic finite element. Quality within an element may also be measured on a set of symmetry points, leading to so-called symmetry metrics. An important issue having to do with the labeling of the element vertices is relevant to mesh quality tools such as Verdict and Mesquite. Certain quality measures like area, volume, and shape should be label-invariant, while others such as aspect ratio and orientation should not. It is shown that local metrics whose Jacobian matrix is non-constant are label-invariant only at the center of the element, while symmetry metrics can be label-invariant anywhere within the element, provided the reference element is properly restricted.
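A minimal numeric sketch of one such label-invariant quantity: the mean-ratio shape metric for a linear triangle, a standard metric from the mesh-quality literature. The reference element and the test triangles below are chosen purely for illustration:

```python
# Mean-ratio shape metric for a linear triangle: mu = 2 det(A) / ||A||_F^2,
# where A maps an equilateral reference triangle onto the physical element.
# mu = 1 for an equilateral triangle and tends to 0 as the element
# degenerates; it depends only on shape, so relabeling the vertices of the
# same triangle leaves it unchanged.
import math

def shape_metric(p0, p1, p2):
    # Physical edge matrix E = [p1 - p0 | p2 - p0].
    e = [[p1[0] - p0[0], p2[0] - p0[0]],
         [p1[1] - p0[1], p2[1] - p0[1]]]
    # Equilateral reference edge matrix W and its inverse.
    w = [[1.0, 0.5], [0.0, math.sqrt(3.0) / 2.0]]
    det_w = w[0][0] * w[1][1] - w[0][1] * w[1][0]
    w_inv = [[w[1][1] / det_w, -w[0][1] / det_w],
             [-w[1][0] / det_w, w[0][0] / det_w]]
    # A = E * W^{-1}.
    a = [[e[0][0] * w_inv[0][0] + e[0][1] * w_inv[1][0],
          e[0][0] * w_inv[0][1] + e[0][1] * w_inv[1][1]],
         [e[1][0] * w_inv[0][0] + e[1][1] * w_inv[1][0],
          e[1][0] * w_inv[0][1] + e[1][1] * w_inv[1][1]]]
    det_a = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    frob2 = sum(a[i][j] ** 2 for i in range(2) for j in range(2))
    return 2.0 * det_a / frob2

equilateral = shape_metric((0, 0), (1, 0), (0.5, math.sqrt(3.0) / 2.0))  # 1.0
right_tri = shape_metric((0, 0), (1, 0), (0, 1))  # sqrt(3)/2, about 0.866
```

Evaluating the same equilateral triangle with its vertices listed in a different order returns the same value, which is the label-invariance property the paper analyzes; orientation-sensitive metrics deliberately lack it.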


Analysis of micromixers and biocidal coatings on water-treatment membranes to minimize biofouling

Altman, Susan J.; Clem, Paul G.; Cook, Adam W.; Hart, William E.; Hibbs, Michael R.; Ho, Clifford K.; Jones, Howland D.; Sun, Amy C.; Webb, Stephen W.

Biofouling, the unwanted growth of biofilms on a surface, negatively impacts desalination and water treatment when it occurs on water-treatment membranes. With biofouling there is a decrease in permeate production, degradation of permeate water quality, and an increase in energy expenditure due to the increased cross-flow pressure required. To date, no universally successful and cost-effective method for controlling biofouling has been implemented. The overall goal of the work described in this report was to use high-performance computing to direct polymer, material, and biological research to create the next generation of water-treatment membranes. Both physical (micromixers: UV-curable epoxy traces printed on the surface of a water-treatment membrane that promote chaotic mixing) and chemical (quaternary ammonium groups) modifications of the membranes were evaluated for their ability to increase resistance to biofouling. The creation of low-cost, efficient water-treatment membranes helps assure the availability of fresh water for human use, a growing need in both the U.S. and the world.


Summary of the CSRI Workshop on Combinatorial Algebraic Topology (CAT): Software, Applications, & Algorithms

Mitchell, Scott A.; Bennett, Janine C.; Day, David M.

This report summarizes the Combinatorial Algebraic Topology: software, applications & algorithms workshop (CAT Workshop). The workshop was sponsored by the Computer Science Research Institute of Sandia National Laboratories. It was organized by CSRI staff members Scott Mitchell and Shawn Martin. It was held in Santa Fe, New Mexico, August 29-30. The CAT Workshop website has links to some of the talk slides and other information, http://www.cs.sandia.gov/CSRI/Workshops/2009/CAT/index.html. The purpose of the report is to summarize the discussions and recap the sessions. There is a special emphasis on technical areas that are ripe for further exploration, and the plans for follow-up amongst the workshop participants. The intended audiences are the workshop participants, other researchers in the area, and the workshop sponsors.


Host suppression and bioinformatics for sequence-based characterization of unknown pathogens

Misra, Milind; Patel, Kamlesh P.; Kaiser, Julia N.; Meagher, Robert M.; Branda, Steven B.; Schoeniger, Joseph S.

Bioweapons and emerging infectious diseases pose formidable and growing threats to our national security. Rapid advances in biotechnology and the increasing efficiency of global transportation networks virtually guarantee that the United States will face potentially devastating infectious disease outbreaks caused by novel ('unknown') pathogens either intentionally or accidentally introduced into the population. Unfortunately, our nation's biodefense and public health infrastructure is primarily designed to handle previously characterized ('known') pathogens. While modern DNA assays can identify known pathogens quickly, identifying unknown pathogens currently depends upon slow, classical microbiological methods of isolation and culture that can take weeks to produce actionable information. In many scenarios that delay would be costly, in terms of casualties and economic damage; indeed, it can mean the difference between a manageable public health incident and a full-blown epidemic. To close this gap in our nation's biodefense capability, we will develop, validate, and optimize a system to extract nucleic acids from unknown pathogens present in clinical samples drawn from infected patients. This system will extract nucleic acids from a clinical sample and amplify pathogen and specific host-response nucleic acid sequences. These sequences will then be suitable for ultra-high-throughput sequencing (UHTS) carried out by a third party. The data generated from UHTS will then be processed through a new data assimilation and bioinformatic analysis pipeline that will allow us to characterize an unknown pathogen in hours to days instead of weeks to months. Our methods will require no a priori knowledge of the pathogen, and no isolation or culturing; therefore they will circumvent many of the major roadblocks confronting a clinical microbiologist or virologist when presented with an unknown or engineered pathogen.


Xyce Parallel Electronic Simulator: Users' Guide, Version 5.1

Keiter, Eric R.; Mei, Ting M.; Russo, Thomas V.; Pawlowski, Roger P.; Schiek, Richard S.; Santarelli, Keith R.; Coffey, Todd S.; Thornquist, Heidi K.

This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: (1) the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; (2) improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; (3) device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and (4) object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message-passing parallel implementation - which allows it to run efficiently on the widest possible range of computing platforms. These include serial, shared-memory, and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an 'in-house' capability with which both electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed. As a result, Xyce is a unique electrical simulation capability, designed to meet the unique needs of the laboratory.


Xyce™ Parallel Electronic Simulator: Reference Guide, Version 5.1

Keiter, Eric R.; Mei, Ting M.; Russo, Thomas V.; Pawlowski, Roger P.; Schiek, Richard S.; Santarelli, Keith R.; Coffey, Todd S.; Thornquist, Heidi K.

This document is a reference guide to the Xyce Parallel Electronic Simulator and is a companion to the Xyce Users’ Guide. The focus of this document is, to the extent possible, to exhaustively list the device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial; users who are new to circuit simulation are better served by the Xyce Users’ Guide.


Modeling aspects of human memory for scientific study

Bernard, Michael L.; Morrow, James D.; Taylor, Shawn E.; Verzi, Stephen J.; Vineyard, Craig M.

Working with leading experts in the fields of cognitive neuroscience and computational intelligence, SNL has developed a computational architecture that represents the neurocognitive mechanisms by which humans remember past experiences. The architecture represents how knowledge is organized and updated through information from individual experiences (episodes) via the cortical-hippocampal declarative memory system. We compared the simulated behavioral characteristics with those of humans measured under well-established experimental standards, controlling for unmodeled aspects of human processing, such as perception. We used this knowledge to create robust simulations of human memory behaviors that should help move the scientific community closer to understanding how humans remember information. These behaviors were experimentally validated against actual human subjects, and the results were published. An important outcome of the validation process will be the joining of specific experimental testing procedures from the field of neuroscience with computational representations from the field of cognitive modeling and simulation.


Crossing the mesoscale no-man's land via parallel kinetic Monte Carlo

Plimpton, Steven J.; Battaile, Corbett C.; Chandross, M.; Holm, Elizabeth A.; Thompson, Aidan P.; Tikare, Veena T.; Wagner, Gregory J.; Webb, Edmund B.; Zhou, Xiaowang Z.

The kinetic Monte Carlo method and its variants are powerful tools for modeling materials at the mesoscale, meaning at length and time scales in between the atomic and continuum. We have completed a 3 year LDRD project with the goal of developing a parallel kinetic Monte Carlo capability and applying it to materials modeling problems of interest to Sandia. In this report we give an overview of the methods and algorithms developed, and describe our new open-source code called SPPARKS, for Stochastic Parallel PARticle Kinetic Simulator. We also highlight the development of several Monte Carlo models in SPPARKS for specific materials modeling applications, including grain growth, bubble formation, diffusion in nanoporous materials, defect formation in erbium hydrides, and surface growth and evolution.
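The core of any rejection-free kinetic Monte Carlo code of this kind is the n-fold-way (BKL) event loop sketched below. The two-event system and its rates are invented for illustration; this is not SPPARKS code:

```python
# Minimal rejection-free kinetic Monte Carlo: pick one event with probability
# proportional to its rate, then advance time by an exponentially distributed
# increment whose mean is the inverse of the total rate.
import math
import random

def kmc_step(rates, rng):
    """Select an event index proportionally to its rate and return
    (event_index, time_increment)."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            break
    dt = -math.log(rng.random()) / total  # exponential waiting time
    return i, dt

rng = random.Random(42)       # fixed seed for reproducibility
rates = [1.0, 3.0]            # event 1 is three times as likely as event 0
counts = [0, 0]
t = 0.0
for _ in range(10000):
    event, dt = kmc_step(rates, rng)
    counts[event] += 1
    t += dt
# Over many steps, counts[1] / counts[0] approaches rates[1] / rates[0] = 3.
```

Every step fires exactly one event and advances the clock, which is what lets KMC reach mesoscale times that direct atomistic dynamics cannot; the parallelization challenge the report addresses is doing this concurrently across spatial domains without violating the event ordering.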


Evaluation of the impact chip multiprocessors have on SNL application performance

Doerfler, Douglas W.

This report describes trans-organizational efforts to investigate the impact of chip multiprocessors (CMPs) on the performance of important Sandia application codes. The impact of CMPs on the performance and applicability of Sandia's system software was also investigated. The goal of the investigation was to make algorithmic and architectural recommendations for next generation platform acquisitions.


Decision support for integrated water-energy planning

Tidwell, Vincent C.; Kobos, Peter H.; Malczynski, Leonard A.; Hart, William E.; Castillo, Cesar R.

Currently, electrical power generation uses about 140 billion gallons of water per day, accounting for over 39% of all freshwater withdrawals and thus competing with irrigated agriculture as the leading user of water. Coupled to this water use is the required pumping, conveyance, treatment, storage, and distribution of the water, which consumes on average 3% of all electric power generated. While water and energy use are tightly coupled, planning and management of these fundamental resources are rarely treated in an integrated fashion. Toward this need, a decision support framework has been developed that targets the shared needs of energy and water producers, resource managers, regulators, and decision makers at the federal, state, and local levels. The framework integrates analysis and optimization capabilities to identify trade-offs and 'best' alternatives among a broad list of energy/water options and objectives. The decision support framework is formulated in a modular architecture, facilitating tailored analyses over different geographical regions and scales (e.g., national, state, county, watershed, NERC region). An interactive interface allows direct control of the model and access to real-time results displayed as charts, graphs, and maps. Ultimately, this open and interactive modeling framework provides a tool for evaluating competing policy and technical options relevant to the energy-water nexus.
