Publications

A mesh optimization algorithm to decrease the maximum error in finite element computations

Proceedings of the 17th International Meshing Roundtable, IMR 2008

Hetmaniuk, U.; Knupp, Patrick K.

We present a mesh optimization algorithm for adaptively improving the finite element interpolation of a function of interest. The algorithm minimizes an objective function by swapping edges and moving nodes. Numerical experiments are performed on model problems. The results illustrate that the mesh optimization algorithm can reduce the W1,∞ semi-norm of the interpolation error. For these examples, the L2, L∞, and H1 norms also decreased.
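
Purely as an illustration of the node-movement idea (not the paper's algorithm, which also swaps edges and operates on finite element meshes), the hypothetical Python sketch below greedily moves the interior nodes of a 1D piecewise-linear interpolant to reduce the maximum interpolation error of an assumed function of interest:

```python
import numpy as np

def linf_interp_error(nodes, f, samples=2000):
    """Max |f - piecewise-linear interpolant of f| over [nodes[0], nodes[-1]]."""
    x = np.linspace(nodes[0], nodes[-1], samples)
    return np.max(np.abs(f(x) - np.interp(x, nodes, f(nodes))))

def optimize_nodes(nodes, f, sweeps=50, step=0.02):
    """Greedy node movement: nudge each interior node and keep moves that
    reduce the max interpolation error (a 1D stand-in for objective-driven
    node movement; not the published method)."""
    nodes = nodes.copy()
    for _ in range(sweeps):
        for i in range(1, len(nodes) - 1):
            best = linf_interp_error(nodes, f)
            for delta in (-step, step):
                trial = nodes.copy()
                trial[i] = np.clip(nodes[i] + delta * (nodes[i + 1] - nodes[i - 1]),
                                   nodes[i - 1] + 1e-6, nodes[i + 1] - 1e-6)
                if linf_interp_error(trial, f) < best:
                    nodes, best = trial, linf_interp_error(trial, f)
    return nodes

f = lambda x: np.exp(-40.0 * (x - 0.3) ** 2)   # assumed function of interest
uniform = np.linspace(0.0, 1.0, 11)
print(linf_interp_error(uniform, f), linf_interp_error(optimize_nodes(uniform, f), f))
```

The greedy sweep is only meant to show objective-driven node movement; it says nothing about the performance of the published method.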

More Details

A unified architecture for cognition and motor control based on neuroanatomy, psychophysical experiments, and cognitive behaviors

AAAI Fall Symposium - Technical Report

Rohrer, Brandon R.

A Brain-Emulating Cognition and Control Architecture (BECCA) is presented. It is consistent with the hypothesized functions of pervasive intra-cortical and cortico-subcortical neural circuits. It is able to reproduce many salient aspects of human voluntary movement and motor learning. It also provides plausible mechanisms for many phenomena described in cognitive psychology, including perception and mental modeling. Both "inputs" (afferent channels) and "outputs" (efferent channels) are treated as neural signals; they are all binary (either on or off) and there is no meaning, information, or tag associated with any of them. Although BECCA initially has no internal models, it learns complex interrelations between outputs and inputs through which it bootstraps a model of the system it is controlling and the outside world. BECCA uses two key algorithms to accomplish this: S-Learning and Context-Based Similarity (CBS).

More Details

Individual and group electronic brainstorming in an industrial setting

Proceedings of the Human Factors and Ergonomics Society

Dornburg, Courtney S.; Hendrickson, Stacey M.; Davidson, George S.

An experiment was conducted comparing the effectiveness of individual versus group electronic brainstorming in addressing real-world "wickedly difficult" challenges. Previous laboratory research has engaged small groups of students in answering questions irrelevant to an industrial setting. The current experiment extended this research to larger, real-world employee groups engaged in addressing organization-relevant challenges. Within the present experiment, the data demonstrated that individuals performed at least as well as groups in terms of number of ideas produced and significantly (p<.02) outperformed groups in terms of the quality of those ideas (as measured along the dimensions of originality, feasibility, and effectiveness).

More Details

Understanding virulence mechanisms in M. tuberculosis infection via a circuit-based simulation framework

Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS'08 - "Personalized Healthcare through Technology"

May, Elebeoba E.; Leitao, Andrei; Faulon, Jean-Loup M.; Joo, Jaewook J.; Misra, Milind; Oprea, Tudor I.

Tuberculosis (TB), caused by the bacterium Mycobacterium tuberculosis (Mtb), is a growing international health crisis. Mtb is able to persist in host tissues in a nonreplicating persistent (NRP) or latent state. This presents a challenge in the treatment of TB. Latent TB can reactivate in 10% of individuals with normal immune systems, and at a higher rate in those with compromised immune systems. A quantitative understanding of latency-associated virulence mechanisms may help researchers develop more effective methods to combat the spread of TB and reduce TB-associated fatalities. Leveraging BioXyce's ability to simulate whole-cell and multi-cellular systems, we are developing a circuit-based framework to investigate the impact of pathogenicity-associated pathways on the latency/reactivation phase of tuberculosis infection. We discuss efforts to simulate metabolic pathways that potentially impact the ability of Mtb to persist within host immune cells. We demonstrate how simulation studies can provide insight regarding the efficacy of potential anti-TB agents on biological networks critical to Mtb pathogenicity using a systems chemical biology approach. © 2008 IEEE.

More Details

Model calibration under uncertainty: Matching distribution information

12th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, MAO

Swiler, Laura P.; Adams, Brian M.; Eldred, Michael S.

We develop an approach for estimating model parameters which result in the "best distribution fit" between experimental and simulation data. Best distribution fit means matching moments of experimental data to those of a simulation (and possibly matching a full probability distribution). This approach extends typical nonlinear least squares methods which identify parameters maximizing agreement between experimental points and computational simulation results. Several analytic formulations for the distribution matching problem are provided, along with results for solving test problems and comparisons of this parameter estimation technique with a deterministic least squares approach. Copyright © 2008 by the American Institute of Aeronautics and Astronautics, Inc.
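
A minimal sketch of the moment-matching idea, with a hypothetical two-parameter simulator standing in for the real simulation (the paper's analytic formulations and test problems are not reproduced here): the calibration picks parameters that minimize the mismatch between simulated and experimental moments rather than between individual data points.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for the simulation: predicts an output given
# parameters theta and a random input xi (the real code would run the model).
def simulate(theta, xi):
    return theta[0] * xi + theta[1] * xi ** 2

rng = np.random.default_rng(0)
xi_samples = rng.normal(1.0, 0.2, size=500)
experimental_data = 2.0 * xi_samples + 0.5 * xi_samples ** 2 + rng.normal(0.0, 0.05, 500)

def moment_mismatch(theta):
    """Squared mismatch of mean and standard deviation between simulated and
    experimental outputs: the 'best distribution fit' idea in its simplest
    two-moment form."""
    sim = simulate(theta, xi_samples)
    return ((sim.mean() - experimental_data.mean()) ** 2
            + (sim.std() - experimental_data.std()) ** 2)

result = minimize(moment_mismatch, x0=[1.0, 1.0], method="Nelder-Mead")
print(result.x)   # calibrated parameters
```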

More Details

Scheduling manual sampling for contamination detection in municipal water networks

8th Annual Water Distribution Systems Analysis Symposium 2006

Berry, Jonathan W.; Lin, Henry; Lauer, Erik; Phillips, Cynthia

Cities without an early warning system of indwelling sensors can consider monitoring their networks manually, especially during times of heightened security levels. We consider the problem of calculating an optimal schedule for manual sampling in a municipal water network. Preliminary computations with a small-scale example indicate that during normal times, manual sampling can provide some benefit, but it is far inferior to an indwelling sensor network. However, given information that significantly constrains the nature of an imminent threat, manual sampling can perform as well as a small sensor network designed to handle normal threats. Copyright ASCE 2006.

More Details

Variational multiscale residual-based turbulence modeling for large eddy simulation of incompressible flows

Computer Methods in Applied Mechanics and Engineering

Bazilevs, Y.; Calo, V.M.; Cottrell, J.A.; Hughes, T.J.R.; Reali, A.; Scovazzi, Guglielmo S.

We present an LES-type variational multiscale theory of turbulence. Our approach derives completely from the incompressible Navier-Stokes equations and does not employ any ad hoc devices, such as eddy viscosities. We tested the formulation on forced homogeneous isotropic turbulence and turbulent channel flows. In the calculations, we employed linear, quadratic and cubic NURBS. A dispersion analysis of simple model problems revealed NURBS elements to be superior to classical finite elements in approximating advective and diffusive processes, which play a significant role in turbulence computations. The numerical results are very good and confirm the viability of the theoretical framework. © 2007 Elsevier B.V. All rights reserved.
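
Schematically, the residual-based variational multiscale closure expresses the unresolved fine scales in terms of coarse-scale residuals; the precise definitions of the stabilization parameters and residuals are given in the paper and are not reproduced here.

```latex
% Schematic scale decomposition and residual-based fine-scale closure; the
% stabilization parameters \tau_M, \tau_C and the coarse-scale residuals
% r_M, r_C are defined in the paper and not reproduced here.
\mathbf{u} = \bar{\mathbf{u}} + \mathbf{u}', \qquad p = \bar{p} + p',
\qquad
\mathbf{u}' \approx -\tau_M\,\mathbf{r}_M(\bar{\mathbf{u}},\bar{p}),
\qquad
p' \approx -\tau_C\, r_C(\bar{\mathbf{u}}).
```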

More Details

Evaluating NIC hardware requirements to achieve high message rate PGAS support on multi-core processors

Proceedings of the 2007 ACM/IEEE Conference on Supercomputing, SC'07

Underwood, Keith; Levenhagen, Michael J.; Brightwell, Ronald B.

Partitioned global address space (PGAS) programming models have been identified as one of the few viable approaches for dealing with emerging many-core systems. These models tend to generate many small messages, which requires specific support from the network interface hardware to enable efficient execution. In the past, Cray included E-registers on the Cray T3E to support the SHMEM API; however, with the advent of multi-core processors, the balance of computation to communication capabilities has shifted toward computation. This paper explores the message rates that are achievable with multi-core processors and simplified PGAS support on a more conventional network interface. For message rate tests, we find that simple network interface hardware is more than sufficient. We also find that even typical data distributions, such as cyclic or block-cyclic, do not need specialized hardware support. Finally, we assess the impact of such support on the well-known RandomAccess benchmark. © 2007 ACM.

More Details

EXACT: The experimental algorithmics computational toolkit

Proceedings of the 2007 Workshop on Experimental Computer Science

Hart, William E.; Berry, Jonathan W.; Heaphy, Robert T.; Phillips, Cynthia A.

In this paper, we introduce EXACT, the EXperimental Algorithmics Computational Toolkit. EXACT is a software framework for describing, controlling, and analyzing computer experiments. It provides the experimentalist with convenient software tools to ease and organize the entire experimental process, including the description of factors and levels, the design of experiments, the control of experimental runs, the archiving of results, and analysis of results. As a case study for EXACT, we describe its interaction with FAST, the Sandia Framework for Agile Software Testing. EXACT and FAST now manage the nightly testing of several large software projects at Sandia. We also discuss EXACT's advanced features, which include a driver module that controls complex experiments such as comparisons of parallel algorithms. Copyright 2007 ACM.
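
The workflow of factors, levels, designed experiments, and controlled runs can be pictured with a generic full-factorial expansion; this hypothetical snippet does not use EXACT's actual API and only illustrates the kind of experiment description the toolkit manages.

```python
from itertools import product

# Hypothetical factors and levels for an algorithm-comparison experiment.
factors = {
    "algorithm": ["greedy", "branch-and-bound"],
    "instance_size": [100, 1000, 10000],
    "seed": [0, 1, 2],
}

def full_factorial(factors):
    """Enumerate every combination of factor levels (a full-factorial design)."""
    names = list(factors)
    for combo in product(*(factors[n] for n in names)):
        yield dict(zip(names, combo))

for run in full_factorial(factors):
    # A framework like EXACT would launch each run, archive the results, and
    # tag them with the factor levels; here we just print the design.
    print(run)
```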

More Details

Optimal monitoring location selection for water quality issues

Restoring Our Natural Habitat - Proceedings of the 2007 World Environmental and Water Resources Congress

Boccelli, Dominic L.; Hart, William E.

Recently, extensive focus has been placed on determining the optimal locations of sensors within a distribution system to minimize the impact on public health from intentional intrusion events. Modified versions of these tools may have additional benefits for determining monitoring locations for other more common objectives associated with distribution systems. A modified Sensor Placement Optimization Tool (SPOT) is presented that can be used for satisfying more generic location problems such as determining monitoring locations for tracer tests or disinfectant byproduct sampling. The utility for the modified SPOT algorithm is discussed with respect to implementing a distribution system field-scale tracer study. © 2007 ASCE.

More Details

On the effects of memory latency and bandwidth on supercomputer application performance

Proceedings of the 2007 IEEE International Symposium on Workload Characterization, IISWC

Murphy, Richard C.

Since the first vector supercomputers in the mid-1970s, the largest scale applications have traditionally been floating point oriented numerical codes, which can be broadly characterized as the simulation of physics on a computer. Supercomputer architectures have evolved to meet the needs of those applications. Specifically, the computational work of the application tends to be floating point oriented, and the decomposition of the problem is two or three dimensional. Today, an emerging class of critical applications may change those assumptions: they are combinatorial in nature, integer oriented, and irregular. The performance of both classes of applications is dominated by the performance of the memory system. This paper compares the memory performance sensitivity of both traditional and emerging HPC applications, and shows that the new codes are significantly more sensitive to memory latency and bandwidth than their traditional counterparts. Additionally, these codes exhibit lower baseline performance, which only exacerbates the problem. As a result, the construction of future supercomputer architectures to support these applications will most likely be different from those used to support traditional codes. Quantitatively understanding the difference between the two workloads will form the basis for future design choices. © 2007 IEEE.

More Details

An extended finite element method formulation for modeling the response of polycrystalline materials to dynamic loading

AIP Conference Proceedings

Robbins, Joshua R.; Voth, Thomas E.

The extended finite element method (X-FEM) is a finite-element based discretization technique developed originally to model dynamic crack propagation [1]. Since that time the method has been used for modeling physics ranging from static meso-scale material failure to dendrite growth. Here we adapt the recent advances of Vitali and Benson [2] and Song et al. [3] to model dynamic loading of a polycrystalline material. We use demonstration problems to examine the method's efficacy for modeling the dynamic response of polycrystalline materials at the meso-scale. Specifically, we use the X-FEM to model grain boundaries. This approach allows us to i) eliminate ad hoc mixture rules for multi-material elements and ii) avoid explicitly meshing grain boundaries. © 2007 American Institute of Physics.

More Details

Toward a more rigorous application of margins and uncertainties within the nuclear weapons life cycle : a Sandia perspective

Diegert, Kathleen V.; Klenke, S.E.; Paulsen, Robert A.; Pilch, Martin P.; Trucano, Timothy G.

This paper presents the conceptual framework that is being used to define quantification of margins and uncertainties (QMU) for application in the nuclear weapons (NW) work conducted at Sandia National Laboratories. The conceptual framework addresses the margins and uncertainties throughout the NW life cycle and includes the definition of terms related to QMU and to figures of merit. Potential applications of QMU consist of analyses based on physical data and on modeling and simulation. Appendix A provides general guidelines for addressing cases in which significant and relevant physical data are available for QMU analysis. Appendix B gives the specific guidance that was used to conduct QMU analyses in cycle 12 of the annual assessment process. Appendix C offers general guidelines for addressing cases in which appropriate models are available for use in QMU analysis. Appendix D contains an example that highlights the consequences of different treatments of uncertainty in model-based QMU analyses.

More Details

The analysis of a sparse grid stochastic collocation method for partial differential equations with high-dimensional random input data

Webster, Clayton G.

This work describes the convergence analysis of a Smolyak-type sparse grid stochastic collocation method for the approximation of statistical quantities related to the solution of partial differential equations with random coefficients and forcing terms (input data of the model). To compute solution statistics, the sparse grid stochastic collocation method uses approximate solutions, produced here by finite elements, corresponding to a deterministic set of points in the random input space. This naturally requires solving uncoupled deterministic problems and, as such, the derived strong error estimates for the fully discrete solution are used to compare the computational efficiency of the proposed method with the Monte Carlo method. Numerical examples illustrate the theoretical results and are used to compare this approach with several others, including the standard Monte Carlo.
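
In outline, the sparse grid stochastic collocation estimate of a solution statistic combines uncoupled deterministic finite element solves at collocation points with quadrature weights; this is a schematic statement of the idea, not the paper's notation or error analysis.

```latex
% Schematic collocation estimate of a solution statistic: u_h(\cdot, y_k) is
% the deterministic finite element solution at sparse-grid point y_k in the
% random-input space, and w_k the associated quadrature weight.
\mathbb{E}[u](x) \;\approx\; \sum_{k=1}^{N} w_k\, u_h(x, y_k).
```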

More Details

Searching for globally optimal functional forms for interatomic potentials using genetic programming with parallel tempering

Journal of Computational Chemistry

Slepoy, Alexander S.; Peters, Michael D.; Thompson, Aidan P.

Molecular dynamics and other molecular simulation methods rely on a potential energy function, based only on the relative coordinates of the atomic nuclei. Such a function, called a force field, approximately represents the electronic structure interactions of a condensed matter system. Developing such approximate functions and fitting their parameters remains an arduous, time-consuming process, relying on expert physical intuition. To address this problem, a functional programming methodology was developed that may enable automated discovery of entirely new force-field functional forms, while simultaneously fitting parameter values. The method uses a combination of genetic programming, Metropolis Monte Carlo importance sampling and parallel tempering, to efficiently search a large space of candidate functional forms and parameters. The methodology was tested using a nontrivial problem with a well-defined globally optimal solution: a small set of atomic configurations was generated and the energy of each configuration was calculated using the Lennard-Jones pair potential. Starting with a population of random functions, our fully automated, massively parallel implementation of the method reproducibly discovered the original Lennard-Jones pair potential by searching for several hours on 100 processors, sampling only a minuscule portion of the total search space. This result indicates that, with further improvement, the method may be suitable for unsupervised development of more accurate force fields with completely new functional forms. © 2007 Wiley Periodicals, Inc.
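
As an illustration of the test problem's setup only (not the authors' genetic-programming code), the reference Lennard-Jones energies and a sum-of-squared-error fitness for candidate functional forms might look like the following sketch; the GP search described in the paper would evolve `candidate_energy_fn` from a population of random expression trees.

```python
import numpy as np

def lennard_jones_energy(positions, epsilon=1.0, sigma=1.0):
    """Total Lennard-Jones energy of a configuration: the known 'answer'
    that the genetic-programming search is asked to rediscover."""
    energy = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            energy += 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return energy

def fitness(candidate_energy_fn, configurations):
    """Sum of squared errors of a candidate functional form against the
    reference energies (lower is better)."""
    return sum((candidate_energy_fn(c) - lennard_jones_energy(c)) ** 2
               for c in configurations)

rng = np.random.default_rng(1)
configs = [rng.uniform(0.9, 3.0, size=(4, 3)) for _ in range(10)]
print(fitness(lennard_jones_energy, configs))   # the exact form scores 0
```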

More Details

Microstructural modeling of ferroic switching and phase transitions in PZT

Proceedings of SPIE - The International Society for Optical Engineering

Robbins, Joshua R.; Khraishi, Tariq A.; Chaplya, Pavel

Niobium doped Lead Zirconate Titanate (PZT) with a Zr/Ti ratio of 95/5 (i.e., PZT 95/5-2Nb) is a ferroelectric with a rhombohedral structure at room temperature. A crystal (or a subdomain within a crystal) exhibits a spontaneous polarization in any one of eight crystallographically equivalent directions. Such a material becomes polarized when subjected to a large electric field. When the electric field is removed, a remanent polarization remains and a bound charge is stored. A displacive phase transition from a rhombohedral ferroelectric phase to an orthorhombic anti-ferroelectric phase can be induced with the application of a mechanical load. When this occurs, the material becomes depoled and the bound charge is released. The polycrystalline character of PZT 95/5-2Nb leads to highly non-uniform fields at the grain scale. These local fields lead to very complex material behavior during mechanical depoling that has important implications for device design and performance. This paper presents a microstructurally based numerical model that describes the 3D non-linear behavior of ferroelectric ceramics. The model resolves the structure of polycrystals directly in the topology of the problem domain and uses the extended finite element method (X-FEM) to solve the governing equations of electromechanics. The material response is computed from anisotropic single crystal constants and the volume fractions of the various polarization variants (i.e., three variants for rhombohedral anti-ferroelectric and eight for rhombohedral ferroelectric ceramic). Evolution of the variant volume fractions is governed by the minimization of internally stored energy and accounts for ferroelectric and ferroelastic domain switching and phase transitions in response to the applied loads. The developed model is used to examine hydrostatic depoling in PZT 95/5-2Nb.

More Details

Final report on LDRD project : coupling strategies for multi-physics applications

Hopkins, Matthew M.; Pawlowski, Roger P.; Moffat, Harry K.; Carnes, Brian C.; Hooper, Russell H.

Many current and future modeling applications at Sandia, including ASC milestones, will critically depend on the simultaneous solution of vastly different physical phenomena. Issues due to code coupling are often not addressed, understood, or even recognized. The objectives of the LDRD have been both theoretical and in code development. We show that we have provided a fundamental analysis of coupling, i.e., when strong coupling versus a successive-substitution strategy is needed. We have enabled the implementation of tighter coupling strategies through additions to the NOX and Sierra code suites to make coupling strategies available now, leveraging existing functionality to do so. Specifically, we have built into NOX the capability to handle fully coupled simulations from multiple codes, as well as the capability to handle Jacobian-free Newton-Krylov simulations that link multiple applications. We show how this capability may be accessed from within the Sierra Framework as well as from outside of Sierra. The critical impact of this LDRD is that we have shown how, and have delivered strategies for, enabling strong Newton-based coupling while respecting the modularity of existing codes. This will facilitate the use of these codes in a coupled manner to solve multi-physics applications.

More Details

The acquisition of dangerous biological materials: Technical fact sheets to assist risk assessments of 46 potential BW agents

Astuto Gribble, Lisa A.; Gaudioso, Jennifer M.

Numerous terrorist organizations have openly expressed interest in producing and deploying biological weapons. However, a limiting factor for many terrorists has been the acquisition of dangerous biological agents, as evidenced by the very few successful instances of biological weapons use compared to the number of documented hoaxes. Biological agents vary greatly in their ability to cause loss of life and economic damage. Some agents, if released properly, can kill many people and cause an extensive number of secondary infections; other agents will sicken only a small number of people for a short period of time. Consequently, several biological agents can potentially be used to perpetrate a bioterrorism attack but few are likely capable of causing a high consequence event. It is crucial, from a US national security perspective, to more deeply understand the likelihood that terrorist organizations can acquire the range of these agents. Few studies have attempted to comprehensively compile the technical information directly relevant to the acquisition of dangerous bacteria, viruses, and toxins. In this report, technical fact sheets were assembled for 46 potentially dangerous biological agents. Much of the information was taken from various research sources; compiled in this form, it could ultimately and significantly expedite and improve bioterrorism threat assessments. By systematically examining a number of specific agent characteristics included in these fact sheets, it may be possible to detect, target, and implement measures to thwart future terrorist acquisition attempts. In addition, the information in these fact sheets may be used as a tool to help laboratories gain a rudimentary understanding of how attractive laboratory theft is as an acquisition method relative to other potential acquisition modes.

More Details

Behavior-aware decision support systems : LDRD final report

Backus, George A.; Strip, David R.

As Sandia National Laboratories serves its mission to provide support for the security-related interests of the United States, it is faced with considering the behavioral responses that drive problems, mitigate interventions, or lead to unintended consequences. The effort described here expands earlier works in using healthcare simulation to develop behavior-aware decision support systems. This report focuses on using qualitative choice techniques and enhancing two analysis models developed in a sister project.

More Details

Accommodating complexity and human behaviors in decision analysis

Backus, George A.; Strip, David R.; Siirola, John D.; Bastian, Mark S.; Schoenwald, David A.; Braithwaite, Karl R.

This is the final report for a LDRD effort to address human behavior in decision support systems. One sister LDRD effort reports the extension of this work to include actual human choices and additional simulation analyses. Another provides the background for this effort and the programmatic directions for future work. This specific effort considered the feasibility of five aspects of model development required for analysis viability. To avoid the use of classified information, healthcare decisions and the system embedding them became the illustrative example for assessment.

More Details

Large-scale transient sensitivity analysis of a radiation damaged bipolar junction transistor

Bartlett, Roscoe B.; Hoekstra, Robert J.

Automatic differentiation (AD) is useful in transient sensitivity analysis of a computational simulation of a bipolar junction transistor subject to radiation damage. We used forward-mode AD, implemented in a new Trilinos package called Sacado, to compute analytic derivatives for implicit time integration and forward sensitivity analysis. Sacado addresses element-based simulation codes written in C++ and works well with forward sensitivity analysis as implemented in the Trilinos time-integration package Rythmos. The forward sensitivity calculation is significantly more efficient and robust than finite differencing.
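
Sacado itself is a C++ package used inside element-level code; purely to illustrate the forward-mode idea it implements, here is a minimal dual-number sketch in Python (hypothetical, not Sacado's API).

```python
class Dual:
    """Minimal forward-mode AD value: carries f(x) and df/dx together,
    the same idea Sacado applies to element-based C++ code."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__

# d/dx of f(x) = x*x + 3x at x = 2 is 2*2 + 3 = 7 -- recovered exactly,
# without the step-size error of finite differencing.
x = Dual(2.0, 1.0)            # seed dx/dx = 1
f = x * x + 3 * x
print(f.value, f.deriv)       # 10.0 7.0
```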

More Details

Electron transport in zinc-blende wurtzite biphasic gallium nitride nanowires and GaNFETs

Nanotechnology

Jacobs, Benjamin W.; Ayres, Virginia M.; Stallcup, Richard E.; Hartman, Alan; Tupta, Mary A.; Baczewski, Andrew D.; Crimp, Martin A.; Halpern, Joshua B.; He, Maoqi; Shaw, Harry C.

Two-point and four-point probe electrical measurements of a biphasic gallium nitride nanowire and current–voltage characteristics of a gallium nitride nanowire based field effect transistor are reported. The biphasic gallium nitride nanowires have a crystalline homostructure consisting of wurtzite and zinc-blende phases that grow simultaneously in the longitudinal direction. There is a sharp transition of one to a few atomic layers between the phases. All measurements showed high current densities. Evidence of single-phase current transport in the biphasic nanowire structure is discussed.

More Details

Predictive Capability Maturity Model for computational modeling and simulation

Pilch, Martin P.; Oberkampf, William L.; Trucano, Timothy G.

The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

More Details

A mathematical framework for multiscale science and engineering : the variational multiscale method and interscale transfer operators

Bochev, Pavel B.; Collis, Samuel S.; Jones, Reese E.; Lehoucq, Richard B.; Parks, Michael L.; Scovazzi, Guglielmo S.; Silling, Stewart A.; Templeton, Jeremy A.; Wagner, Gregory J.

This report is a collection of documents written as part of the Laboratory Directed Research and Development (LDRD) project A Mathematical Framework for Multiscale Science and Engineering: The Variational Multiscale Method and Interscale Transfer Operators. We present developments in two categories of multiscale mathematics and analysis. The first, continuum-to-continuum (CtC) multiscale, includes problems that allow application of the same continuum model at all scales with the primary barrier to simulation being computing resources. The second, atomistic-to-continuum (AtC) multiscale, represents applications where detailed physics at the atomistic or molecular level must be simulated to resolve the small scales, but the effect on and coupling to the continuum level is frequently unclear.

More Details

Implementing wide baseline matching algorithms on a graphics processing unit

Myers, Daniel S.; Gonzales, Antonio G.; Rothganger, Fredrick R.; Larson, K.W.

Wide baseline matching is the state of the art for object recognition and image registration problems in computer vision. Though effective, these algorithms are computationally expensive, which limits their application to many real-world problems. The performance of wide baseline matching algorithms may be improved by using a graphics processing unit as a fast multithreaded co-processor. In this paper, we present an implementation of the difference-of-Gaussians feature extractor, based on the CUDA system of GPU programming developed by NVIDIA, and implemented on their hardware. For a 2000×2000 pixel image, the GPU-based method executes nearly thirteen times faster than a comparable CPU-based method, with no significant loss of accuracy.
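
For orientation only, a plain CPU/NumPy reference of the difference-of-Gaussians computation is sketched below; the paper's contribution is mapping this same computation onto CUDA threads, which is not reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def difference_of_gaussians(image, sigma=1.6, k=2 ** 0.5, levels=4):
    """CPU reference for a difference-of-Gaussians scale stack: blur the image
    at successively larger scales and subtract adjacent levels."""
    blurred = [gaussian_filter(image, sigma * k ** i) for i in range(levels + 1)]
    return [blurred[i + 1] - blurred[i] for i in range(levels)]

image = np.random.rand(2000, 2000).astype(np.float32)   # stand-in image
dog = difference_of_gaussians(image)
print(len(dog), dog[0].shape)
```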

More Details

Massive graph visualization : LDRD final report

Moreland, Kenneth D.; Wylie, Brian N.

Graphs are a vital way of organizing data with complex correlations. A good visualization of a graph can fundamentally change human understanding of the data. Consequently, there is a rich body of work on graph visualization. Although there are many techniques that are effective on small to medium sized graphs (tens of thousands of nodes), there is a void in the research for visualizing massive graphs containing millions of nodes. Sandia is one of the few entities in the world that has the means and motivation to handle data on such a massive scale. For example, homeland security generates graphs from prolific media sources such as television, telephone, and the Internet. The purpose of this project is to provide the groundwork for visualizing such massive graphs. The research provides for two major feature gaps: a parallel, interactive visualization framework and scalable algorithms to make the framework usable to a practical application. Both the frameworks and algorithms are designed to run on distributed parallel computers, which are already available at Sandia. Some features are integrated into the ThreatView™ application and future work will integrate further parallel algorithms.

More Details

Statistical coarse-graining of molecular dynamics into peridynamics

Lehoucq, Richard B.; Silling, Stewart A.

This paper describes an elegant statistical coarse-graining of molecular dynamics at finite temperature into peridynamics, a continuum theory. Peridynamics is an efficient alternative to molecular dynamics enabling dynamics at larger length and time scales. In direct analogy with molecular dynamics, peridynamics uses a nonlocal model of force and does not employ stress/strain relationships germane to classical continuum mechanics. In contrast with classical continuum mechanics, the peridynamic representation of a system of linear springs and masses is shown to have the same dispersion relation as the original spring-mass system.
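
For reference, the textbook dispersion relation of a one-dimensional chain of masses connected by linear springs, which the report shows is reproduced by the peridynamic representation, is:

```latex
% Dispersion relation of a one-dimensional chain of masses m coupled by
% springs of stiffness k with spacing a (\kappa is the wavenumber).
\omega(\kappa) \;=\; 2\sqrt{\tfrac{k}{m}}\;\bigl|\sin\!\bigl(\tfrac{\kappa a}{2}\bigr)\bigr|.
```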

More Details

Titanium cholla : lightweight, high-strength structures for aerospace applications

Gill, David D.; Atwood, Clinton J.; Robbins, Joshua R.; Voth, Thomas E.

Aerospace designers seek lightweight, high-strength structures to lower launch weight while creating structures that are capable of withstanding launch loadings. Most 'light-weighting' is done through an expensive, time-consuming, iterative method requiring experience and a repeated design/test/redesign sequence until an adequate solution is obtained. Little successful work has been done in the application of generalized 3D optimization due to the difficulty of analytical solutions, the large computational requirements of computerized solutions, and the inability to manufacture many optimized structures with conventional machining processes. The Titanium Cholla LDRD team set out to create generalized 3D optimization routines, a set of analytically optimized 3D structures for testing the solutions, and a method of manufacturing these complex optimized structures. The team developed two new computer optimization solutions: Advanced Topological Optimization (ATO) and FlexFEM, an optimization package utilizing the eXtended Finite Element Method (XFEM) software for stress analysis. The team also developed several new analytically defined classes of optimized structures. Finally, the team developed a 3D capability for the Laser Engineered Net Shaping™ (LENS®) additive manufacturing process including process planning for 3D optimized structures. This report gives individual examples as well as one generalized example showing the optimized solutions and an optimized metal part.

More Details

Simulation of neutron radiation damage in silicon semiconductor devices

Hoekstra, Robert J.; Castro, Joseph P.; Shadid, John N.; Fixel, Deborah A.

A code, Charon, is described which simulates the effects that neutron damage has on silicon semiconductor devices. The code uses a stabilized, finite-element discretization of the semiconductor drift-diffusion equations. The mathematical model used to simulate semiconductor devices in both normal and radiation environments will be described. Modeling of defect complexes is accomplished by adding an additional drift-diffusion equation for each of the defect species. Additionally, details are given describing how Charon can efficiently solve very large problems using modern parallel computers. Comparison between Charon and experiment will be given, as well as comparison with results from commercially-available TCAD codes.
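
For context, a standard schematic statement of the drift-diffusion model is given below; Charon's exact formulation, its stabilized finite element discretization, and the additional continuity equation added for each defect species are described in the paper and are not reproduced here.

```latex
% Standard drift-diffusion model (schematic). \psi is the electrostatic
% potential, n and p the electron and hole densities, C the net doping,
% R the net recombination rate, \mu and D mobilities and diffusivities.
-\nabla\cdot(\epsilon\nabla\psi) = q\,(p - n + C), \\
\frac{\partial n}{\partial t} = \frac{1}{q}\nabla\cdot\mathbf{J}_n - R, \qquad
\frac{\partial p}{\partial t} = -\frac{1}{q}\nabla\cdot\mathbf{J}_p - R, \\
\mathbf{J}_n = q\mu_n n\,\mathbf{E} + qD_n\nabla n, \qquad
\mathbf{J}_p = q\mu_p p\,\mathbf{E} - qD_p\nabla p, \qquad
\mathbf{E} = -\nabla\psi .
```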

More Details

Portable, chronic neural interface system design for sensory augmentation

Proceedings of the 3rd International IEEE EMBS Conference on Neural Engineering

Olsson, Roy H.; Wojciechowski, Kenneth W.; Yepez, Esteban Y.; Novick, David K.; Peterson, K.A.; Turner, Timothy S.; Wheeler, Jason W.; Rohrer, Brandon R.; Kholwadwala, Deepesh K.

While existing work in neural interfaces is largely geared toward the restoration of lost function in amputees or victims of neurological injuries, similar technology may also facilitate augmentation of healthy subjects. One example is the potential to learn a new, unnatural sense through a neural interface. The use of neural interfaces in healthy subjects would require an even greater level of safety and convenience than in disabled subjects, including reliable, robust bidirectional implants with highly-portable components outside the skin. We present our progress to date in the development of a bidirectional neural interface system intended for completely untethered use. The system consists of a wireless stimulating and recording peripheral nerve implant powered by a rechargeable battery, and a wearable package that communicates wirelessly both with the implant and with a computer or a network of independent sensor nodes. Once validated, such a system could permit the exploration of increasingly realistic use of neural interfaces both for restoration and for augmentation. © 2007 IEEE.

More Details

Coupling volume-of-fluid based interface reconstructions with the extended finite element method

Computer Methods in Applied Mechanics and Engineering

Voth, Thomas E.; Mosso, Stewart J.; Robbins, Joshua R.

Here, we examine the coupling of the patterned-interface-reconstruction (PIR) algorithm with the extended finite element method (X-FEM) for general multi-material problems over structured and unstructured meshes. The coupled method offers the advantages of allowing for local, element-based reconstructions of the interface, and facilitates the imposition of discrete conservation laws. Of particular note is the use of an interface representation that is volume-of-fluid based, giving rise to a segmented interface representation that is not continuous across element boundaries. In conjunction with such a representation, we employ enrichment with the ridge function for treating material interfaces and an analog to Heaviside enrichment for treating free surfaces. We examine a series of benchmark problems that quantify the convergence aspects of the coupled method and examine the sensitivity to noise in the interface reconstruction. Finally, the fidelity of a remapping strategy is also examined for a moving interface problem.
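
For readers unfamiliar with the enrichment, one common statement of the ridge function used for material interfaces in the X-FEM literature is the following; the paper's exact form may differ.

```latex
% A common ridge (modified-abs) enrichment for material interfaces: \phi_i are
% nodal level-set values and N_i the standard shape functions. The function
% has a kink at the interface and vanishes in elements not cut by it.
\psi(\mathbf{x}) \;=\; \sum_i \lvert\phi_i\rvert\,N_i(\mathbf{x})
\;-\; \Bigl\lvert \sum_i \phi_i\,N_i(\mathbf{x}) \Bigr\rvert .
```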

More Details

Yellow sticky, PHP software for an electronic brainstorming experiment

Dornburg, Courtney S.; Davidson, George S.; Forsythe, James C.

A web-based brainstorm was conducted in the summer of 2007 within the Sandia Restricted Network. This brainstorming experiment was modeled around the 'yellow sticky' brainstorms that are used in many face-to-face meetings at Sandia National Laboratories. This document discusses the implementation and makes suggestions for future implementations.

More Details

Architectural considerations for agent-based national scale policy models : LDRD final report

Strip, David R.; Backus, George A.

The need to anticipate the consequences of policy decisions becomes ever more important as the magnitude of the potential consequences grows. The multiplicity of connections between the components of society and the economy makes intuitive assessments extremely unreliable. Agent-based modeling has the potential to be a powerful tool in modeling policy impacts. The direct mapping between agents and elements of society and the economy simplifies the mapping of real-world functions into the world of computational assessment. Our modeling initiative is motivated by the desire to facilitate informed public debate on alternative policies for how we, as a nation, provide healthcare to our population. We explore the implications of this motivation on the design and implementation of a model. We discuss the choice of an agent-based modeling approach and contrast it to micro-simulation and systems dynamics approaches.

More Details

Massively parallel collaboration : a literature review

Dornburg, Courtney S.; Adams, Susan S.; Forsythe, James C.; Davidson, George S.

The present paper explores group dynamics and electronic communication, two components of wicked problem solving that are inherent to the national security environment (as well as many other business environments). First, because there can be no "right" answer or solution without first having agreement about the definition of the problem and the social meaning of a "right solution", these problems (often) fundamentally relate to the social aspects of groups, an area with much empirical research and application still needed. Second, as computer networks have been increasingly used to conduct business with decreased costs, increased information accessibility, and rapid document, database, and message exchange, electronic communication enables a new form of problem solving group that has yet to be well understood, especially as it relates to solving wicked problems.

More Details

Assessing the effectiveness of electronic brainstorming in an industrial setting : experimental design document

Adams, Susan S.; Davidson, George S.; Dornburg, Courtney S.; Forsythe, James C.

An experiment is proposed which will compare the effectiveness of individual versus group brainstorming in addressing difficult, real world challenges. Previous research into electronic brainstorming has largely been limited to laboratory experiments using small groups of students answering questions irrelevant to an industrial setting. The proposed experiment attempts to extend current findings to real-world employees and organization-relevant challenges. Our employees will brainstorm ideas over the course of several days, echoing the real-world scenario in an industrial setting. The methodology and hypotheses to be tested are presented along with two questions for the experimental brainstorming sessions. One question has been used in prior work and will allow calibration of the new results with existing work. The second question qualifies as a complicated, perhaps even wickedly hard, question, with relevance to modern management practices.

More Details

LDRD final report : robust analysis of large-scale combinatorial applications

Hart, William E.; Carr, Robert D.; Phillips, Cynthia A.; Watson, Jean-Paul W.

Discrete models of large, complex systems like national infrastructures and complex logistics frameworks naturally incorporate many modeling uncertainties. Consequently, there is a clear need for optimization techniques that can robustly account for risks associated with modeling uncertainties. This report summarizes the progress of the Late-Start LDRD 'Robust Analysis of Large-scale Combinatorial Applications'. This project developed new heuristics for solving robust optimization models, and developed new robust optimization models for describing uncertainty scenarios.

More Details

Post-processing V&V level II ASC milestone (2360) results

Moreland, Kenneth D.; Chavez, Elmer A.; Weirs, Vincent G.; Brunner, Thomas A.; Trucano, Timothy G.; Karelitz, David B.

The 9/30/2007 ASC Level 2 Post-Processing V&V Milestone (Milestone 2360) contains functionality required by the user community for certain verification and validation tasks. These capabilities include loading of edge and face data on an Exodus mesh, run-time computation of an exact solution to a verification problem, delivery of results data from the server to the client, computation of an integral-based error metric, simultaneous loading of simulation and test data, and comparison of that data using visual and quantitative methods. The capabilities were tested extensively by performing a typical ALEGRA HEDP verification task. In addition, a number of stretch criteria were met including completion of a verification task on a 13 million element mesh.

More Details

Improving human effectiveness for extreme-scale problem solving : final report (assessing the effectiveness of electronic brainstorming in an industrial setting)

Davidson, George S.; Dornburg, Courtney S.; Adams, Susan S.; Hendrickson, Stacey M.; Bauer, Travis L.; Forsythe, James C.

An experiment was conducted comparing the effectiveness of individual versus group electronic brainstorming in order to address difficult, real world challenges. While industrial reliance on electronic communications has become ubiquitous, empirical and theoretical understanding of the bounds of its effectiveness have been limited. Previous research using short-term, laboratory experiments have engaged small groups of students in answering questions irrelevant to an industrial setting. The current experiment extends current findings beyond the laboratory to larger groups of real-world employees addressing organization-relevant challenges over the course of four days. Findings are twofold. First, the data demonstrate that (for this design) individuals perform at least as well as groups in producing quantity of electronic ideas, regardless of brainstorming duration. However, when judged with respect to quality along three dimensions (originality, feasibility, and effectiveness), the individuals significantly (p<0.05) outperformed the group working together. The theoretical and applied (e.g., cost effectiveness) implications of this finding are discussed. Second, the current experiment yielded several viable solutions to the wickedly difficult problem that was posed.

More Details

Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology

Davidson, George S.; Brown, William M.

Techniques for high-throughput determination of interactomes, together with high-resolution protein colocalization maps within organelles and across membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

More Details

LDRD 102610 final report : new processes for innovative microsystems engineering with predictive simulation

Wills, Ann E.

This LDRD final report describes work that Stephen W. Thomas performed in 2006. The initial problem was to develop a modeling, simulation, and optimization strategy for the design of a high-speed microsystem switch. The challenge was to model the right phenomena at the right level of fidelity, and to capture the right design parameters. This effort focused on the design context, in contrast to other Sandia efforts that focus on high-fidelity assessment. This report contains the initial proposal and the annual progress report. It also describes exploratory work on micromachining using femtosecond lasers. Thomas's time developing a proposal and collaborating on this topic was partly funded by this LDRD.

More Details