We present a mesh optimization algorithm for adaptively improving the finite element interpolation of a function of interest. The algorithm minimizes an objective function by swapping edges and moving nodes. Numerical experiments are performed on model problems. The results illustrate that the mesh optimization algorithm can reduce the W1,∞ semi-norm of the interpolation error; for these examples, the L2, L∞, and H1 norms of the error also decreased.
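A minimal sketch of this optimize-by-local-changes loop, assuming a 2D triangle mesh stored as node coordinates plus index triples and a user-supplied objective function (the function names and the coordinate-search node move are illustrative, not the paper's algorithm):

    import numpy as np

    # nodes: (n, 2) float array of coordinates; elems: list of 3-int lists.

    def smooth_node(nodes, elems, objective, i, step=1e-2):
        """Coordinate-search move of node i: keep any small perturbation
        that reduces the mesh objective (a stand-in for whatever node
        movement the paper actually uses)."""
        best = objective(nodes, elems)
        for d in ([step, 0.0], [-step, 0.0], [0.0, step], [0.0, -step]):
            trial = nodes.copy()
            trial[i] += d
            val = objective(trial, elems)
            if val < best:
                nodes[:], best = trial, val
        return best

    def swap_edge(elems, ta, tb):
        """Flip the diagonal shared by triangles ta and tb: (p,q,a) and
        (p,q,b) become (a,b,p) and (a,b,q). Validity checks (element
        inversion, boundary edges) are omitted for brevity."""
        shared = set(elems[ta]) & set(elems[tb])
        if len(shared) != 2:
            return None
        p, q = shared
        a = (set(elems[ta]) - shared).pop()
        b = (set(elems[tb]) - shared).pop()
        new = [list(e) for e in elems]
        new[ta], new[tb] = [a, b, p], [a, b, q]
        return new

    def optimize(nodes, elems, objective, sweeps=5):
        """Alternate node movement and edge swapping, accepting only
        changes that decrease the objective."""
        for _ in range(sweeps):
            for i in range(len(nodes)):
                smooth_node(nodes, elems, objective, i)
            for ta in range(len(elems)):
                for tb in range(ta + 1, len(elems)):
                    cand = swap_edge(elems, ta, tb)
                    if cand and objective(nodes, cand) < objective(nodes, elems):
                        elems = cand
        return nodes, elems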
A Brain-Emulating Cognition and Control Architecture (BECCA) is presented. It is consistent with the hypothesized functions of pervasive intra-cortical and cortico-subcortical neural circuits. It is able to reproduce many salient aspects of human voluntary movement and motor learning. It also provides plausible mechanisms for many phenomena described in cognitive psychology, including perception and mental modeling. Both "inputs" (afferent channels) and "outputs" (efferent channels) are treated as neural signals; they are all binary (either on or off) and there is no meaning, information, or tag associated with any of them. Although BECCA initially has no internal models, it learns complex interrelations between outputs and inputs through which it bootstraps a model of the system it is controlling and the outside world. BECCA uses two key algorithms to accomplish this: S-Learning and Context-Based Similarity (CBS).
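Since the abstract does not specify S-Learning or CBS in detail, the following toy loop only illustrates the bootstrap idea it describes: anonymous binary channels and a transition model tabulated purely from experience (the class and its tabular scheme are illustrative stand-ins, not BECCA's algorithms):

    import random
    from collections import defaultdict

    class BinaryAgent:
        """Toy stand-in for the bootstrap idea: all channels are anonymous
        binary signals, and the agent tabulates observed
        (state, action) -> next-state transitions to build an internal
        model from nothing. Not BECCA's actual S-Learning or CBS."""
        def __init__(self, n_out):
            self.n_out = n_out
            self.model = defaultdict(lambda: defaultdict(int))

        def act(self, inputs):
            # Explore: emit a random binary efferent vector.
            return tuple(random.randint(0, 1) for _ in range(self.n_out))

        def learn(self, inputs, outputs, next_inputs):
            # Tabulate the transition; counts bootstrap the model.
            self.model[(inputs, outputs)][next_inputs] += 1

        def predict(self, inputs, outputs):
            # Most frequently observed successor state, if any.
            seen = self.model.get((inputs, outputs))
            return max(seen, key=seen.get) if seen else None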
An experiment was conducted comparing the effectiveness of individual versus group electronic brainstorming in addressing real-world "wickedly difficult" challenges. Previous laboratory research has engaged small groups of students in answering questions irrelevant to an industrial setting. The current experiment extended this research to larger, real-world employee groups engaged in addressing organization-relevant challenges. Within the present experiment, the data demonstrated that individuals performed at least as well as groups in terms of number of ideas produced and significantly (p<.02) outperformed groups in terms of the quality of those ideas (as measured along the dimensions of originality, feasibility, and effectiveness).
Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS'08 - "Personalized Healthcare through Technology"
Cities without an early warning system of indwelling sensors can consider monitoring their networks manually, especially during times of heightened security levels. We consider the problem of calculating an optimal schedule for manual sampling in a municipal water network. Preliminary computations with a small-scale example indicate that during normal times, manual sampling can provide some benefit, but it is far inferior to an indwelling sensor network. However, given information that significantly constrains the nature of an imminent threat, manual sampling can perform as well as a small sensor network designed to handle normal threats.
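A sketch of one simple way to compute such a schedule, posed as greedy maximum coverage over precomputed detection sets (the data layout and the greedy heuristic are assumptions for illustration; the paper's optimal-scheduling formulation may differ):

    def greedy_schedule(detects, n_samples):
        """Greedy selection of (node, time) manual-sampling slots.
        detects[(node, t)] is the set of contamination scenarios that a
        grab sample at that node and time would detect (precomputed from
        hydraulic simulations in a real study). Greedy max-coverage is a
        simple stand-in for an exact optimization."""
        covered, schedule = set(), []
        for _ in range(n_samples):
            best = max(detects, key=lambda s: len(detects[s] - covered))
            if not detects[best] - covered:
                break  # no remaining slot detects anything new
            schedule.append(best)
            covered |= detects[best]
        return schedule, covered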
Partitioned global address space (PGAS) programming models have been identified as one of the few viable approaches for dealing with emerging many-core systems. These models tend to generate many small messages, a pattern that requires specific support from the network interface hardware for efficient execution. In the past, Cray included E-registers on the Cray T3E to support the SHMEM API; however, with the advent of multi-core processors, the balance of computation to communication capabilities has shifted toward computation. This paper explores the message rates that are achievable with multi-core processors and simplified PGAS support on a more conventional network interface. For message rate tests, we find that simple network interface hardware is more than sufficient. We also find that even typical data distributions, such as cyclic or block-cyclic, do not need specialized hardware support. Finally, we assess the impact of such support on the well-known RandomAccess benchmark.
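On the data-distribution point, the address arithmetic involved is indeed small; the owner and local offset of an element under a standard block-cyclic layout, for example, take only a few integer operations (a generic sketch of the usual layout, not the paper's code):

    def block_cyclic_owner(i, block, nprocs):
        """Map global index i to (owning rank, local index) under a
        block-cyclic distribution: blocks of the given size are dealt
        to ranks round-robin, as in typical PGAS array layouts.
        block=1 gives the purely cyclic distribution."""
        blk = i // block            # which block the element falls in
        rank = blk % nprocs         # blocks are dealt round-robin
        local_blk = blk // nprocs   # rank-local block count before it
        return rank, local_blk * block + (i % block)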
In this paper, we introduce EXACT, the EXperimental Algorithmics Computational Toolkit. EXACT is a software framework for describing, controlling, and analyzing computer experiments. It provides the experimentalist with convenient software tools to ease and organize the entire experimental process, including the description of factors and levels, the design of experiments, the control of experimental runs, and the archiving and analysis of results. As a case study for EXACT, we describe its interaction with FAST, the Sandia Framework for Agile Software Testing. EXACT and FAST now manage the nightly testing of several large software projects at Sandia. We also discuss EXACT's advanced features, which include a driver module that controls complex experiments such as comparisons of parallel algorithms.
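A hypothetical sketch of the kind of workflow such a toolkit manages: factors and levels declared up front, a full-factorial design enumerated, and each run launched and archived (every name below is illustrative; this is not EXACT's actual API):

    import csv, itertools, subprocess

    factors = {                    # factor name -> levels (illustrative)
        "solver": ["gmres", "cg"],
        "nprocs": [1, 2, 4],
        "mesh":   ["coarse", "fine"],
    }

    def run_experiment(cmd_template, outfile="results.csv"):
        """Enumerate the full-factorial design, launch one run per
        design point, and archive the outcomes, standing in for the
        factor/level, run-control, and archiving steps in the text."""
        names = list(factors)
        with open(outfile, "w", newline="") as f:
            w = csv.writer(f)
            w.writerow(names + ["returncode"])
            for levels in itertools.product(*(factors[n] for n in names)):
                cmd = cmd_template.format(**dict(zip(names, levels)))
                rc = subprocess.run(cmd, shell=True).returncode
                w.writerow(list(levels) + [rc])

    # e.g. run_experiment("./solver --method={solver} -np {nprocs} --mesh={mesh}")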
This paper presents the conceptual framework that is being used to define quantification of margins and uncertainties (QMU) for application in the nuclear weapons (NW) work conducted at Sandia National Laboratories. The conceptual framework addresses the margins and uncertainties throughout the NW life cycle and includes the definition of terms related to QMU and to figures of merit. Potential applications of QMU consist of analyses based on physical data and on modeling and simulation. Appendix A provides general guidelines for addressing cases in which significant and relevant physical data are available for QMU analysis. Appendix B gives the specific guidance that was used to conduct QMU analyses in cycle 12 of the annual assessment process. Appendix C offers general guidelines for addressing cases in which appropriate models are available for use in QMU analysis. Appendix D contains an example that highlights the consequences of different treatments of uncertainty in model-based QMU analyses.
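On the common reading of QMU (stated here as general background, not quoted from the report), the figure of merit compares a performance margin against the uncertainty in that margin, for example:

    % T: performance threshold; B: best-estimate response;
    % U: assessed uncertainty in the margin.
    M = \lvert T - B \rvert, \qquad \mathrm{CR} = \frac{M}{U}
    % A confidence ratio CR > 1 indicates margin in excess of uncertainty.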
This work describes the convergence analysis of a Smolyak-type sparse grid stochastic collocation method for the approximation of statistical quantities related to the solution of partial differential equations with random coefficients and forcing terms (input data of the model). To compute solution statistics, the sparse grid stochastic collocation method uses approximate solutions, produced here by finite elements, corresponding to a deterministic set of points in the random input space. This naturally requires solving uncoupled deterministic problems, and the derived strong error estimates for the fully discrete solution are used to compare the computational efficiency of the proposed method with that of the Monte Carlo method. Numerical examples illustrate the theoretical results and are used to compare this approach with several others, including standard Monte Carlo.
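The collocation idea is easiest to see in one random dimension (a minimal sketch: deterministic solves at quadrature points, combined by the quadrature weights into a statistic; the paper's Smolyak construction extends this to many dimensions with sparse tensor grids, and the toy solve below stands in for a finite element solution):

    import numpy as np

    def solve(y):
        """Stand-in for a deterministic PDE solve at random input y;
        here just a smooth scalar model u(y)."""
        return np.exp(-y) / (1.0 + y * y)

    # Collocation: evaluate the solver at Gauss-Legendre points for
    # Y ~ Uniform(-1, 1) and combine with the quadrature weights.
    pts, wts = np.polynomial.legendre.leggauss(9)
    e_colloc = 0.5 * np.dot(wts, [solve(y) for y in pts])

    # Monte Carlo: the same expectation from random samples; the error
    # decays like N**-0.5, versus fast decay in the number of
    # collocation points for smooth u.
    samples = np.random.uniform(-1.0, 1.0, 100000)
    e_mc = np.mean([solve(y) for y in samples])
    print(e_colloc, e_mc)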
Niobium doped Lead Zirconate Titanate (PZT) with a Zr/Ti ratio of 95/5 (i.e., PZT 95/5-2Nb) is a ferroelectric with a rhombohedral structure at room temperature. A crystal (or a subdomain within a crystal) exhibits a spontaneous polarization in any one of eight crystallographically equivalent directions. Such a material becomes polarized when subjected to a large electric field. When the electric field is removed, a remanent polarization remains and a bound charge is stored. A displacive phase transition from a rhombohedral ferroelectric phase to an orthorhombic anti-ferroelectric phase can be induced with the application of a mechanical load. When this occurs, the material becomes depoled and the bound charge is released. The polycrystalline character of PZT 95/5-2Nb leads to highly non-uniform fields at the grain scale. These local fields lead to very complex material behavior during mechanical depoling that has important implications for device design and performance. This paper presents a microstructurally based numerical model that describes the 3D non-linear behavior of ferroelectric ceramics. The model resolves the structure of polycrystals directly in the topology of the problem domain and uses the extended finite element method (X-FEM) to solve the governing equations of electromechanics. The material response is computed from anisotropic single crystal constants and the volume fractions of the various polarization variants (i.e., three variants for rhombohedral anti-ferroelectric and eight for rhombohedral ferroelectric ceramic). Evolution of the variant volume fractions is governed by the minimization of internally stored energy and accounts for ferroelectric and ferroelastic domain switching and phase transitions in response to the applied loads. The developed model is used to examine hydrostatic depoling in PZT 95/5-2Nb.
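A heavily simplified sketch of the variant bookkeeping: eight <111> polarization directions, with volume fractions chosen to lower the electrostatic energy -P·E (the soft minimization, the constants, and the neglect of elastic coupling, phase transitions, and switching kinetics are all simplifications relative to the paper's model):

    import numpy as np

    # Eight rhombohedral ferroelectric variants: spontaneous polarization
    # along the <111> cube diagonals.
    P0 = 0.35  # spontaneous polarization magnitude, C/m^2 (illustrative)
    variants = P0 / np.sqrt(3.0) * np.array(
        [[sx, sy, sz] for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)],
        dtype=float)

    def equilibrium_fractions(E, beta=50.0):
        """Pick variant volume fractions that lower the energy -P.E.
        A soft (Boltzmann-like) weighting stands in for the paper's
        minimization of internally stored energy."""
        energy = -variants @ np.asarray(E, dtype=float)
        w = np.exp(-beta * (energy - energy.min()))
        f = w / w.sum()
        return f, variants.T @ f   # fractions and net polarization

    # A field along +z favors the four variants with a +z component;
    # their x and y contributions cancel, leaving P along +z.
    f, P = equilibrium_fractions([0.0, 0.0, 2.0e6])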
Many current and future modeling applications at Sandia, including ASC milestones, will critically depend on the simultaneous solution of vastly different physical phenomena. Issues due to code coupling are often not addressed, understood, or even recognized. The objectives of this LDRD have been both theoretical and code development. We provide a fundamental analysis of coupling, i.e., of when strong coupling versus a successive substitution strategy is needed. We have enabled the implementation of tighter coupling strategies through additions to the NOX and Sierra code suites, making these strategies available now, and we have leveraged existing functionality to do so. Specifically, we have built into NOX the capability to handle fully coupled simulations from multiple codes, as well as Jacobian-free Newton-Krylov simulations that link multiple applications. We show how this capability may be accessed from within the Sierra Framework as well as from outside of Sierra. The critical impact of this LDRD is that we have shown how to enable, and have delivered strategies for, strong Newton-based coupling while respecting the modularity of existing codes. This will facilitate the use of these codes in a coupled manner to solve multiphysics applications.
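The Jacobian-free Newton-Krylov strategy mentioned above can be sketched compactly: the Krylov solver needs only Jacobian-vector products, which are approximated by differencing the coupled residual, so each participating code is touched only through residual evaluations (a minimal sketch using SciPy's GMRES, not the NOX/Sierra implementation):

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def jfnk(residual, u, tol=1e-8, max_newton=20):
        """Newton iteration in which J(u)v is approximated matrix-free by
        (F(u + eps*v) - F(u)) / eps, so the linear solver needs only
        residual evaluations of the coupled system."""
        for _ in range(max_newton):
            F = residual(u)
            if np.linalg.norm(F) < tol:
                break
            eps = 1e-7 * (1.0 + np.linalg.norm(u))
            J = LinearOperator((u.size, u.size), dtype=float,
                               matvec=lambda v: (residual(u + eps * v) - F) / eps)
            du, _ = gmres(J, -F)
            u = u + du
        return u

    # Toy coupled system: two "codes", each contributing one residual
    # component that depends on the other's state.
    def coupled_residual(u):
        x, y = u
        return np.array([x + 0.5 * np.sin(y) - 1.0,
                         y + 0.5 * np.cos(x) - 1.0])

    print(jfnk(coupled_residual, np.zeros(2)))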
Numerous terrorist organizations have openly expressed interest in producing and deploying biological weapons. However, a limiting factor for many terrorists has been the acquisition of dangerous biological agents, as evidenced by the very few successful instances of biological weapons use compared to the number of documented hoaxes. Biological agents vary greatly in their ability to cause loss of life and economic damage. Some agents, if released properly, can kill many people and cause an extensive number of secondary infections; other agents will sicken only a small number of people for a short period of time. Consequently, several biological agents can potentially be used to perpetrate a bioterrorism attack, but few are likely capable of causing a high-consequence event. It is crucial, from a US national security perspective, to understand more deeply the likelihood that terrorist organizations can acquire the range of these agents. Few studies have attempted to comprehensively compile the technical information directly relevant to the acquisition of dangerous bacteria, viruses, and toxins. In this report, technical fact sheets were assembled for 46 potentially dangerous biological agents. Much of the information was drawn from various research sources; compiled in one place, it could significantly expedite and improve bioterrorism threat assessments. By systematically examining a number of the specific agent characteristics included in these fact sheets, it may be possible to detect and target future terrorist acquisition attempts and to implement measures to thwart them. In addition, the information in these fact sheets may help laboratories gain a rudimentary understanding of how attractive laboratory theft is as an acquisition mode relative to other potential modes.