Can Scientific Software Development Use the Outsourcing Model Successfully?
The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient- and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
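As a concrete illustration of the loose coupling described above, the following is a minimal sketch (not part of the Dakota distribution) of the black-box driver pattern Dakota relies on: Dakota writes a parameters file, invokes the user's driver, and reads responses back from a results file. The variable names x1 and x2 and the quadratic test response are hypothetical stand-ins for a real simulation code.

```python
#!/usr/bin/env python3
# Hypothetical analysis driver: parse "<value> <descriptor>" parameter lines,
# evaluate a stand-in response, and write the result for Dakota to read.
import sys

def read_parameters(path):
    """Collect '<value> <descriptor>' pairs from the parameters file."""
    params = {}
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if len(tokens) == 2:
                try:
                    params[tokens[1]] = float(tokens[0])
                except ValueError:
                    pass  # skip bookkeeping lines that are not value/tag pairs
    return params

def evaluate(params):
    """Stand-in 'simulation': a simple quadratic objective in x1 and x2."""
    return (params["x1"] - 1.0) ** 2 + (params["x2"] + 2.0) ** 2

if __name__ == "__main__":
    params_file, results_file = sys.argv[1], sys.argv[2]
    response = evaluate(read_parameters(params_file))
    with open(results_file, "w") as f:
        f.write(f"{response:.10e} obj_fn\n")  # one response value per line
```

In an actual study, the evaluate step would launch the simulation code and post-process its output; Dakota only sees the files exchanged at this boundary.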
Reverse engineering (RE) analysts struggle to address critical questions about the safety of binary code accurately and promptly, and their supporting program analysis tools are sometimes simply wrong. The analysis tools have to approximate in order to provide any information at all, but this means that they introduce uncertainty into their results, and those uncertainties chain from analysis to analysis. We hypothesize that exposing the sources, impacts, and control of uncertainty to human binary analysts will allow them to approach their hardest problems with high-powered analytic techniques that they know when to trust. Combining expertise in binary analysis algorithms, human cognition, uncertainty quantification, verification and validation, and visualization, we pursue research that should benefit binary software analysis efforts across the board. We find a strong analogy between RE and exploratory data analysis (EDA); we begin to characterize the sources and types of uncertainty found in practice in RE (both in the process and in supporting analyses); we explore a domain-specific focus on uncertainty in pointer analysis, showing that more precise models do help analysts answer small information flow questions faster and more accurately; and we test a general population with domain-general Sudoku problems, showing that adding "knobs" to an analysis does not significantly slow down performance. This document describes our explorations of uncertainty in binary analysis.
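To make the pointer-analysis point concrete, here is a toy sketch (ours, not the project's tooling) of a flow-insensitive points-to analysis over simple assignment facts; its deliberately coarse treatment of control flow shows how over-approximated points-to sets arise and then flow into any downstream information-flow question.

```python
# Toy flow-insensitive points-to analysis: statements are either
# ('addr', p, x) for p = &x, or ('copy', p, q) for p = q.
from collections import defaultdict

def flow_insensitive_points_to(stmts):
    """Iterate the assignment facts to a fixed point, ignoring control flow."""
    pts = defaultdict(set)
    changed = True
    while changed:
        changed = False
        for kind, dst, src in stmts:
            before = len(pts[dst])
            if kind == 'addr':
                pts[dst].add(src)
            else:  # 'copy'
                pts[dst] |= pts[src]
            changed |= len(pts[dst]) != before
    return pts

# Two mutually exclusive branches assign p from different sources; without
# control-flow precision the analysis must report both targets for p and q.
program = [('addr', 'p', 'secret'), ('addr', 'p', 'public'), ('copy', 'q', 'p')]
print(dict(flow_insensitive_points_to(program)))
# q "may point to" secret even if the program never allows it, so an
# information-flow query on q inherits that uncertainty.
```

A more precise (e.g., flow- or path-sensitive) model shrinks these sets; that precision difference is the kind of analysis "knob" our pointer-analysis study exposes to analysts.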
The TChem open-source software is a toolkit for computing thermodynamic properties, source terms, and source-term Jacobian matrices for chemical kinetic models that involve gas-phase and surface reactions.
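For readers unfamiliar with these quantities, the following sketch (a generic illustration, not TChem's API) computes the mass-action source term and its analytic Jacobian for a toy reversible reaction A <=> B, with hypothetical rate constants kf and kr, and checks the Jacobian against finite differences.

```python
# Mass-action kinetics for A <=> B: source term and its Jacobian.
import numpy as np

def source(c, kf=2.0, kr=0.5):
    """Net production rates for species [A, B]."""
    a, b = c
    rate = kf * a - kr * b          # net forward reaction rate
    return np.array([-rate, rate])

def source_jacobian(c, kf=2.0, kr=0.5):
    """Analytic Jacobian d(source)/d(concentration)."""
    return np.array([[-kf,  kr],
                     [ kf, -kr]])

# Consistency check of the analytic Jacobian against finite differences.
c, eps = np.array([1.0, 0.2]), 1e-6
fd = np.column_stack([(source(c + eps * e) - source(c)) / eps for e in np.eye(2)])
assert np.allclose(fd, source_jacobian(c), atol=1e-5)
```

Real kinetic models involve many species and reactions, temperature-dependent rate constants, and surface coverages, but the source term and its Jacobian play the same roles as in this two-species toy problem.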
Computational Particle Mechanics
The peridynamic theory of solid mechanics is applied to modeling the deformation and fracture of micrometer-sized particles made of organic crystalline material. A new peridynamic material model is proposed to reproduce the elastic–plastic response, creep, and fracture that are observed in experiments. The model is implemented in a three-dimensional, meshless Lagrangian simulation code. In the small deformation, elastic regime, the model agrees well with classical Hertzian contact analysis for a sphere compressed between rigid plates. Under higher load, material and geometrical nonlinearity is predicted, leading to fracture. The material parameters for the energetic material CL-20 are evaluated from nanoindentation test data on the cyclic compression and failure of micrometer-sized grains.
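For reference, the small-deformation benchmark mentioned above is the classical Hertz solution for an elastic sphere of radius $R$ indented to depth $\delta$ by a rigid flat, which predicts the contact force

$$F = \tfrac{4}{3}\, E^{*} \sqrt{R}\, \delta^{3/2}, \qquad E^{*} = \frac{E}{1-\nu^{2}},$$

where $E$ and $\nu$ are the sphere's Young's modulus and Poisson's ratio. The simulated force-displacement curve can be checked against this power law in the elastic regime (the symbols here are the standard Hertz quantities, not values taken from the report).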
This work, building on previous efforts, develops a suite of new graph neural network machine learning architectures that generate data-driven prolongators for use in Algebraic Multigrid (AMG). Algebraic Multigrid is a powerful and common technique for solving large, sparse linear systems. Its effectiveness is problem dependent and hinges on the choice of the prolongation operator, which interpolates coarse-mesh results onto the finer mesh. Previous work has used recent developments in graph neural networks to learn a prolongation operator from a given coefficient matrix. In this paper, we expand on that work by exploring architectural enhancements of graph neural networks. A new method for generating a training set is developed that aligns more closely with the test set. Asymptotic error reduction factors are compared on a test suite of 3-dimensional Poisson problems with varying degrees of element stretching. Results show modest improvements in the asymptotic error reduction factor over both commonly chosen baselines and learning methods from previous work.
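As a minimal sketch of the quantity being compared (not the paper's method or code), the following estimates the asymptotic error reduction factor of a two-grid cycle on a 1-D Poisson problem, with a simple piecewise-constant aggregation prolongator standing in for a learned one.

```python
# Two-grid cycle with weighted-Jacobi smoothing; the per-cycle error norm
# ratio on the homogeneous system approaches the asymptotic reduction factor.
import numpy as np

def poisson_1d(n):
    """1-D Poisson matrix with Dirichlet boundaries (stand-in for the 3-D tests)."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def piecewise_constant_prolongator(n):
    """Aggregation prolongator: each pair of fine nodes maps to one coarse node."""
    P = np.zeros((n, n // 2))
    for j in range(n // 2):
        P[2 * j, j] = P[2 * j + 1, j] = 1.0
    return P

def asymptotic_error_factor(A, P, omega=2.0 / 3.0, cycles=30):
    Ac = P.T @ A @ P                                   # Galerkin coarse operator
    Dinv = 1.0 / np.diag(A)
    e = np.random.default_rng(0).random(A.shape[0])    # error of A e = 0
    factor = np.nan
    for _ in range(cycles):
        e = e / np.linalg.norm(e)                      # renormalize each cycle
        e = e - omega * Dinv * (A @ e)                 # pre-smoothing
        e = e + P @ np.linalg.solve(Ac, P.T @ (-(A @ e)))  # coarse-grid correction
        e = e - omega * Dinv * (A @ e)                 # post-smoothing
        factor = np.linalg.norm(e)                     # ||e_new|| since ||e_old|| = 1
    return factor

A, P = poisson_1d(64), piecewise_constant_prolongator(64)
print(f"asymptotic error reduction factor ~ {asymptotic_error_factor(A, P):.3f}")
```

Swapping in a better prolongator (classical AMG interpolation or a learned operator) lowers this factor, which is exactly the metric used above to compare methods.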