Publications

10 Results

Search results

Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

Adams, Brian M.; Jakeman, John D.; Swiler, Laura P.; Stephens, John A.; Vigil, Dena; Wildey, Timothy; Bauman, Lara E.; Bohnhoff, William J.; Dalbey, Keith; Eddy, John P.; Ebeida, Mohamed; Eldred, Michael; Hough, Patricia D.; Hu, Kenneth

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.
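
To make the sampling-based UQ workflow Dakota coordinates concrete, here is a minimal sketch in plain Python/NumPy. This is not Dakota's input syntax or API; the simulator `black_box` and the input ranges are hypothetical stand-ins for an expensive simulation code coupled to a sampling method.

```python
import numpy as np

def black_box(temperature, pressure):
    """Hypothetical stand-in for an expensive simulation code."""
    return temperature * np.exp(-0.01 * pressure) + 0.1 * pressure

def latin_hypercube(n_samples, n_dims, rng):
    """Basic Latin hypercube sample on the unit hypercube: one jittered
    point per equal-probability stratum, shuffled independently per
    dimension so strata pair up randomly across dimensions."""
    samples = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
        samples[:, d] = rng.permutation(strata)
    return samples

rng = np.random.default_rng(0)
unit = latin_hypercube(200, 2, rng)

# Map unit-hypercube samples to the (hypothetical) uncertain input ranges.
temperature = 300.0 + 50.0 * unit[:, 0]   # uniform on [300, 350]
pressure = 80.0 + 40.0 * unit[:, 1]       # uniform on [80, 120]

# Propagate the input uncertainty through the model and summarize the output.
response = black_box(temperature, pressure)
print(f"mean = {response.mean():.3f}, std = {response.std(ddof=1):.3f}")
print("5th/95th percentiles:", np.percentile(response, [5, 95]))
```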


Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 6.0 user's manual

Adams, Brian M.; Jakeman, John D.; Swiler, Laura P.; Stephens, John A.; Vigil, Dena; Wildey, Timothy; Bauman, Lara E.; Bohnhoff, William J.; Dalbey, Keith; Eddy, John P.; Ebeida, Mohamed; Eldred, Michael; Hough, Patricia D.; Hu, Kenneth

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
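
The abstract also names parameter estimation with nonlinear least squares among Dakota's capabilities. A minimal sketch of that idea in plain Python/SciPy follows; the exponential-decay model and the synthetic data are assumptions for illustration, not an example from the manual.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical two-parameter model: y = a * exp(-b * t).
def model(params, t):
    a, b = params
    return a * np.exp(-b * t)

# Synthetic "experimental" data: known parameters plus measurement noise.
rng = np.random.default_rng(1)
t_obs = np.linspace(0.0, 5.0, 30)
y_obs = model([2.0, 0.7], t_obs) + 0.05 * rng.standard_normal(t_obs.size)

# The residual vector (model minus data) drives the least-squares fit.
def residuals(params):
    return model(params, t_obs) - y_obs

fit = least_squares(residuals, x0=[1.0, 1.0])
print("estimated (a, b):", fit.x)
```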


New Methods of Uncertainty Quantification for Mixed Discrete-Continuous Variable Models

Bauman, Lara E.

The scale and complexity of problems such as designing power grids or planning for climate change are growing rapidly, driving the development of complicated computer models. More complex models have longer run times and incorporate larger numbers of inputs, both continuous and discrete. For example, a detailed physics model may have continuous variables such as temperature, height, or pressure along with discrete variables that indicate the choice of a material for a particular piece or the model to be used to calculate air flow. A power grid design model may have continuous variables such as generation capacity, power flow, or demand along with discrete variables such as the number of generators, the number of transmission lines, or binary variables indicating whether or not a node is chosen for generation expansion. A growing awareness of uncertainty and the desire to make risk-informed decisions are making uncertainty quantification (UQ) more routine and often required. UQ provides the underpinnings necessary to establish confidence in models and their use; therefore, much time and effort is being invested in creating efficient approaches for UQ. However, these efforts have been focused on models that take continuous variables as inputs. When discrete inputs enter the mix, the basic approach is to repeat the UQ analysis for each combination of discrete inputs, or some subset thereof; this rapidly becomes intractable. Because of the computational complexity inherent in mixed discrete-continuous models, researchers tend to focus on the uncertainty in their particular problem, finding ways to take advantage of symmetries, simplifications, or structure. For example, uncertainty propagation in certain dynamical systems can be carried out efficiently after various decomposition steps, and uncertainty propagation in stochastic programming is confined to scenario generation. Unfortunately, models are not always amenable to such manipulation: they may be embedded in legacy codes, may rely on commercial off-the-shelf codes, or may be created by stringing a series of codes together. It is also time-consuming to start each problem from scratch; worse, there may not be any simplifications or symmetries to exploit. For these situations, a UQ method that works on any black-box function is necessary. This report documents a new conceptual model for performing UQ on mixed discrete-continuous models that not only applies to any simulator function but also allows the use of the efficient UQ methods developed for continuous inputs. The conceptual model is presented, and an estimation procedure is fleshed out for one class of problems. The procedure is applied to variations of a mixed discrete-continuous optimization test problem, where it provides results comparable to a benchmark solution with fewer function evaluations.
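
To make the intractability concrete: the baseline the abstract describes reruns a full continuous-variable UQ study for every discrete configuration, so cost scales with the product of the discrete levels. The sketch below shows only that naive baseline (the simulator, input ranges, and discrete choices are hypothetical; the report's own estimation procedure is not reproduced here).

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical mixed-input simulator: continuous inputs (temperature,
# height) plus discrete choices (material index, air-flow model index).
def simulator(temperature, height, material, flow_model):
    gain = [1.0, 1.3, 0.8][material]
    offset = [0.0, 5.0][flow_model]
    return gain * temperature + 0.2 * height + offset

materials = [0, 1, 2]
flow_models = [0, 1]
n_continuous_samples = 1000

# Naive baseline: one full continuous-variable sampling study per discrete
# combination. Cost here is 3 * 2 * 1000 evaluations, and it grows
# multiplicatively with every additional discrete input.
total_evals = 0
for material, flow_model in itertools.product(materials, flow_models):
    temperature = rng.uniform(300.0, 350.0, n_continuous_samples)
    height = rng.uniform(0.0, 10.0, n_continuous_samples)
    out = simulator(temperature, height, material, flow_model)
    total_evals += n_continuous_samples
    print(f"material={material} flow_model={flow_model} mean={out.mean():.2f}")
print("total evaluations:", total_evals)
```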


A Robust Approach to QMU, Validation, and Conservative Prediction

Segalman, Daniel J.; Bauman, Lara E.

A systematic approach is developed for defining margin in a manner that incorporates statistical information and accommodates data uncertainty, without requiring assumptions about the specific forms of distribution tails. The approach extends to the calculations underlying validation assessment and quantitatively conservative predictions.
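
One widely used distribution-free ingredient for margin statements of this kind is a nonparametric (Wilks-type) tolerance bound built from order statistics. The sketch below illustrates that general technique under synthetic data; it is not necessarily the specific approach developed in the report.

```python
import numpy as np

def wilks_sample_size(coverage, confidence):
    """Smallest n such that the sample maximum of n i.i.d. draws exceeds
    the `coverage` quantile with probability >= `confidence`
    (first-order one-sided Wilks criterion: 1 - coverage**n >= confidence)."""
    return int(np.ceil(np.log(1.0 - confidence) / np.log(coverage)))

n = wilks_sample_size(coverage=0.95, confidence=0.95)  # classic 95/95 -> 59
print("samples required for 95/95:", n)

# Synthetic response data standing in for simulation or test outputs.
rng = np.random.default_rng(3)
response = rng.lognormal(mean=0.0, sigma=0.5, size=n)

# The sample maximum is then a 95%-coverage / 95%-confidence upper bound
# on the response, with no assumption about the shape of its tail.
print("95/95 upper tolerance bound:", response.max())
```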
