Publications

10 Results

Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

Adams, Brian M.; Jakeman, John D.; Swiler, Laura P.; Stephens, John A.; Vigil, Dena V.; Wildey, Timothy M.; Bauman, Lara E.; Bohnhoff, William J.; Dalbey, Keith D.; Eddy, John P.; Ebeida, Mohamed S.; Eldred, Michael S.; Hough, Patricia D.; Hu, Kenneth H.

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
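The capability areas named in the abstract correspond to the keyword blocks of a Dakota study input file. The following is a minimal illustrative sketch (not taken from the report) of a sampling-based uncertainty quantification study; the driver name 'simulator.sh', the variable descriptors, and all numeric values are assumptions for the example, not Dakota defaults:

```
# Hypothetical Dakota input: LHS sampling over two uncertain inputs
environment
  tabular_data                      # write samples/responses to a tabular file

method
  sampling                          # UQ via sampling (one of the method classes above)
    sample_type lhs                 # Latin hypercube sampling
    samples = 100
    seed = 12345                    # fixed seed for reproducibility

variables
  uniform_uncertain = 2
    lower_bounds  0.0  0.0
    upper_bounds  1.0  1.0
    descriptors   'x1' 'x2'

interface
  fork                              # launch the simulation as a separate process
    analysis_drivers = 'simulator.sh'   # user-supplied script wrapping the simulation code

responses
  response_functions = 1            # one scalar quantity of interest
  no_gradients
  no_hessians
```

Swapping the `method` block (e.g., to a gradient-based optimizer with a matching `variables` specification) reuses the same interface and responses, which is the object-oriented separation the abstract describes.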


Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

Adams, Brian M.; Jakeman, John D.; Swiler, Laura P.; Stephens, John A.; Vigil, Dena V.; Wildey, Timothy M.; Bauman, Lara E.; Bohnhoff, William J.; Dalbey, Keith D.; Eddy, John P.; Ebeida, Mohamed S.; Eldred, Michael S.; Hough, Patricia D.; Hu, Kenneth H.

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.


ALEGRA Update: Modernization and Resilience Progress

Robinson, Allen C.; Petney, Sharon P.; Drake, Richard R.; Weirs, Vincent G.; Adams, Brian M.; Vigil, Dena V.; Carpenter, John H.; Garasi, Christopher J.; Wong, Michael K.; Robbins, Joshua R.; Siefert, Christopher S.; Strack, Otto E.; Wills, Ann E.; Trucano, Timothy G.; Bochev, Pavel B.; Summers, Randall M.; Stewart, James R.; Ober, Curtis C.; Rider, William J.; Haill, Thomas A.; Lemke, Raymond W.; Cochrane, Kyle C.; Desjarlais, Michael P.; Love, Edward L.; Voth, Thomas E.; Mosso, Stewart J.; Niederhaus, John H.

Abstract not provided.

DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

Adams, Brian M.; Bohnhoff, William J.; Dalbey, Keith D.; Eddy, John P.; Eldred, Michael S.; Hough, Patricia D.; Lefantzi, Sophia L.; Swiler, Laura P.; Vigil, Dena V.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.


NEAMS Nuclear Waste Management IPSC : evaluation and selection of tools for the quality environment

Vigil, Dena V.; Edwards, Harold C.; Bouchard, Julie F.; Stubblefield, W.A.

The objective of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation Nuclear Waste Management Integrated Performance and Safety Codes (NEAMS Nuclear Waste Management IPSC) program element is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. This objective will be fulfilled by acquiring and developing M&S capabilities and by establishing a defensible level of confidence in them. The foundation for assessing that level of confidence is the rigor and results of verification, validation, and uncertainty quantification (V&V and UQ) activities. M&S capabilities are to be managed, verified, and validated within the NEAMS Nuclear Waste Management IPSC quality environment, and both the M&S capabilities and the supporting analysis workflow and simulation data management tools will be distributed to end users from this same quality environment. The same workflow and data management tools distributed to end users will also be used for V&V activities within the quality environment; this strategic decision reduces the number of tools to be supported and increases the quality of the distributed tools through their rigorous use in V&V. NEAMS Nuclear Waste Management IPSC V&V and UQ practices and evidence management goals are documented in the V&V Plan, which includes a description of the quality environment into which M&S capabilities are imported and in which V&V and UQ activities are managed. The first phase of implementing the V&V Plan is to deploy an initial quality environment through the acquisition and integration of a set of software tools. This report documents an evaluation of the needs, options, and tools selected for that quality environment.
