Publications

Results 1–25 of 46

Search results

Dakota, A Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis: Version 6.12 User's Manual

Adams, Brian M.; Bohnhoff, William J.; Dalbey, Keith D.; Ebeida, Mohamed S.; Eddy, John P.; Eldred, Michael S.; Hooper, Russell H.; Hough, Patricia D.; Hu, Kenneth H.; Jakeman, John D.; Khalil, Mohammad K.; Maupin, Kathryn A.; Monschke, Jason A.; Ridgway, Elliott M.; Rushdi, Ahmad R.; Seidl, Daniel T.; Stephens, John A.; Swiler, Laura P.; Winokur, Justin W.

The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
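
The central abstraction the manual describes is an iterative method driving a black-box simulation through a loose, file-based interface. The sketch below is illustrative only: the file names, the driver, and the crude coordinate-search method are hypothetical stand-ins, not Dakota's actual input format or algorithms.

```python
import subprocess

def run_simulation(params):
    """Hypothetical black-box driver: write parameters to a file, run the
    simulation executable, and read back a scalar response. File names and
    the executable are illustrative, not Dakota's actual formats."""
    with open("params.in", "w") as f:
        for name, value in params.items():
            f.write(f"{name} = {value}\n")
    subprocess.run(["./my_simulation", "params.in", "results.out"], check=True)
    with open("results.out") as f:
        return float(f.read())

def coordinate_search(objective, start, step=0.1, iters=20):
    """A deliberately crude iterative method: it sees only the black-box
    objective, mirroring the method/model separation described above."""
    best, best_val = dict(start), objective(start)
    for _ in range(iters):
        for name in best:
            for delta in (step, -step):
                trial = dict(best)
                trial[name] += delta
                val = objective(trial)
                if val < best_val:
                    best, best_val = trial, val
    return best, best_val

# Usage (requires the hypothetical ./my_simulation executable):
# best, value = coordinate_search(run_simulation, {"x1": 0.5, "x2": 0.5})
```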

More Details

Stacking Fault Energy Based Alloy Screening for Hydrogen Compatibility

JOM. Journal of the Minerals, Metals & Materials Society

Gibbs, Paul J.; Hough, Patricia D.; Thurmer, Konrad T.; Somerday, Brian P.; San Marchi, Christopher W.; Zimmerman, Jonathan A.

The selection of austenitic stainless steels for hydrogen service is challenging since there are few intrinsic metrics that relate alloy composition to hydrogen degradation. One such metric, explored here, is intrinsic stacking fault energy. Stacking fault energy has an influence on the character and structure of dislocations and on the formation of secondary crystalline phases created during mechanical deformation in austenitic alloys. In this work, a data-driven model for the intrinsic stacking fault energy of common austenitic stainless steel alloys is applied to compare the relative degradation of tensile performance in the presence of hydrogen. A transition in the tensile reduction of area of both 300-series and manganese-stabilized stainless steels is observed at a calculated stacking fault energy of approximately 43 mJ m⁻², below which pronounced hydrogen degradation of tensile ductility is observed. The model is also applied to suggest alloying strategies for low-nickel austenitic stainless steels for hydrogen service. Lastly, through this investigation, we find that calculated intrinsic stacking fault energy is a high-throughput screening metric that enables the ranking of the performance of a diverse range of austenitic stainless steel compositions, as well as the identification of new alloys, with regard to hydrogen compatibility.
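
As a rough illustration of the screening workflow the abstract describes, the sketch below ranks candidate compositions by a calculated stacking fault energy and flags those below the reported threshold. The linear composition model and its coefficients are placeholders; only the approximately 43 mJ m⁻² threshold comes from the abstract, and the paper's actual data-driven model is not reproduced here.

```python
# Placeholder coefficients for a generic linear composition model,
# SFE = b0 + sum(b_i * wt%_i). These are NOT the fitted values from the
# paper; only the ~43 mJ/m^2 threshold comes from the abstract.
SFE_THRESHOLD = 43.0  # mJ/m^2

def stacking_fault_energy(composition, coeffs, intercept):
    """Calculated intrinsic SFE from alloy composition (wt%)."""
    return intercept + sum(coeffs[el] * composition.get(el, 0.0) for el in coeffs)

def screen_alloys(alloys, coeffs, intercept):
    """Rank alloys by calculated SFE; flag those below the threshold,
    where pronounced hydrogen degradation of ductility is expected."""
    ranked = sorted(((name, stacking_fault_energy(comp, coeffs, intercept))
                     for name, comp in alloys.items()),
                    key=lambda pair: pair[1], reverse=True)
    return [(name, sfe, sfe >= SFE_THRESHOLD) for name, sfe in ranked]

# Hypothetical inputs for illustration only.
coeffs = {"Ni": 2.0, "Cr": -1.0, "Mn": 0.5}
alloys = {"304L": {"Ni": 8.0, "Cr": 18.0, "Mn": 1.5},
          "316L": {"Ni": 12.0, "Cr": 17.0, "Mn": 1.5}}
for name, sfe, ok in screen_alloys(alloys, coeffs, intercept=30.0):
    print(name, round(sfe, 1), "compatible" if ok else "flagged")
```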

More Details

Final Review Memo from ATDM L2 Milestone Review Panel to ATDM L2 Milestone Team and Associated Management

Hough, Patricia D.; Barone, Matthew F.; Barrett, Richard F.; Mish, Kyran D.; Thornquist, Heidi K.

On Thursday, August 25, 2016, the ATDM L2 milestone review panel met with the milestone team to conduct a final assessment of the completeness and quality of the work performed. First and foremost, the panel would like to congratulate and commend the milestone team for a job well done. The team completed a significant body of high-quality work toward very ambitious goals. Additionally, their persistence in working through the technical challenges associated with evolving technology, the nontechnical challenges associated with integrating across multiple software development teams, and the many demands on their time speaks volumes about their commitment to delivering the best work possible to advance the ATDM program. The panel’s comments on the individual completion criteria appear in the last section of this memo.

More Details

ASME V&V challenge problem: Surrogate-based V&V

Journal of Verification, Validation and Uncertainty Quantification

Beghini, Lauren L.; Hough, Patricia D.

The process of verification and validation can be resource intensive. From the computational model perspective, the resource demand typically arises from long simulation run times on multiple cores coupled with the need to characterize and propagate uncertainties. In addition, predictive computations performed for safety and reliability analyses have similar resource requirements. For this reason, there is a tradeoff between the time required to complete the requisite studies and the fidelity or accuracy of the results that can be obtained. At a high level, our approach is cast within a validation hierarchy that provides a framework in which we perform sensitivity analysis, model calibration, model validation, and prediction. The evidence gathered as part of these activities is mapped into the Predictive Capability Maturity Model to assess credibility of the model used for the reliability predictions. With regard to specific technical aspects of our analysis, we employ surrogate-based methods, primarily based on polynomial chaos expansions and Gaussian processes, for model calibration, sensitivity analysis, and uncertainty quantification to reduce the number of simulations required. The goal is to tip the tradeoff balance toward improved accuracy without increasing computational demands.
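
A minimal sketch of the surrogate idea, assuming scikit-learn: fit a Gaussian process to a small number of expensive simulation runs, then perform the Monte Carlo sampling on the cheap surrogate instead of the simulator. The toy simulator and sample sizes below are illustrative, not the models or tools used in the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulation(x):
    """Stand-in for a long-running model evaluation (illustrative only)."""
    return np.sin(3 * x[0]) + 0.5 * x[1] ** 2

rng = np.random.default_rng(0)

# A small design of experiments: the only expensive evaluations we pay for.
X_train = rng.uniform(0.0, 1.0, size=(20, 2))
y_train = np.array([expensive_simulation(x) for x in X_train])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X_train, y_train)

# Uncertainty propagation on the cheap surrogate: thousands of samples
# that would be unaffordable on the simulator itself.
X_mc = rng.uniform(0.0, 1.0, size=(10_000, 2))
y_mc = gp.predict(X_mc)
print(f"surrogate-based mean = {y_mc.mean():.3f}, std = {y_mc.std():.3f}")
```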

More Details

Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

Adams, Brian M.; Jakeman, John D.; Swiler, Laura P.; Stephens, John A.; Vigil, Dena V.; Wildey, Timothy M.; Bauman, Lara E.; Bohnhoff, William J.; Dalbey, Keith D.; Eddy, John P.; Ebeida, Mohamed S.; Eldred, Michael S.; Hough, Patricia D.; Hu, Kenneth H.

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

More Details

Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 6.0 theory manual

Adams, Brian M.; Jakeman, John D.; Swiler, Laura P.; Stephens, John A.; Vigil, Dena V.; Wildey, Timothy M.; Bauman, Lara E.; Bohnhoff, William J.; Dalbey, Keith D.; Eddy, John P.; Ebeida, Mohamed S.; Eldred, Michael S.; Hough, Patricia D.; Hu, Kenneth H.

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.
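
As a concrete illustration of the stochastic expansion idea referenced above, the sketch below builds a one-dimensional regression-based polynomial chaos expansion in probabilists' Hermite polynomials and reads the output mean and variance directly off the coefficients. It is a textbook construction, not Dakota's implementation; the toy model and sample counts are arbitrary.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def model(z):
    """Stand-in for an expensive simulation with one standard-normal input."""
    return np.exp(0.3 * z) + 0.1 * z ** 2

rng = np.random.default_rng(1)
z = rng.standard_normal(200)          # training samples of the input
y = model(z)

degree = 6
V = hermevander(z, degree)            # probabilists' Hermite basis He_k(z)
coeffs, *_ = np.linalg.lstsq(V, y, rcond=None)

# Orthogonality of He_k under the standard normal gives the moments directly:
# E[He_j * He_k] = k! when j == k, else 0.
mean = coeffs[0]
variance = sum(coeffs[k] ** 2 * math.factorial(k) for k in range(1, degree + 1))
print(f"PCE mean = {mean:.4f}, variance = {variance:.4f}")
```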

More Details

Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

WIT Transactions on Modelling and Simulation

Scott, Sarah N.; Templeton, Jeremy A.; Ruthruff, Joseph R.; Hough, Patricia D.; Peterson, Jerrod P.

This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to sensitivity studies of mesh resolution and numerical parameters to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.
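
One standard way to turn mesh resolution studies into a quantified numerical-error bound is Richardson extrapolation with Roache's grid convergence index (GCI). The sketch below shows that calculation; it is a generic verification technique, and the example numbers are hypothetical, so it should not be read as the exact procedure or data from this paper.

```python
import math

def richardson_extrapolation(f_fine, f_med, f_coarse, r, safety=1.25):
    """Estimate discretization error from three systematically refined meshes
    (Roache's grid convergence index). A standard solution-verification
    technique; not necessarily the exact procedure used in the paper."""
    # Observed order of convergence from the three solutions.
    p = math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)
    # Richardson-extrapolated estimate of the mesh-converged value.
    f_exact = f_fine + (f_fine - f_med) / (r ** p - 1.0)
    # Grid convergence index: a banded relative-error estimate on the fine mesh.
    gci = safety * abs((f_fine - f_med) / f_fine) / (r ** p - 1.0)
    return p, f_exact, gci

# Example: a peak temperature computed on three meshes, each refined by r = 2
# (the values are made up for illustration).
p, f_exact, gci = richardson_extrapolation(412.4, 414.1, 419.6, r=2.0)
print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_exact:.1f}, GCI ~ {100*gci:.2f}%")
```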

More Details

Computational solution verification applied to a thermal model of a ruggedized instrumentation package

WIT Transactions on Modelling and Simulation

Scott, Sarah N.; Templeton, Jeremy A.; Ruthruff, Joseph R.; Hough, Patricia D.; Peterson, Jerrod P.

This paper details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification process includes solution verification, which examines the errors associated with the code's solution techniques. The model was subjected to sensitivity studies of mesh resolution and numerical parameters to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Similarly, highly automated software to vary model inputs was developed to assess the solution's sensitivity to numerical parameters. This process not only tests the robustness of the numerical parameters, but also allows robustness and numerical error to be balanced against computation time. Together, these studies bound the uncertainty due to numerical error for the model. An emphasis is placed on automating solution verification so that a rigorous treatment of uncertainty can be performed even within a tight design and development schedule.

More Details

Optimization of large-scale heterogeneous system-of-systems models

Gray, Genetha A.; Hart, William E.; Hough, Patricia D.; Parekh, Ojas D.; Phillips, Cynthia A.; Siirola, John D.; Swiler, Laura P.; Watson, Jean-Paul W.

Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system-of-systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.
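
The tradeoff analysis mentioned above is often posed as finding the nondominated (Pareto-optimal) designs among competing criteria. The sketch below is a generic Pareto filter over hypothetical (cost, risk) scores, not code from this project.

```python
def pareto_front(candidates):
    """Return the nondominated designs from (name, objectives) pairs, where
    every objective is minimized. A design is dominated if some other design
    is at least as good in every objective and different in at least one."""
    front = []
    for name, objs in candidates:
        dominated = any(
            all(o2 <= o1 for o1, o2 in zip(objs, other)) and other != objs
            for _, other in candidates
        )
        if not dominated:
            front.append((name, objs))
    return front

# Hypothetical infrastructure designs scored on (cost, risk).
designs = [("A", (3.0, 0.9)), ("B", (2.0, 1.4)), ("C", (4.0, 0.8)), ("D", (2.5, 1.5))]
print(pareto_front(designs))  # A, B, and C survive; D is dominated by B
```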

More Details

DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

Adams, Brian M.; Bohnhoff, William J.; Dalbey, Keith D.; Eddy, John P.; Eldred, Michael S.; Hough, Patricia D.; Lefantzi, Sophia L.; Swiler, Laura P.; Vigil, Dena V.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

More Details