Publications


Parameterization of Large Variability Using the Hyper-Dual Meta-Model

Journal of Verification, Validation and Uncertainty Quantification

Bonney, Matthew B.; Kammer, Daniel C.

One major problem in the design of aerospace components is the nonlinear change in the response caused by changes in geometry and material properties. Many of these components have small nominal values, so any change can lead to large variability. To characterize this large variability, traditional methods require either many simulation runs or the calculation of many higher-order derivatives, and each path requires a large amount of computational power to evaluate the response curve. Performing uncertainty quantification analysis requires even more simulation runs. The hyper-dual meta-model is introduced to characterize the response curve using basis functions. The response information is generated by applying the hyper-dual step to determine the sensitivities at a small number of simulation runs, which greatly enriches the response space. This study shows the accuracy of the method for two different systems, with parameterizations at different stages in the design analysis.
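As background for the hyper-dual step mentioned in the abstract, the sketch below shows how hyper-dual arithmetic recovers exact first and second derivatives of a scalar response from a single evaluation. The HyperDual class and the test function f are illustrative assumptions for this page, not the paper's implementation.

# Minimal sketch of hyper-dual arithmetic (assumption: illustrative only).
# Evaluating f(x0 + eps1 + eps2) yields the exact first and second
# derivatives of f at x0 in a single pass.
import math


class HyperDual:
    """Number a + b*eps1 + c*eps2 + d*eps1*eps2 with eps1^2 = eps2^2 = 0."""

    def __init__(self, a, b=0.0, c=0.0, d=0.0):
        self.a, self.b, self.c, self.d = a, b, c, d

    def __add__(self, other):
        other = other if isinstance(other, HyperDual) else HyperDual(other)
        return HyperDual(self.a + other.a, self.b + other.b,
                         self.c + other.c, self.d + other.d)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, HyperDual) else HyperDual(other)
        return HyperDual(self.a * other.a,
                         self.a * other.b + self.b * other.a,
                         self.a * other.c + self.c * other.a,
                         self.a * other.d + self.b * other.c
                         + self.c * other.b + self.d * other.a)

    __rmul__ = __mul__

    def sin(self):
        s, co = math.sin(self.a), math.cos(self.a)
        return HyperDual(s, self.b * co, self.c * co,
                         self.d * co - self.b * self.c * s)


def f(x):
    # Hypothetical response: f(x) = x^2 + sin(x)
    return x * x + x.sin() if isinstance(x, HyperDual) else x * x + math.sin(x)


x0 = 1.3
out = f(HyperDual(x0, 1.0, 1.0, 0.0))  # perturb along both hyper-dual directions
print("f(x0)   =", out.a)              # function value
print("f'(x0)  =", out.b)              # exact first derivative (also in out.c)
print("f''(x0) =", out.d)              # exact second derivative

Because the epsilon terms are nilpotent, the sensitivities carry no truncation or subtractive-cancellation error, which is what lets a few simulation runs enrich the response space as described above.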


Determining Reduced Order Models for Optimal Stochastic Reduced Order Models

Bonney, Matthew B.; Brake, Matthew R.

The use of parameterized reduced order models (PROMs) within the stochastic reduced order model (SROM) framework is a logical progression for both methods. In this report, five parameterized reduced order models are selected and critiqued against one another and against the truth model for the example of the Brake-Reuss beam. The models are: a Taylor series using finite differences, a proper orthogonal decomposition of the output, a Craig-Bampton representation of the model, a method that uses Hyper-Dual numbers to determine the sensitivities, and a Meta-Model method that uses the Hyper-Dual results to construct a polynomial curve that better represents the output data. The methods are compared against a parameter sweep and a distribution propagation, with the first four statistical moments used for comparison. Each method produces very accurate results, with the Craig-Bampton reduction being the least accurate. The models are also compared on the time required for evaluation, where the Meta-Model requires significantly less computation time than the others. All five models provide accurate results in a reasonable time frame; the choice of model depends on the availability of the high-fidelity model and on how many evaluations can be performed. The output distribution is examined using a large Monte Carlo simulation along with reduced simulations using Latin Hypercube sampling and the stochastic reduced order model sampling technique. Both techniques produce accurate results, and the stochastic reduced order modeling technique produces less error relative to exhaustive sampling for the majority of methods.
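To illustrate the kind of sampling comparison described in the abstract, the sketch below propagates a toy input distribution through a simple response and compares the first four statistical moments from a large Monte Carlo sample against a much smaller Latin Hypercube sample. The response function and input distribution are assumptions chosen for illustration, not the Brake-Reuss beam models from the report.

# Illustrative sketch (assumptions: toy response and input distribution).
# Compares moments from plain Monte Carlo and a smaller Latin Hypercube sample.
import numpy as np
from scipy.stats import norm, qmc, skew, kurtosis


def response(x):
    # Hypothetical nonlinear response standing in for a reduced order model
    return x**2 + 0.1 * np.sin(5.0 * x)


rng = np.random.default_rng(0)
mu, sigma = 1.0, 0.05                      # input parameter distribution

# Exhaustive Monte Carlo reference
mc = response(rng.normal(mu, sigma, size=100_000))

# Smaller Latin Hypercube sample mapped through the inverse CDF
lhs = qmc.LatinHypercube(d=1, seed=0).random(n=200)
lh = response(norm.ppf(lhs[:, 0], loc=mu, scale=sigma))

for name, y in [("Monte Carlo", mc), ("Latin Hypercube", lh)]:
    print(f"{name:16s} mean={y.mean():.5f} std={y.std(ddof=1):.5f} "
          f"skew={skew(y):.4f} kurt={kurtosis(y):.4f}")

In this kind of check, the reduced sample's moments are judged against the exhaustive reference, which mirrors how the report compares the Latin Hypercube and SROM sampling results against a large Monte Carlo simulation.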
