We develop a method for constructing tolerance bounds for functional data with random warping variability. In particular, we define a generative, probabilistic model for the amplitude and phase components of such observations that parsimoniously characterizes variability in the baseline data. Based on the proposed model, we define two types of tolerance bounds that capture both sources of variability and, as a result, identify when data have gone beyond the bounds of amplitude and/or phase. The first type of functional tolerance bound is computed via a bootstrap procedure on the geometric spaces of amplitude and phase functions. The second utilizes functional principal component analysis to construct a tolerance factor. This work is motivated by two main applications: process control and disease monitoring. Statistical analysis and modeling of functional data in process control is important for determining when a production process has moved beyond a baseline. Similarly, in biomedical applications, doctors use long, approximately periodic signals (such as the electrocardiogram) to diagnose and monitor diseases, and in this context it is desirable to identify abnormalities in these signals. We additionally consider a simulated example to assess our approach and compare it to two existing methods.
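As a rough illustration of the bootstrap construction (not the geometric amplitude-phase version developed in the paper), the sketch below computes pointwise percentile-bootstrap tolerance bounds for a sample of functions that are assumed to be pre-aligned; the function name and the (T, n) data layout are our assumptions, not the paper's.

```python
import numpy as np

def bootstrap_tolerance_bounds(F, coverage=0.95, confidence=0.95, B=500, seed=None):
    """Pointwise percentile-bootstrap tolerance bounds (simplified sketch).

    F : (T, n) array of n functions sampled on a common grid of T points.
    Returns (lower, upper) bound functions, each of length T.
    """
    rng = np.random.default_rng(seed)
    T, n = F.shape
    a = (1.0 - coverage) / 2.0
    lo_reps = np.empty((B, T))
    hi_reps = np.empty((B, T))
    for b in range(B):
        idx = rng.integers(0, n, size=n)            # resample functions with replacement
        Fb = F[:, idx]
        lo_reps[b] = np.quantile(Fb, a, axis=1)     # lower coverage quantile per grid point
        hi_reps[b] = np.quantile(Fb, 1.0 - a, axis=1)
    # take confidence-level envelopes of the bootstrap quantile curves
    lower = np.quantile(lo_reps, 1.0 - confidence, axis=0)
    upper = np.quantile(hi_reps, confidence, axis=0)
    return lower, upper
```

The paper's bounds are instead built on the amplitude and phase components separately, so this scaffolding would be applied after decomposing the data rather than to the raw curves.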
Pyromark® 2500, manufactured by Tempil, is currently the industry standard for highly absorptive receiver coatings for concentrating solar power towers. However, Pyromark has been reported to degrade if not applied properly or if exposed to temperatures exceeding 700 °C over a period of time, and it is not apparent whether such degradation is due to a particular aspect or aspects of the deposition process, which may vary from plant to plant. Many variables factor into the performance of Pyromark, e.g., deposition method, drying time, curing parameters (ramp rate, homogeneous heating, time at temperature), and coating thickness. Identifying the factors with the most influence on coating performance and durability will help guide the application of Pyromark to receivers so as to minimize degradation over time. The relationships of coating quality and optical properties with deposition/curing parameters on Haynes 230 substrates were assessed using repeated-measures analysis of variance (ANOVA). These ANOVA techniques were designed to detect differences in treatment effects on the response at each of the aging cycles. The analyses found that coating thickness, curing ramp rate, and dwell time had the greatest effect on coating quality.
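A minimal sketch of a one-way repeated-measures ANOVA of this kind, using statsmodels' AnovaRM; the specimen/cycle/quality column names and the synthetic measurements are hypothetical stand-ins for the study's data, not the actual Pyromark results.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
# Hypothetical layout: each coated specimen is re-measured after each aging cycle.
specimens = np.repeat(np.arange(12), 4)              # 12 specimens (subjects)
cycles = np.tile(['c0', 'c1', 'c2', 'c3'], 12)       # 4 repeated aging cycles
quality = rng.normal(loc=0.95, scale=0.01, size=48)  # stand-in quality measurements

df = pd.DataFrame({'specimen': specimens, 'cycle': cycles, 'quality': quality})

# Repeated-measures ANOVA: does mean coating quality differ across aging cycles?
res = AnovaRM(df, depvar='quality', subject='specimen', within=['cycle']).fit()
print(res)
```

In the study itself the treatment factors (thickness, ramp rate, dwell time) would enter the design as well; this sketch only shows the repeated-measures machinery over aging cycles.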
We study regression using functional predictors in situations where these functions contain both phase and amplitude variability. In other words, the functions are misaligned due to errors in time measurements, and these errors can significantly degrade both model estimation and prediction performance. Current techniques either ignore the phase variability or handle it via preprocessing, that is, by using an off-the-shelf technique for functional alignment and phase removal. We develop a functional principal component regression model that handles phase and amplitude variability in a comprehensive manner. The model utilizes a mathematical representation of the data known as the square-root slope function. These functions preserve the L2 norm under warping and are ideally suited for simultaneous estimation of regression and warping parameters. Furthermore, using both simulated and real-world data sets, we demonstrate our approach and evaluate its prediction performance relative to current models. In addition, we propose an extension to functional logistic and multinomial logistic regression.
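The square-root slope function is the standard transform q(t) = sign(f'(t)) sqrt(|f'(t)|), under which warping by gamma acts as q -> (q o gamma) sqrt(gamma'), an L2 isometry. A small numerical sketch of this norm-preservation property (the warping gamma(t) = t^2 is an arbitrary example of ours):

```python
import numpy as np

def srsf(f, t):
    """Square-root slope function: q(t) = sign(f'(t)) * sqrt(|f'(t)|)."""
    fdot = np.gradient(f, t)
    return np.sign(fdot) * np.sqrt(np.abs(fdot))

t = np.linspace(0.0, 1.0, 1001)
f = np.sin(2 * np.pi * t)
gam = t ** 2                        # a warping function: increasing, gam(0)=0, gam(1)=1
f_warped = np.interp(gam, t, f)     # f composed with gam

q = srsf(f, t)
q_warped = srsf(f_warped, t)

dt = t[1] - t[0]
l2 = lambda g: np.sqrt(np.sum(g ** 2) * dt)
print(l2(q), l2(q_warped))          # the two L2 norms agree up to discretization error
```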
This paper examines the variability of predicted responses when multiple stress-strain curves (reflecting variability from replicate material tests) are propagated through a finite element model of a ductile steel can being slowly crushed. Over 140 response quantities of interest (including displacements, stresses, strains, and calculated measures of material damage) are tracked in the simulations. Each response quantity's behavior varies according to the particular stress-strain curves used for the materials in the model. We seek to estimate response variability when only a few stress-strain curve samples are available from material testing. Propagation of just a few samples will usually result in significantly underestimated response uncertainty relative to propagation of a much larger population that adequately samples the underlying random-function source. A simple classical statistical method, tolerance intervals, is tested for effectively treating sparse stress-strain curve data. The method is found to perform well on the highly nonlinear input-to-output response mappings and non-standard response distributions in the can-crush problem. The results and discussion in this paper support the proposition that the method will apply similarly well to other sparsely sampled random-variable or random-function data, whether from experiments or models. Finally, the simple tolerance interval method is also demonstrated to be very economical.
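The abstract does not state the exact interval construction, but a standard two-sided normal-theory tolerance factor (Howe's 1969 approximation), applied per response quantity, is a plausible minimal sketch of the kind of classical method described; the sample values below are made up for illustration.

```python
import numpy as np
from scipy.stats import norm, chi2

def howe_k2(n, coverage=0.95, confidence=0.95):
    """Approximate two-sided tolerance factor (Howe, 1969) for a normal sample.

    The interval mean +/- k*sd is claimed to contain at least `coverage`
    of the population with the stated `confidence`.
    """
    nu = n - 1
    z = norm.ppf((1.0 + coverage) / 2.0)
    chi2_lo = chi2.ppf(1.0 - confidence, nu)   # lower chi-square quantile
    return np.sqrt(nu * (1.0 + 1.0 / n) * z ** 2 / chi2_lo)

# Example: a response quantity observed under only n = 5 propagated samples
y = np.array([412.0, 405.3, 418.7, 409.9, 414.2])   # hypothetical response values
k = howe_k2(len(y))                                 # about 5.08 for n = 5, 95/95
lo, hi = y.mean() - k * y.std(ddof=1), y.mean() + k * y.std(ddof=1)
print(f"k = {k:.3f}, 95/95 tolerance interval: [{lo:.1f}, {hi:.1f}]")
```

The appeal noted in the paper is visible here: the factor k inflates sharply for small n, which compensates for the underestimated spread of a sparse sample.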
Inverse prediction is important in a variety of scientific and engineering applications, such as predicting the properties/characteristics of an object from multiple measurements obtained from it. Inverse prediction can be accomplished by inverting parameterized forward models that relate the measurements (responses) to the properties/characteristics of interest. Sometimes forward models are computational/science based, but often they are empirically based response surface models obtained from the results of controlled experimentation. For empirical models, it is important that the experiments provide a sound basis for developing accurate forward models in terms of the properties/characteristics (factors). While nature dictates the causal relationships between factors and responses, experimenters can control the complexity, accuracy, and precision of the forward models constructed via the selection of factors, factor levels, and the set of trials that are performed. Recognition of the uncertainty in the estimated forward models leads to an errors-in-variables approach for inverse prediction. The forward models (estimated by experiments or science based) can also be used to analyze how well candidate responses complement one another for inverse prediction over the range of the factor space of interest. One may find that some responses are complementary, redundant, or uninformative. Simple analyses and examples illustrate how an informative and discriminating subset of responses can be selected from among candidates in cases where the number of responses that can be acquired during inverse prediction is limited by difficulty, expense, and/or availability of material.
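A toy sketch of the basic inversion step: given fitted forward models for two candidate responses, choose the factor value that best explains new measurements, weighting each response by its residual precision. The coefficients, noise levels, and bounds here are hypothetical, and the sketch deliberately omits the paper's full errors-in-variables treatment, which would also propagate uncertainty in the estimated coefficients.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical linear forward models y_j = b0_j + b1_j * x, fitted from a
# designed experiment; invert them jointly for x given new measurements.
b0 = np.array([1.0, -0.5])       # fitted intercepts (two candidate responses)
b1 = np.array([2.0, 0.8])        # fitted slopes
sigma = np.array([0.1, 0.3])     # residual standard deviations from the fits

y_new = np.array([3.05, 0.31])   # new measurements on an object with unknown x

# Inverse prediction: minimize the standardized squared residuals, so the
# more precise response contributes more weight to the estimate of x.
obj = lambda x: np.sum(((y_new - (b0 + b1 * x)) / sigma) ** 2)
res = minimize_scalar(obj, bounds=(0.0, 5.0), method='bounded')
print("predicted x:", res.x)
```

Comparing how much each response tightens this objective around the optimum is one simple way to judge whether candidate responses are complementary, redundant, or uninformative.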
The outputs available in the xLPR Version 2.0 code can be analyzed using statistical techniques that have been developed to compare sampling schemes, identify inputs for importance sampling, and assess result convergence and uncertainty. These techniques were developed and piloted for both the xLPR Scenario Analysis (SA) Report and the xLPR Sensitivity Analysis Template. This document provides a walk-through of the post-processing R code that was used to generate the results and figures presented in those documents.