
Type X and Y Errors and Data & Model Conditioning for Systematic Uncertainty in Model Calibration, Validation, and Extrapolation (SAE Technical Paper 2008-01-1368)

This paper introduces and develops the concept of “Type X” and “Type Y” errors in model validation and calibration, and their implications for extrapolative prediction. Type X error is non-detection of model bias because the bias is effectively hidden by the uncertainty in the experiments. Possible deleterious effects of Type X error can be avoided by mapping uncertainty into the model until it envelops the potential model bias, but this likely assigns a larger uncertainty than is needed to account for the actual bias (Type Y error). A philosophy of Best Estimate + Uncertainty modeling and prediction is probably best supported by making the conservative choice of guarding against Type X error while accepting the downside of incurring Type Y error. An associated methodology involving data- and model-conditioning is presented and tested on a simple but rich test problem. The methodology is shown to appropriately contend with model bias under conditions of systematic experimental input uncertainty in the test problem. The methodology effectively bounds the uncertain model bias and brings a correction into the model that extrapolates very well under a large variety of extrapolation conditions. The methodology has been straightforwardly applied to considerably more complex real problems where system response is likewise jointly monotonic in the input uncertainties. The methodology also allows for other types of systematic and random uncertainty in the experiments and model, as discussed herein.
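The Type X / Type Y trade-off described above can be illustrated with a minimal numerical sketch. The sketch below is ours, not the paper's actual methodology: the model form, bias magnitude, experimental uncertainty, and the simple "envelop the observed residuals" rule are all illustrative assumptions. It shows how a small model bias can hide inside experimental scatter (Type X) and how inflating the assigned model uncertainty to envelop every observed discrepancy guards against that at the cost of overstating the true bias (Type Y).

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup (illustrative only): a model whose unknown bias is
# small relative to the experimental uncertainty, so validation against
# the data cannot reliably expose the bias.
def model(x):
    return 2.0 * x              # assumed model form

true_bias = 0.1                 # unknown model bias (for illustration)
exp_sigma = 0.5                 # experimental uncertainty (1-sigma)

x = np.linspace(0.0, 1.0, 20)
data = model(x) + true_bias + rng.normal(0.0, exp_sigma, x.size)

residual = data - model(x)

# Type X situation: does each measurement agree with the model to within
# its 2-sigma experimental uncertainty?  When the bias is small compared
# to exp_sigma, most or all points will agree, and the bias may go
# undetected.
within_band = np.abs(residual) <= 2.0 * exp_sigma
print(f"{within_band.sum()}/{x.size} points consistent with the model")

# Conservative conditioning: assign the model an uncertainty wide enough
# to envelop every observed model/data discrepancy.  This guards against
# Type X error but generally assigns more uncertainty than the true bias
# warrants -- the Type Y error the paper accepts as the price of safety.
model_uncertainty = np.abs(residual).max()
print(f"assigned model uncertainty = {model_uncertainty:.3f} "
      f"vs. true bias = {true_bias}")
```

By construction the assigned uncertainty band bounds every observed discrepancy, so potential model bias cannot be silently hidden; the price is that the band is typically wider than the actual bias.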



