Search Results

Journal Article

A New Metamodeling Approach for Time-Dependent Reliability of Dynamic Systems with Random Parameters Excited by Input Random Processes

2014-04-01
2014-01-0717
We propose a new metamodeling method to characterize the output (response) random process of a dynamic system with random parameters, excited by input random processes. The metamodel can then be used to efficiently estimate the time-dependent reliability of the dynamic system using analytical or simulation-based methods. The metamodel is constructed by decomposing the input random processes using principal components or wavelets, and then using a few simulations to estimate the distributions of the decomposition coefficients. A similar decomposition is also performed on the output random process. A kriging model is then established between the input and output decomposition coefficients and subsequently used to quantify the output random process corresponding to a realization of the input random parameters and random processes. What distinguishes our approach from others in metamodeling is that the system input is not deterministic but random.
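
A minimal sketch of the decomposition-plus-kriging idea, assuming scikit-learn and a toy dynamic system; the synthetic process, sizes, and default kernel are illustrative stand-ins, not the paper's actual model:

```python
# Sketch: metamodel from input-process PCA scores to output-process PCA scores.
# PCA stands in for the paper's principal-component/wavelet decomposition step.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
n_train, n_time = 60, 200
t = np.linspace(0.0, 10.0, n_time)

# Synthetic input random process: random-amplitude, random-phase sine + noise.
U = np.stack([rng.normal(1.0, 0.2) * np.sin(t + rng.uniform(0, 2 * np.pi))
              + 0.1 * rng.standard_normal(n_time) for _ in range(n_train)])
# Toy "dynamic system": nonlinearly transformed running integral of the input.
Y = np.tanh(np.cumsum(U, axis=1) * (t[1] - t[0]))

# Decompose input and output processes; keep a few principal components.
pca_u, pca_y = PCA(n_components=5), PCA(n_components=5)
Zu = pca_u.fit_transform(U)          # input decomposition coefficients
Zy = pca_y.fit_transform(Y)          # output decomposition coefficients

# Kriging (Gaussian process) map from input to output coefficients.
gp = GaussianProcessRegressor().fit(Zu, Zy)

# Predict the output process for a new realization of the input process.
u_new = np.sin(t + 0.3)[None, :]
y_new = pca_y.inverse_transform(gp.predict(pca_u.transform(u_new)))
print(y_new.shape)  # (1, 200): reconstructed output trajectory
```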
Journal Article

Reliability and Cost Trade-Off Analysis of a Microgrid

2018-04-03
2018-01-0619
Optimizing the trade-off between reliability and cost of operating a microgrid, including vehicles as both loads and sources, can be a challenge. Optimal energy management is crucial for developing strategies that improve the efficiency and reliability of microgrids, as well as new communication networks to support optimal and reliable operation. Prior approaches modeled the grid in MATLAB but did not include the detailed physics of loads and sources, and therefore missed the transient effects present in real-time operation of a microgrid. This article discusses the implementation of a detailed, physics-based microgrid model including a diesel generator, wind turbine, photovoltaic array, and utility. All elements are modeled as sources in Simulink. Various loads are also implemented, including an asynchronous motor. We show how a central control algorithm optimizes the microgrid, maximizing reliability while reducing operational cost.
Journal Article

Reanalysis of Linear Dynamic Systems using Modified Combined Approximations with Frequency Shifts

2016-04-05
2016-01-1338
Weight reduction is very important in automotive design because of stringent fuel economy requirements. Structural optimization of dynamic systems using finite element (FE) analysis plays an important role in reducing weight while simultaneously delivering a product that meets all functional requirements for durability, crash, and NVH. With advancing computer technology, the demand for solving large FE models has grown. Optimization is, however, costly due to repeated full-order analyses. Reanalysis methods can be used in structural vibrations to reduce the cost of repeated eigenvalue analyses for both deterministic and probabilistic problems. Several reanalysis techniques have been introduced over the years, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Epsilon algorithm, among others.
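
A compact sketch of the Combined Approximations idea for static reanalysis (the paper's modal, frequency-shifted variant is more involved); random matrices stand in for FE stiffness data:

```python
# Sketch: Combined Approximations (CA) for static reanalysis of K u = f.
# Basis vectors follow the binomial series of (K0 + dK)^-1 f; the reduced
# system is then solved with the exact modified stiffness.
import numpy as np

rng = np.random.default_rng(1)
n, n_basis = 200, 6
A = rng.standard_normal((n, n))
K0 = A @ A.T + n * np.eye(n)             # baseline stiffness (SPD)
B = rng.standard_normal((n, n))
dK = 0.05 * (B + B.T)                    # symmetric stiffness modification
f = rng.standard_normal(n)

# Binomial-series basis: r1 = K0^-1 f, r_{k+1} = -K0^-1 dK r_k.
basis = [np.linalg.solve(K0, f)]
for _ in range(n_basis - 1):
    basis.append(-np.linalg.solve(K0, dK @ basis[-1]))
R, _ = np.linalg.qr(np.column_stack(basis))  # orthonormalize for stability

# Reduced system (R^T K R) y = R^T f with the exact modified K.
K = K0 + dK
y = np.linalg.solve(R.T @ K @ R, R.T @ f)
u_ca = R @ y

u_exact = np.linalg.solve(K, f)
print(np.linalg.norm(u_ca - u_exact) / np.linalg.norm(u_exact))  # small
```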
Journal Article

Efficient Re-Analysis Methodology for Probabilistic Vibration of Large-Scale Structures

2008-04-14
2008-01-0216
It is challenging to perform probabilistic analysis and design of large-scale structures because probabilistic analysis requires repeated finite element analyses of large models, and each analysis is expensive. This paper presents a methodology for probabilistic analysis and reliability-based design optimization of large-scale structures that consists of two re-analysis methods: one for estimating the deterministic vibratory response and another for estimating the probability of the response exceeding a certain level. The deterministic re-analysis method can efficiently analyze large-scale finite element models with tens or hundreds of thousands of degrees of freedom and large numbers of design variables that vary over a wide range. The probabilistic re-analysis method calculates the system reliability very efficiently for many probability distributions of the design variables by performing a single Monte Carlo simulation.
Journal Article

Probabilistic Reanalysis Using Monte Carlo Simulation

2008-04-14
2008-01-0215
An approach for Probabilistic Reanalysis (PRA) of a system is presented. PRA very efficiently calculates the system reliability or the average value of an attribute of a design for many probability distributions of the input variables by performing a single Monte Carlo simulation. In addition, PRA calculates the sensitivity derivatives of the reliability with respect to the parameters of the probability distributions. The approach is useful for analysis problems where reliability bounds need to be calculated because the probability distribution of the input variables is uncertain, or for design problems where the design variables are random. The accuracy and efficiency of PRA are demonstrated on vibration analysis of a car and on system reliability-based design optimization (RBDO) of an internal combustion engine.
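
The single-simulation trick behind this family of methods is sample re-weighting. A minimal sketch with a toy one-dimensional limit state (the sampling and candidate distributions are illustrative):

```python
# Sketch: probabilistic re-analysis by re-weighting one Monte Carlo run.
# Samples drawn once from a sampling PDF h are re-weighted by f_new / h to
# estimate the failure probability under many candidate input distributions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 200_000
h = stats.norm(loc=0.0, scale=1.5)        # wide sampling distribution
x = h.rvs(size=n, random_state=rng)
fail = x > 3.0                            # toy limit state: g(x) = 3 - x < 0

def reanalyzed_pf(mu, sigma):
    """Failure probability if X ~ N(mu, sigma), reusing the same samples."""
    w = stats.norm(mu, sigma).pdf(x) / h.pdf(x)
    return np.mean(fail * w)

for mu in (0.0, 0.5):  # compare against the exact tail probability
    print(mu, reanalyzed_pf(mu, 1.0), 1.0 - stats.norm(mu, 1.0).cdf(3.0))
```

Changing the input distribution only changes the weights, not the samples, which is why one simulation serves many distributions.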
Journal Article

Reliability Estimation for Multiple Failure Region Problems using Importance Sampling and Approximate Metamodels

2008-04-14
2008-01-0217
An efficient reliability estimation method is presented for engineering systems with multiple failure regions and potentially multiple most probable points. The method can handle implicit, nonlinear limit-state functions with correlated or uncorrelated random variables described by any probability distribution. It uses a combination of approximate, or “accurate-on-demand,” global and local metamodels that serve as indicators to determine the failure and safe regions. Samples close to the limit states define transition regions between the safe and failure domains. A clustering technique identifies all transition regions, which can in general be disjoint, and local metamodels of the actual limit states are generated for each transition region.
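
A hedged sketch of the clustering step, assuming scikit-learn; the toy limit state, band width, and cluster count are illustrative, not the paper's test problems:

```python
# Sketch: locate disjoint transition regions of a toy limit state and fit an
# "accurate-on-demand" local metamodel in each one.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(3)

def g(x):  # toy limit state with two disjoint failure regions (x1 near +/-4)
    return 4.0 - np.abs(x[:, 0]) - 0.2 * x[:, 1] ** 2

X = rng.uniform(-6.0, 6.0, size=(4000, 2))
gX = g(X)                                  # a global metamodel would be used here
band = np.abs(gX) < 0.5                    # samples in the transition band

# Cluster the band samples to identify the disjoint transition regions.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X[band])

# Fit a local metamodel of the limit state in each region.
for k in range(2):
    mask = km.labels_ == k
    local = GaussianProcessRegressor().fit(X[band][mask], gX[band][mask])
    center = km.cluster_centers_[k][None, :]
    print(f"region {k}: {mask.sum()} samples, g at centroid =",
          float(local.predict(center)[0]))
```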
Journal Article

A Nonparametric Bootstrap Approach to Variable-size Local-domain Design Optimization and Computer Model Validation

2012-04-16
2012-01-0226
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models over the entire design space can be costly, a recent approach performed design optimization and model validation concurrently, using a sequential approach with both fixed and variable-size local domains. The variable-size approach used parametric distributions such as the Gaussian to quantify the variability in test data and model predictions, and maximum likelihood estimation to calibrate the prediction model. A parametric bootstrap method was also used to size each local domain. In this article, we generalize the variable-size approach by not assuming any particular distribution, such as the Gaussian. A nonparametric bootstrap methodology is instead used to size the local domains. We expect this generality to be useful in applications where distributional assumptions are difficult to verify or are not met at all.
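
The core nonparametric ingredient is resampling the data itself. A minimal sketch (the data and statistic are illustrative; in the paper such intervals drive the sizing of each local domain):

```python
# Sketch: nonparametric bootstrap interval for a statistic, with no
# distributional assumption on the underlying test data.
import numpy as np

rng = np.random.default_rng(4)
test_data = rng.lognormal(mean=1.0, sigma=0.4, size=30)   # skewed "test data"

n_boot = 5000
boot_means = np.array([
    rng.choice(test_data, size=test_data.size, replace=True).mean()
    for _ in range(n_boot)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"95% bootstrap CI for the mean: [{lo:.3f}, {hi:.3f}]")
```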
Journal Article

Multi-Objective Decision Making under Uncertainty and Incomplete Knowledge of Designer Preferences

2011-04-12
2011-01-1080
Multi-attribute decision making and multi-objective optimization complement each other. Often, while making design decisions involving multiple attributes, a Pareto front is generated using a multi-objective optimizer, and the end user then chooses the optimal design from the Pareto front based on his or her preferences. This seemingly simple methodology requires significant modification if uncertainty is present. We explore two kinds of uncertainty in this paper: uncertainty in the decision variables, which we call inherent design problem (IDP) uncertainty, and uncertainty in our knowledge of the decision maker's preferences, which we refer to as preference assessment (PA) uncertainty. From a purely utility-theoretic perspective, a rational decision maker maximizes his or her expected multi-attribute utility.
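
A small sketch of expected-utility selection from a Pareto front under IDP-style uncertainty; the front, additive utility, weights, and noise levels are all illustrative assumptions:

```python
# Sketch: pick a Pareto design by expected multi-attribute utility when the
# realized attribute values are uncertain.
import numpy as np

rng = np.random.default_rng(5)
pareto = np.array([[1.0, 9.0], [3.0, 5.0], [5.0, 3.5], [8.0, 1.0]])  # (f1, f2)
f_min, f_range = pareto.min(axis=0), np.ptp(pareto, axis=0)

def utility(f, weights=(0.6, 0.4)):
    # Additive utility; attributes scaled to [0, 1] with smaller-is-better.
    u = 1.0 - (f - f_min) / f_range
    return u @ np.asarray(weights)

n_mc = 10_000
expected_u = []
for f in pareto:
    # Uncertainty in the realized attributes around each Pareto design.
    samples = f + rng.normal(0.0, [0.5, 0.8], size=(n_mc, 2))
    expected_u.append(utility(samples).mean())

print("expected utilities:", np.round(expected_u, 3))
print("preferred design:", int(np.argmax(expected_u)))
```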
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation when a suitable sampling distribution is used, in many design problems it is still too expensive. The authors have previously proposed a method for managing the computational cost of standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends that method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of choosing the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
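
To illustrate the value-of-information framing (not the authors' exact procedure), here is a sketch bounding what any simulation effort could be worth; the Beta beliefs and utility-equals-reliability assumption are illustrative:

```python
# Sketch: expected value of perfect information when choosing between two
# designs with uncertain reliabilities. In the paper, the likelihood that
# updates these beliefs comes from an Importance Sampling run.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
beliefs = [stats.beta(40, 2), stats.beta(45, 3)]   # priors on reliability

# Decision now: pick the design with the highest expected utility.
utility_now = max(b.mean() for b in beliefs)

# Preposterior bound: expected utility if simulation resolved all uncertainty.
draws = np.column_stack([b.rvs(100_000, random_state=rng) for b in beliefs])
utility_perfect = draws.max(axis=1).mean()

# Simulation information is worth at most this much utility.
print("expected value of perfect information:", utility_perfect - utility_now)
```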
Journal Article

Warranty Forecasting of Repairable Systems for Different Production Patterns

2017-03-28
2017-01-0209
Warranty forecasting of repairable systems is very important for manufacturers of mass-produced systems. The goal is to predict the Expected Number of Failures (ENF) after a censoring time using failure data collected before the censoring time. Moreover, systems may be produced with a defective component, resulting in extensive warranty costs even after the defective component is detected and replaced with a new design. In this paper, we present a forecasting method that predicts the ENF of a repairable system using observed data to calibrate a Generalized Renewal Process (GRP) model. Manufactured products may exhibit different production patterns with different failure statistics over time; for example, vehicles produced in different months may have different failure intensities because of supply chain differences or varying skills of production workers.
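
A minimal sketch of ENF estimation under a GRP with a Kijima Type I virtual age and a Weibull underlying distribution; the parameters are illustrative placeholders, not calibrated values:

```python
# Sketch: expected number of failures (ENF) of a repairable system simulated
# as a Generalized Renewal Process (Kijima Type I virtual age).
import numpy as np

rng = np.random.default_rng(6)
beta, eta, q = 1.8, 2.0, 0.4     # Weibull shape/scale (years), repair quality
horizon, n_sys = 5.0, 10_000     # forecast horizon and simulated systems

def failures_in(horizon):
    t = v = 0.0
    count = 0
    while True:
        u = rng.random()
        # Conditional Weibull inter-failure time given virtual age v
        # (inverse-transform sampling of the remaining-life distribution).
        x = eta * ((v / eta) ** beta - np.log(1.0 - u)) ** (1.0 / beta) - v
        t += x
        if t > horizon:
            return count
        count += 1
        v += q * x               # Kijima I: each repair removes part of the age

enf = np.mean([failures_in(horizon) for _ in range(n_sys)])
print(f"ENF over {horizon} years = {enf:.2f}")
```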
Journal Article

Efficient Probabilistic Reanalysis and Optimization of a Discrete Event System

2011-04-12
2011-01-1081
This paper presents a methodology to evaluate and optimize discrete event systems, such as an assembly line or a call center. First, the methodology estimates the performance of a system for a single probability distribution of the inputs. Probabilistic Reanalysis (PRRA) uses this information to evaluate the effect of changes in the system configuration on its performance. PRRA is integrated with a program to optimize the system. The proposed methodology is dramatically more efficient than one requiring a new Monte Carlo simulation each time we change the system. We demonstrate the approach on a drilling center and an electronic parts factory.
Technical Paper

An Efficient Possibility-Based Design Optimization Method for a Combination of Interval and Random Variables

2007-04-16
2007-01-0553
Reliability-based design optimization accounts for variation; however, it assumes that statistical information is available in the form of fully defined probability distributions. This is not true for a variety of engineering problems where uncertainty is given only in terms of interval ranges. In this case, interval analysis or possibility theory can be used instead of probability theory. This paper shows how possibility theory can be used in design and presents a computationally efficient sequential optimization algorithm. The algorithm handles problems with only uncertain design variables and parameters, or with a combination of random and uncertain ones. It consists of a sequence of cycles, each composed of a deterministic design optimization followed by a set of worst-case reliability evaluation loops. A crank-slider mechanism example demonstrates the accuracy and efficiency of the proposed sequential algorithm.
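
A sketch of the inner worst-case loop that alternates with the deterministic optimization; the constraint g(d, p) <= 0 and the interval bounds are illustrative, not the paper's crank-slider model:

```python
# Sketch: worst-case check of a constraint over interval-valued parameters,
# using a bounded optimizer to find the most violating point in the box.
import numpy as np
from scipy.optimize import minimize

def g(d, p):
    # Toy performance constraint: the design is feasible when g <= 0.
    return p[0] * d[0] ** 2 + p[1] * d[1] - 1.0

def worst_case_g(d, p_lo, p_hi):
    """Maximize g over the interval box [p_lo, p_hi]."""
    res = minimize(lambda p: -g(d, p),
                   x0=0.5 * (np.asarray(p_lo) + np.asarray(p_hi)),
                   bounds=list(zip(p_lo, p_hi)))
    return -res.fun

d = np.array([0.6, 0.5])                   # candidate deterministic design
wc = worst_case_g(d, p_lo=[0.8, 0.9], p_hi=[1.2, 1.1])
print("worst-case g:", wc, "feasible:", wc <= 0.0)
```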
Technical Paper

Prediction of Tire-Snow Interaction Forces Using Metamodeling

2007-04-16
2007-01-1511
High-fidelity finite element (FE) tire-snow interaction models have the advantage of better capturing the physics of the tire-snow system. They can be used to develop semi-analytical models for vehicle design, as well as to design and interpret field test results. For off-terrain conditions, there is a high level of uncertainty inherent in the system. The FE models are computationally intensive even when the uncertainties of the system are not taken into account; field tests of tire-snow interaction, on the other hand, are very costly. In this paper, dynamic metamodels are established to interpret the interaction forces from FE simulation and to predict those forces, using part of the FE data for training and part for validation. Two metamodels are built based on the Kriging principle: one takes principal component analysis (PCA) into account and the other does not.
Technical Paper

Imprecise Reliability Assessment When the Type of the Probability Distribution of the Random Variables is Unknown

2009-04-20
2009-01-0199
In reliability design, data for constructing probabilistic models is often scarce. It is particularly challenging to model uncertainty in variables when the type of their probability distribution is unknown, and it is expensive to estimate the upper and lower bounds of the reliability of a system involving such variables. A method for modeling uncertainty using Polynomial Chaos Expansion is presented. The method requires specifying bounds for statistical summaries such as the first four moments and credible intervals. A constrained optimization problem, in which the decision variables are the coefficients of the Polynomial Chaos Expansion approximation, is formulated and solved to estimate the minimum and maximum values of the system's reliability. This problem is solved efficiently by employing a probabilistic re-analysis approach to approximate the system reliability as a function of the moments of the random variables.
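
A small sketch of the moment bookkeeping for a one-dimensional Hermite PCE; the coefficients are illustrative, and a constrained optimizer would search over such coefficients subject to the moment bounds:

```python
# Sketch: closed-form mean and variance of a 1-D Polynomial Chaos Expansion
# Y = sum_i c_i He_i(xi), xi ~ N(0,1), in probabilists' Hermite polynomials,
# verified against Monte Carlo sampling.
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

c = np.array([1.0, 0.5, 0.2, 0.05])          # PCE coefficients c_0..c_3

mean = c[0]                                   # E[He_0] = 1, others vanish
var = sum(c[i] ** 2 * factorial(i) for i in range(1, len(c)))

# Monte Carlo check of the closed-form moments.
xi = np.random.default_rng(7).standard_normal(1_000_000)
y = He.hermeval(xi, c)
print(mean, y.mean())                         # approximately equal
print(var, y.var())                           # approximately equal
```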
Technical Paper

Modeling Dependence and Assessing the Effect of Uncertainty in Dependence in Probabilistic Analysis and Decision Under Uncertainty

2010-04-12
2010-01-0697
A complete probabilistic model of uncertainty in probabilistic analysis and design problems is the joint probability distribution of the random variables. Often, it is impractical to estimate this joint probability distribution because the mechanism of dependence among the variables is not completely understood. This paper proposes modeling dependence using copulas and demonstrates their representational power. It also compares this representation with a Monte Carlo simulation using dispersive sampling.
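
A minimal sketch of the copula idea using a Gaussian copula; the marginals (lognormal and Weibull) and the correlation level are illustrative choices:

```python
# Sketch: sampling dependent inputs with a Gaussian copula. Correlated
# standard normals are mapped to uniforms (the copula), then to arbitrary
# marginals via their inverse CDFs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
rho = 0.7                                     # copula correlation
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=100_000)
u = stats.norm.cdf(z)                         # dependent uniforms
x1 = stats.lognorm(s=0.3, scale=2.0).ppf(u[:, 0])
x2 = stats.weibull_min(c=1.5, scale=5.0).ppf(u[:, 1])

print(f"rank correlation: {stats.spearmanr(x1, x2)[0]:.3f}")
```

The marginals and the dependence structure are specified independently, which is exactly the representational power the paper exploits.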
Technical Paper

Reliability Analysis Using Monte Carlo Simulation and Response Surface Methods

2004-03-08
2004-01-0431
An accurate and efficient Monte Carlo simulation (MCS) method is developed in this paper for limit state-based reliability analysis, especially at the system level, using a response surface approximation of the failure indicator function. The Moving Least Squares (MLS) method is used to construct the response surface of the indicator function, with an Optimum Symmetric Latin Hypercube (OSLH) as the sampling technique. Like MCS, the proposed method can easily handle implicit, highly nonlinear limit-state functions with variables of any statistical distribution and correlation, but the efficiency of MCS is greatly improved. The method appears particularly efficient for problems with multiple limit states and multiple design points. A mathematical example and a practical example highlight the superior accuracy and efficiency of the proposed method over traditional reliability methods.
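
A one-dimensional sketch of the MLS machinery (the paper applies it to the failure indicator in higher dimensions); the toy data, linear basis, and bandwidth are illustrative:

```python
# Sketch: a Moving Least Squares (MLS) response surface, i.e. a locally
# weighted least-squares fit evaluated point by point.
import numpy as np

rng = np.random.default_rng(9)
xs = np.sort(rng.uniform(-3, 3, 40))          # training sites (e.g., from OSLH)
ys = np.tanh(2.0 * xs)                        # smoothed toy "indicator"

def mls_predict(x, h=0.6):
    """Locally weighted linear fit evaluated at x."""
    w = np.exp(-((xs - x) / h) ** 2)          # Gaussian weights centered at x
    A = np.column_stack([np.ones_like(xs), xs])
    AW = A * w[:, None]
    # Solve the weighted normal equations A^T W A beta = A^T W y.
    beta = np.linalg.solve(AW.T @ A, AW.T @ ys)
    return beta[0] + beta[1] * x

print(mls_predict(0.5), np.tanh(1.0))         # close for a reasonable bandwidth
```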
Technical Paper

Reliability and Resiliency Definitions for Smart Microgrids Based on Utility Theory

2017-03-28
2017-01-0205
Reliability and resiliency (R&R) definitions differ depending on the system under consideration. Generally, each engineering sector defines the R&R metrics pertinent to its systems. While this can impede cross-disciplinary engineering projects and research, it is a necessary strategy for capturing all the relevant system characteristics. This paper highlights the difficulties associated with defining the performance of such systems, using smart microgrids as an example. It then develops metrics and definitions, based on utility theory, that are useful in assessing their performance. A microgrid must not only anticipate load conditions but also tolerate partial failures and continue operating optimally. Many of these failures happen infrequently but unexpectedly, and are therefore hard to plan for. We discuss real-life failure scenarios and show how the proposed definitions and metrics are beneficial.
Technical Paper

Design Optimization and Reliability Estimation with Incomplete Uncertainty Information

2006-04-03
2006-01-0962
Existing methods for design optimization under uncertainty assume that a high level of information is available, typically in the form of data. In reality, however, insufficient data prevents correct inference of probability distributions, membership functions, or interval ranges. In this article we use an engine design example to show that optimal design decisions and reliability estimations depend strongly on uncertainty characterization. We contrast the reliability-based optimal designs to the ones obtained using worst-case optimization, and ask the question of how to obtain non-conservative designs with incomplete uncertainty information. We propose an answer to this question through the use of Bayesian statistics. We estimate the truck's engine reliability based only on available samples, and demonstrate that the accuracy of our estimates increases as more samples become available.
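
A minimal sketch of sample-based Bayesian reliability estimation with a conjugate prior; the counts and the uniform prior are illustrative, not the paper's engine data:

```python
# Sketch: Bayesian reliability estimate from limited pass/fail samples using
# a conjugate Beta prior; the interval tightens as more samples arrive.
from scipy import stats

successes, trials = 28, 30                    # available test samples
prior_a, prior_b = 1.0, 1.0                   # uniform Beta(1, 1) prior

posterior = stats.beta(prior_a + successes, prior_b + trials - successes)
lo, hi = posterior.ppf([0.05, 0.95])
print(f"posterior mean reliability: {posterior.mean():.3f}")
print(f"90% credible interval: [{lo:.3f}, {hi:.3f}]")
```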
Technical Paper

Sensitivity Study of Staircase Fatigue Tests Using Monte Carlo Simulation

2005-04-11
2005-01-0803
The staircase fatigue test method is a well-established but poorly understood probe for determining the mean and standard deviation of fatigue strength. The sensitivity of the results to the underlying distribution was studied using Monte Carlo simulation, by repeatedly sampling known distributions of hypothetical fatigue strength data with the staircase test method. In this paper, the effects of the underlying distribution on staircase test results are presented, with emphasis on normal, lognormal, Weibull, and bimodal parent data. The results indicate that the mean fatigue strength determined by the staircase testing protocol is largely unaffected by the underlying distribution, but the standard deviation is not. Suggestions for conducting staircase tests are provided.
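
A sketch of the simulation setup: repeated staircase (up-and-down) tests on a known strength distribution. The crude mean/std of the applied stresses stands in for the full Dixon-Mood estimators, and all parameters are illustrative:

```python
# Sketch: Monte Carlo study of the staircase fatigue test on a known normal
# strength distribution.
import numpy as np

rng = np.random.default_rng(10)
true_mean, true_std = 500.0, 20.0             # underlying fatigue strength, MPa
step, n_specimens, n_repeats = 20.0, 30, 2000

est_means, est_stds = [], []
for _ in range(n_repeats):
    stress, levels = 500.0, []
    for _ in range(n_specimens):
        levels.append(stress)
        failed = rng.normal(true_mean, true_std) < stress
        stress += -step if failed else step   # down after failure, up after runout
    est_means.append(np.mean(levels))
    est_stds.append(np.std(levels, ddof=1))

print("mean estimate:", np.mean(est_means))   # close to the true mean
print("std estimate:", np.mean(est_stds))     # noticeably biased vs. 20
```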
Technical Paper

A Design Optimization Method Using Possibility Theory

2005-04-11
2005-01-0343
Early in the engineering design cycle, it is difficult to quantify product reliability or compliance with performance targets due to insufficient data or information for modeling the uncertainties. Design decisions are therefore based on fuzzy information that is vague, imprecise, qualitative, linguistic, or incomplete. The uncertain information is usually available as intervals with lower and upper limits. In this work, possibility theory, which can be viewed as a variant of fuzzy set theory, is used to assess design reliability with incomplete information. A possibility-based design optimization method is proposed in which all design constraints are expressed possibilistically. It is shown that the method gives a conservative solution compared with conventional reliability-based designs obtained with different probability distributions.