Search Results

Viewing 1 to 12 of 12
Journal Article

An Efficient Method to Calculate the Failure Rate of Dynamic Systems with Random Parameters Using the Total Probability Theorem

2015-04-14
2015-01-0425
Using the total probability theorem, we propose a method to calculate the failure rate of a linear vibratory system with random parameters excited by stationary Gaussian processes. The response of such a system is non-stationary because of the randomness of the input parameters. A space-filling design, such as optimal symmetric Latin hypercube sampling or maximin, is first used to sample the input parameter space. For each design point, the output process is stationary and Gaussian. We present two approaches to calculate the corresponding conditional probability of failure. A Kriging metamodel is then created between the input parameters and the output conditional probabilities allowing us to estimate the conditional probabilities for any set of input parameters. The total probability theorem is finally applied to calculate the time-dependent probability of failure and the failure rate of the dynamic system. The proposed method is demonstrated using a vibratory system.
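The total-probability step described above can be sketched numerically. This is a minimal illustration, not the paper's method: the parameter distribution, threshold, and the closed-form conditional probability are all assumed, and the Kriging metamodel is omitted because the conditional probability here is cheap to evaluate directly.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def conditional_pof(k):
    # Hypothetical stand-in for the conditional calculation: for a fixed
    # parameter value k the response is stationary Gaussian, so the
    # probability of exceeding a threshold is a closed-form tail integral.
    sigma = 1.0 / np.sqrt(k)          # assumed response std for parameter k
    threshold = 3.0
    return 1.0 - 0.5 * (1.0 + erf(threshold / (sigma * sqrt(2.0))))

# Total probability theorem as a Monte Carlo average of the conditional
# probabilities over the density of the random parameter.
k_samples = rng.normal(loc=1.0, scale=0.1, size=10000)
k_samples = k_samples[k_samples > 0]
pof = float(np.mean([conditional_pof(k) for k in k_samples]))
```

In the paper, each conditional probability would come from an expensive stationary-response analysis, which is why the Kriging surrogate over the sampled design points is needed.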
Technical Paper

Imprecise Reliability Assessment When the Type of the Probability Distribution of the Random Variables is Unknown

2009-04-20
2009-01-0199
In reliability design, data for constructing probabilistic models are often scarce. It is particularly challenging to model uncertainty in variables when the type of their probability distribution is unknown. Moreover, it is expensive to estimate the upper and lower bounds of the reliability of a system involving such variables. A method for modeling uncertainty by using Polynomial Chaos Expansion is presented. The method requires specifying bounds for statistical summaries such as the first four moments and credible intervals. A constrained optimization problem, in which the decision variables are the coefficients of the Polynomial Chaos Expansion approximation, is formulated and solved in order to estimate the minimum and maximum values of a system’s reliability. This problem is solved efficiently by employing a probabilistic re-analysis approach to approximate the system reliability as a function of the moments of the random variables.
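The idea of bounding reliability over PCE coefficients subject to moment bounds can be sketched with a toy one-dimensional Hermite expansion. Everything here is assumed for illustration: the two-term expansion, the moment bounds, the failure limit, and the brute-force grid search standing in for the constrained optimizer of the paper.

```python
import numpy as np

# Hermite PCE of a random variable X in a standard-normal germ xi:
#   X = a0 + a1*xi + a2*(xi**2 - 1)
# The first two moments follow directly from the coefficients:
#   E[X] = a0,  Var[X] = a1**2 + 2*a2**2

def reliability(a0, a1, a2, limit=3.0, n=20000, seed=0):
    # P(X < limit), estimated by sampling the germ.
    xi = np.random.default_rng(seed).standard_normal(n)
    x = a0 + a1 * xi + a2 * (xi**2 - 1)
    return np.mean(x < limit)

# Assumed bounds on the moments, standing in for scarce data.
mean_lo, mean_hi = -0.2, 0.2
var_lo, var_hi = 0.8, 1.2

# Brute-force search over coefficients satisfying the moment bounds;
# a gradient-based constrained optimizer would be used in practice.
best_lo, best_hi = 1.0, 0.0
for a0 in np.linspace(mean_lo, mean_hi, 5):
    for a1 in np.linspace(-1.2, 1.2, 13):
        for a2 in np.linspace(-0.5, 0.5, 11):
            var = a1**2 + 2 * a2**2
            if var_lo <= var <= var_hi:
                r = reliability(a0, a1, a2)
                best_lo, best_hi = min(best_lo, r), max(best_hi, r)
```

The interval [best_lo, best_hi] is the imprecise-reliability output: distributions with identical moment bounds but different shapes (different higher-order coefficients) give different reliabilities.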
Technical Paper

Reliability Estimation of Large-Scale Dynamic Systems by using Re-analysis and Tail Modeling

2009-04-20
2009-01-0200
Probabilistic studies can be prohibitively expensive because they require repeated finite element analyses of large models. Re-analysis methods have been proposed to estimate accurately the dynamic response of a structure after a baseline design has been modified, without recalculating the new response. Although these methods increase computational efficiency, they are still not efficient enough for probabilistic analysis of large-scale dynamic systems with low failure probabilities (less than or equal to 10^-3). This paper presents a methodology that uses deterministic and probabilistic re-analysis methods to generate sample points of the response. Subsequently, tail modeling is used to estimate the right tail of the response PDF and the probability of failure of a highly reliable system. The methodology is demonstrated on probabilistic vibration analysis of a realistic vehicle FE model.
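A peaks-over-threshold tail fit of the kind the abstract describes can be sketched as follows. The response samples, the 90% threshold, and the exponential tail model (a generalized Pareto with shape zero) are all assumptions for illustration; in the paper the samples come from re-analysis of the FE model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for response samples produced by the re-analysis step;
# an exponential distribution is used here so the true tail is known.
y = rng.exponential(scale=1.0, size=50000)

# Peaks-over-threshold tail model: keep the top 10% of samples and fit
# an exponential to the exceedances (a full generalized Pareto fit
# would be used for heavier or bounded tails).
u = np.quantile(y, 0.90)
exceed = y[y > u] - u
beta = exceed.mean()

# Extrapolated failure probability P(Y > y_crit) far beyond the data,
# where direct Monte Carlo would need millions of FE runs.
y_crit = 10.0
p_fail = 0.10 * np.exp(-(y_crit - u) / beta)
```

This is the payoff of tail modeling: a failure probability of order 10^-5 is estimated from only 5 x 10^4 samples, none of which need to reach the critical level.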
Journal Article

Efficient Re-Analysis Methodology for Probabilistic Vibration of Large-Scale Structures

2008-04-14
2008-01-0216
It is challenging to perform probabilistic analysis and design of large-scale structures because probabilistic analysis requires repeated finite element analyses of large models, and each analysis is expensive. This paper presents a methodology for probabilistic analysis and reliability-based design optimization of large-scale structures that consists of two re-analysis methods: one for estimating the deterministic vibratory response and another for estimating the probability of the response exceeding a certain level. The deterministic re-analysis method can efficiently analyze large-scale finite element models consisting of tens or hundreds of thousands of degrees of freedom and large numbers of design variables that vary over a wide range. The probabilistic re-analysis method calculates very efficiently the system reliability for many probability distributions of the design variables by performing a single Monte Carlo simulation.
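Reusing a single Monte Carlo simulation for many input distributions is commonly done with likelihood-ratio reweighting, sketched below. The limit state, the sampling distribution, and the design distribution are all hypothetical; the point is that `pof` can be re-evaluated for any new distribution without generating new samples.

```python
import numpy as np
from math import sqrt, pi

rng = np.random.default_rng(2)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

# One Monte Carlo simulation under a (wide) sampling distribution f0.
mu0, s0 = 0.0, 1.5
x = rng.normal(mu0, s0, size=200000)
g = 4.0 - x            # hypothetical limit state: failure when g < 0
fail = g < 0

# Probabilistic re-analysis: failure probability under a *different*
# design distribution f1 via likelihood-ratio weights, no new samples.
def pof(mu1, s1):
    w = normal_pdf(x, mu1, s1) / normal_pdf(x, mu0, s0)
    return np.mean(fail * w)

p = pof(0.0, 1.0)      # reliability re-analysis for a candidate design
```

Each call to `pof` costs only a reweighting pass over the stored samples, which is what makes a reliability-based design loop over many candidate distributions affordable.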
Technical Paper

An Efficient Re-Analysis Methodology for Vibration of Large-Scale Structures

2007-05-15
2007-01-2326
Finite element analysis is a well-established methodology in structural dynamics. However, optimization and/or probabilistic studies can be prohibitively expensive because they require repeated FE analyses of large models. Various re-analysis methods have been proposed in order to calculate efficiently the dynamic response of a structure after a baseline design has been modified, without recalculating the new response. The parametric reduced-order modeling (PROM) and the combined approximation (CA) methods are two re-analysis methods which can handle large model parameter changes in a relatively efficient manner. Although both methods are promising by themselves, they cannot handle large FE models with many DOF (e.g. 100,000) and a large number of design parameters (e.g. 50), which are common in practice. In this paper, the advantages and disadvantages of the PROM and CA methods are first discussed in detail.
Technical Paper

An Efficient Possibility-Based Design Optimization Method for a Combination of Interval and Random Variables

2007-04-16
2007-01-0553
Reliability-based design optimization accounts for variation. However, it assumes that statistical information is available in the form of fully defined probability distributions. This is not true for a variety of engineering problems where uncertainty is usually given in terms of interval ranges. In this case, interval analysis or possibility theory can be used instead of probability theory. This paper shows how possibility theory can be used in design and presents a computationally efficient sequential optimization algorithm. The algorithm handles problems with only uncertain (interval) design variables and parameters, or with a combination of random and uncertain ones. It consists of a sequence of cycles composed of a deterministic design optimization followed by a set of worst-case reliability evaluation loops. A crank-slider mechanism example demonstrates the accuracy and efficiency of the proposed sequential algorithm.
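The cycle structure (deterministic optimization, then a worst-case check that shifts the constraint for the next cycle) can be shown on a toy problem. The objective, constraint, and interval below are invented for illustration and are not the paper's crank-slider example.

```python
import numpy as np

def worst_g(d, p_interval):
    # Worst (smallest) margin of g(d, p) = p - d**2 over the interval on p;
    # for this monotonic g the worst case sits at an interval endpoint.
    return min(p - d**2 for p in p_interval)

p_interval = (4.0, 5.0)   # uncertain parameter known only as an interval
p_nom = 4.5               # nominal value used in the deterministic cycle
shift = 0.0
d = 0.0
for _ in range(20):
    # Deterministic cycle: maximize d subject to g(d, p_nom) - shift >= 0
    # (closed-form optimum for this toy problem).
    d = np.sqrt(p_nom - shift)
    # Worst-case evaluation loop: check the margin over the interval.
    g_worst = worst_g(d, p_interval)
    if g_worst >= -1e-9:
        break
    shift += -g_worst     # shift the constraint by the observed violation
```

After two cycles the design satisfies the constraint for every value in the interval, which is the convergence criterion the sequential scheme relies on.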
Technical Paper

System Reliability-Based Design using a Single-Loop Method

2007-04-16
2007-01-0555
An efficient approach for series system reliability-based design optimization (RBDO) is presented. The key idea is to apportion optimally the system reliability among the failure modes by considering the target values of the failure probabilities of the modes as design variables. Critical failure modes that contribute the most to the overall system reliability are identified. This paper proposes a computationally efficient system RBDO approach using a single-loop method where the searches for the optimum design and for the most probable failure points proceed simultaneously. Specifically, at each iteration the optimizer uses approximated most probable failure points from the previous iteration to search for the optimum. A second-order Ditlevsen upper bound is used for the joint failure probability of the failure modes. Also, an easy-to-implement active-set strategy is employed to improve algorithmic stability.
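The Ditlevsen second-order upper bound mentioned above has a simple closed form. The individual and pairwise joint failure probabilities below are assumed numbers; in the paper they would come from FORM at the most probable failure points.

```python
# Ditlevsen second-order upper bound for a series system:
#   P_sys <= P_1 + sum_{i>=2} ( P_i - max_{j<i} P_ij )
# where P_i are mode failure probabilities and P_ij joint probabilities.
P = [1e-3, 2e-3, 1.5e-3]                       # assumed mode probabilities
Pij = {(1, 0): 4e-4, (2, 0): 1e-4, (2, 1): 6e-4}  # assumed joint terms

upper = P[0]
for i in range(1, len(P)):
    upper += P[i] - max(Pij[(i, j)] for j in range(i))
```

The bound is never worse than the first-order sum of the mode probabilities, and it tightens as the pairwise joint probabilities (the mode correlations) grow.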
Technical Paper

Prediction of Tire-Snow Interaction Forces Using Metamodeling

2007-04-16
2007-01-1511
High-fidelity finite element (FE) tire-snow interaction models have the advantage of capturing the physics of the tire-snow system. They can be used to develop semi-analytical models for vehicle design as well as to design and interpret field test results. For off-terrain conditions, there is a high level of uncertainty inherent in the system. The FE models are computationally intensive even when uncertainties of the system are not taken into account. On the other hand, field tests of tire-snow interaction are very costly. In this paper, dynamic metamodels are established to interpret interaction forces from FE simulation and to predict those forces by using part of the FE data as training data and part as validation data. Two metamodels are built based upon the Kriging principle: one takes principal component analysis (PCA) into account and the other does not.
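The PCA variant can be sketched as follows: project the force time histories onto principal components and build a metamodel per component score. The synthetic "FE" histories, the single retained component, and the polynomial fit (standing in for Kriging) are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical training data: force time histories (rows) from FE runs
# at different slip values; a sinusoid whose amplitude grows with slip.
slips = np.linspace(0.05, 0.5, 20)
t = np.linspace(0, 1, 100)
Y = np.outer(slips, np.sin(2 * np.pi * 3 * t)) \
    + 0.01 * rng.standard_normal((20, 100))

# PCA of the histories: center, then SVD; a few components capture most
# of the variance, and a metamodel (Kriging in the paper, a linear fit
# here) is built for each component score as a function of slip.
mean = Y.mean(axis=0)
U, S, Vt = np.linalg.svd(Y - mean, full_matrices=False)
scores = U[:, :1] * S[:1]                      # leading score per run
coef = np.polyfit(slips, scores[:, 0], deg=1)  # score vs. slip

# Predict a full history at an unseen slip value.
s_new = 0.3
y_pred = mean + np.polyval(coef, s_new) * Vt[0]
```

Because each score is a scalar function of the inputs, the metamodeling problem collapses from predicting a 100-point history to predicting a handful of coefficients.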
Technical Paper

Design Optimization and Reliability Estimation with Incomplete Uncertainty Information

2006-04-03
2006-01-0962
Existing methods for design optimization under uncertainty assume that a high level of information is available, typically in the form of data. In reality, however, insufficient data prevent correct inference of probability distributions, membership functions, or interval ranges. In this article we use an engine design example to show that optimal design decisions and reliability estimates depend strongly on how uncertainty is characterized. We contrast the reliability-based optimal designs with those obtained using worst-case optimization, and ask how to obtain non-conservative designs with incomplete uncertainty information. We propose an answer to this question through the use of Bayesian statistics. We estimate the engine’s reliability based only on the available samples, and demonstrate that the accuracy of our estimates increases as more samples become available.
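A minimal Bayesian sketch of "reliability from limited samples" uses a conjugate Beta model for pass/fail data. The uniform prior and the sample counts below are assumptions for illustration, not the paper's engine data.

```python
# With a uniform Beta(1, 1) prior and s successes in n trials, the
# posterior on the reliability R is Beta(1 + s, 1 + n - s).

def posterior_mean(n, s):
    return (1 + s) / (2 + n)

def posterior_std(n, s):
    # Standard deviation of Beta(a, b): sqrt(a*b / ((a+b)^2 (a+b+1)))
    a, b = 1 + s, 1 + n - s
    return (a * b / ((a + b) ** 2 * (a + b + 1))) ** 0.5

few = posterior_mean(10, 9)       # estimate from 10 samples, 9 passes
many = posterior_mean(1000, 900)  # estimate from 1000 samples, 900 passes
```

The posterior standard deviation shrinks as n grows, which is the article's point that estimate accuracy improves as more samples become available, while small n yields an honestly wide (non-conservative but uncertain) estimate.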
Technical Paper

Simulation of Tire-Snow Interfacial Forces for a Range of Snow Densities with Uncertainty

2006-04-03
2006-01-0497
The objective of this paper is to assess the effect of snow density on tire-snow interaction in the presence of uncertainty. The snow-depth-dependent finite element analysis (FEA) and semi-analytical models we have developed recently can predict tire-snow interfacial forces at a given density under combined slip conditions. One drawback of these models is that they are applicable only to fresh, low-density snow, because a density-dependent snow model is not available. In reality, the snow density on the ground can vary from that of fresh snow to that of heavily compacted snow, which is similar to ice. Even for fresh snow on the ground, as a vehicle moves forward, the rear wheels experience higher snow densities than the front wheels. In addition, being a natural material, snow varies significantly in its physical properties even at the same density.
Technical Paper

Reliability Analysis Using Monte Carlo Simulation and Response Surface Methods

2004-03-08
2004-01-0431
An accurate and efficient Monte Carlo simulation (MCS) method is developed in this paper for limit-state-based reliability analysis, especially at the system level, by using a response surface approximation of the failure indicator function. The Moving Least Squares (MLS) method is used to construct the response surface of the indicator function, along with an Optimum Symmetric Latin Hypercube (OSLH) as the sampling technique. Similar to MCS, the proposed method can easily handle implicit, highly nonlinear limit-state functions, with variables of any statistical distribution and correlation. However, the efficiency of MCS can be greatly improved. The method appears to be particularly efficient for problems with multiple limit states and multiple design points. A mathematical example and a practical example are used to highlight the superior accuracy and efficiency of the proposed method over traditional reliability methods.
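The core loop (fit a Moving Least Squares surface to the failure indicator at design points, then run cheap Monte Carlo on the surface) can be sketched in a few lines. The limit state, the Gaussian weight with its bandwidth, and the random training design (standing in for the OSLH) are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical expensive limit state: failure when g < 0.
def g(x):
    return 3.0 - x[..., 0] ** 2 - x[..., 1]

# Training design; random uniform points stand in for the OSLH here.
X = rng.uniform(-3, 3, size=(200, 2))
F = (g(X) < 0).astype(float)     # failure indicator at the design points

def mls_predict(x, X, F, h=0.8):
    # Moving Least Squares: weighted linear fit, re-solved at each
    # prediction point with a Gaussian weight centered there.
    w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2 * h ** 2))
    B = np.hstack([np.ones((len(X), 1)), X])   # [1, x1, x2] basis
    A = B.T @ (w[:, None] * B)
    b = B.T @ (w * F)
    coef = np.linalg.solve(A, b)
    return coef @ np.concatenate(([1.0], x))

# Cheap Monte Carlo on the response surface instead of on g itself.
samples = rng.normal(0, 1, size=(2000, 2))
p_fail = np.mean([mls_predict(s, X, F) > 0.5 for s in samples])
```

Only 200 limit-state evaluations are spent; the 2000 Monte Carlo samples touch only the surrogate, which is where the claimed efficiency gain over plain MCS comes from.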
Technical Paper

Simulation-Based Reliability Analysis of Automotive Wind Noise Quality

2004-03-08
2004-01-0238
An efficient simulation-based method is proposed for the reliability analysis of a vehicle body-door subsystem with respect to an important quality issue -- wind noise. A nonlinear seal model is constructed for the automotive wind noise problem and the limit state function is evaluated using finite element analysis. Existing analytical as well as simulation-based methods are used to solve this problem. A multi-modal adaptive importance sampling method is then developed for reliability analysis at system level. It is demonstrated through this industrial application problem that the multi-modal adaptive importance sampling method is superior to existing methods in terms of efficiency and accuracy. The method can easily handle implicit limit-state functions, with variables of any statistical distributions.
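Multi-modal importance sampling can be sketched on a failure domain with two modes. The mixture centers are assumed known here; the adaptive step of the paper would locate them from exploratory samples, and the one-dimensional limit state is purely illustrative.

```python
import numpy as np
from math import sqrt, pi

rng = np.random.default_rng(4)

def phi(x, mu=0.0, s=1.0):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * sqrt(2 * pi))

# Failure domain with two modes: |x| > 3 under X ~ N(0, 1). A
# two-component Gaussian mixture centered near the two failure regions
# serves as the multi-modal importance sampling density.
n = 20000
comp = rng.integers(0, 2, size=n)
x = rng.normal(np.where(comp == 0, -3.0, 3.0), 1.0, size=n)

# Importance-sampling weights: target density over mixture density.
w = phi(x) / (0.5 * phi(x, -3.0) + 0.5 * phi(x, 3.0))
p = np.mean((np.abs(x) > 3.0) * w)
```

A single-mode sampler centered on one failure region would miss the other mode entirely and bias the estimate low, which is why the multi-modal mixture matters for system-level problems with several failure regions.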