Search Results

Technical Paper

A 3-D Joint Model for Automotive Structures

1992-06-01
921088
A simple, design-oriented model of joints in vehicle structures is developed. This model accounts for the flexibility of the joint, the offsets of the rotation centers of the joint branches from the geometric center, and the coupling between rotations of a joint branch in different planes. A family of joint models with different levels of complexity is also defined. A probabilistic system identification method is used to estimate the joint model parameters from measured displacements. Statistical tools that identify the important parameters are also presented. The identification methodology is applied to the estimation of the parameters of a B-pillar to rocker joint.
Technical Paper

Random Vibration Analysis Using Quasi-Random Bootstrapping

2018-04-03
2018-01-1104
Reliability analysis of engineering structures such as bridges, airplanes, and cars requires the calculation of small failure probabilities. These probabilities can be calculated using standard Monte Carlo simulation, but this method is impractical for most real-life systems because of its high computational cost. Many studies have focused on reducing the computational cost of a reliability assessment. These include bootstrapping, Separable Monte Carlo, Importance Sampling, and the Combined Approximations method. The computational cost can also be reduced using an efficient method for the deterministic analysis, such as mode superposition, mode acceleration, or the combined acceleration method. This paper presents and demonstrates a method that uses a combination of Sobol quasi-random sequences and bootstrapping to reduce the number of function calls. The study demonstrates that using quasi-random numbers in conjunction with bootstrapping dramatically reduces the computational cost.
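
The core of such a scheme can be sketched in a few lines. The limit-state function, distributions, and sample sizes below are hypothetical placeholders rather than the paper's test problems; failure is taken as g(x) < 0.

    # Sketch: failure-probability estimation with Sobol quasi-random samples,
    # plus bootstrapping of the failure indicators to quantify estimator spread.
    import numpy as np
    from scipy.stats import norm, qmc

    def g(x):                                     # hypothetical limit-state function
        return 6.0 - x[:, 0] - x[:, 1]

    n = 4096
    u = qmc.Sobol(d=2, scramble=True).random(n)   # quasi-random points in [0,1)^2
    x = norm.ppf(u, loc=2.0, scale=1.0)           # map to normal input variables
    fail = (g(x) < 0).astype(float)

    p_hat = fail.mean()
    boot = np.array([np.random.choice(fail, n, replace=True).mean()
                     for _ in range(500)])        # bootstrap replicates
    print(f"p_f ~ {p_hat:.4f}, bootstrap std ~ {boot.std():.4f}")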
Technical Paper

Reliability Estimation of Large-Scale Dynamic Systems by using Re-analysis and Tail Modeling

2009-04-20
2009-01-0200
Probabilistic studies can be prohibitively expensive because they require repeated finite element analyses of large models. Re-analysis methods have been proposed to estimate accurately the dynamic response of a structure after a baseline design has been modified, without recalculating the new response from scratch. Although these methods increase computational efficiency, they are still not efficient enough for probabilistic analysis of large-scale dynamic systems with low failure probabilities (less than or equal to 10^-3). This paper presents a methodology that uses deterministic and probabilistic re-analysis methods to generate sample points of the response. Subsequently, tail modeling is used to estimate the right tail of the response PDF and the probability of failure of a highly reliable system. The methodology is demonstrated on probabilistic vibration analysis of a realistic vehicle FE model.
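
A minimal sketch of the tail-modeling step, assuming a Generalized Pareto model for threshold exceedances (a common choice; the paper's specific tail model may differ) and synthetic responses standing in for re-analysis output:

    # Sketch: fit a Generalized Pareto distribution to threshold exceedances
    # of sampled responses and extrapolate a small failure probability.
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(0)
    y = rng.lognormal(mean=0.0, sigma=0.5, size=20_000)   # stand-in responses

    u = np.quantile(y, 0.95)                  # keep the top 5% as tail data
    c, _, scale = genpareto.fit(y[y > u] - u, floc=0.0)

    y_crit = 6.0                              # hypothetical failure level
    # P(Y > y_crit) = P(Y > u) * P(Y - u > y_crit - u | Y > u)
    p_f = 0.05 * genpareto.sf(y_crit - u, c, loc=0.0, scale=scale)
    print(f"estimated failure probability: {p_f:.2e}")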
Technical Paper

Imprecise Reliability Assessment When the Type of the Probability Distribution of the Random Variables is Unknown

2009-04-20
2009-01-0199
In reliability design, data for constructing probabilistic models is often scarce. It is particularly challenging to model uncertainty in variables when the type of their probability distribution is unknown. Moreover, it is expensive to estimate the upper and lower bounds of the reliability of a system involving such variables. A method for modeling uncertainty by using Polynomial Chaos Expansion is presented. The method requires specifying bounds for statistical summaries such as the first four moments and credible intervals. A constrained optimization problem, in which the decision variables are the coefficients of the Polynomial Chaos Expansion approximation, is formulated and solved in order to estimate the minimum and maximum values of a system's reliability. This problem is solved efficiently by employing a probabilistic re-analysis approach to approximate the system reliability as a function of the moments of the random variables.
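
A heavily simplified sketch of the bounding idea: the PCE coefficients are the decision variables, interval constraints are placed on the first two moments only (the paper also uses higher moments and credible intervals), and a normal approximation keeps the objective smooth. All numbers are hypothetical.

    # Sketch: bound the reliability when only moment intervals are known.
    # X = c0 + c1*He1(xi) + c2*He2(xi) with xi ~ N(0,1); for probabilists'
    # Hermite polynomials, mean(X) = c0 and var(X) = c1^2 + 2*c2^2.
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import minimize

    LIMIT = 10.0                              # hypothetical safe threshold

    def reliability(c):
        # normal approximation of P(X < LIMIT); a simplification that keeps
        # the objective smooth for the optimizer
        return norm.cdf((LIMIT - c[0]) / np.sqrt(c[1]**2 + 2.0*c[2]**2))

    # interval evidence: mean in [4, 5], variance in [1, 2]
    var = lambda c: c[1]**2 + 2.0*c[2]**2
    cons = [{'type': 'ineq', 'fun': lambda c: c[0] - 4.0},
            {'type': 'ineq', 'fun': lambda c: 5.0 - c[0]},
            {'type': 'ineq', 'fun': lambda c: var(c) - 1.0},
            {'type': 'ineq', 'fun': lambda c: 2.0 - var(c)}]

    x0 = np.array([4.5, 1.1, 0.2])
    r_min = minimize(reliability, x0, constraints=cons).fun
    r_max = -minimize(lambda c: -reliability(c), x0, constraints=cons).fun
    print(f"reliability bounds ~ [{r_min:.5f}, {r_max:.5f}]")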
Technical Paper

Managing the Computational Cost in a Monte Carlo Simulation by Considering the Value of Information

2012-04-16
2012-01-0915
Monte Carlo simulation is a popular tool for reliability assessment because of its robustness and ease of implementation. A major concern with this method is its computational cost: standard Monte Carlo simulation requires quadrupling the number of replications to halve the standard deviation of the estimated failure probability. Efforts to increase efficiency focus on intelligent sampling procedures and methods for efficient calculation of the performance function of a system. This paper proposes a new method to manage cost that views design as a decision among alternatives with uncertain reliabilities. Information from a simulation has value only if it enables the designer to make a better choice among the alternative options. Consequently, the value of information from the simulation is equal to the gain from using this information to improve the decision. With this method, a designer can determine the number of replications that are worth performing.
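
The quadrupling statement follows directly from the binomial standard error of the Monte Carlo estimator, as this small check illustrates:

    # The root-N arithmetic behind "quadruple the replications to halve
    # the standard deviation": std(p_hat) = sqrt(p * (1 - p) / N).
    import numpy as np

    p = 1e-3
    for n in (10_000, 40_000, 160_000):
        print(f"N = {n:7d}  std(p_hat) = {np.sqrt(p * (1 - p) / n):.2e}")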
Technical Paper

System Failure Identification using Linear Algebra: Application to Cost-Reliability Tradeoffs under Uncertain Preferences

2012-04-16
2012-01-0914
Reaching a system-level reliability target is an inverse problem: component-level reliabilities are determined for a required system-level reliability. Because this inverse problem does not have a unique solution, one approach is to trade off system reliability against cost and to allow the designer to select a design with a target system reliability according to his or her preferences. In this case, the component reliabilities are readily available from the calculation of the reliability-cost tradeoff. To arrive at the set of solutions to be traded off, one encounters two problems. First, the system reliability calculation is based on repeated system simulations in which each system state, indicating which components work and which have failed, is tested to determine whether it causes system failure. Second, eliciting and encoding the decision maker's preferences is extremely difficult because of uncertainty in modeling those preferences.
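
For a small system, the repeated state-testing step can be sketched by brute-force enumeration; the three-component series-parallel structure function below is a hypothetical example, not the paper's application:

    # Sketch: test every system state of a hypothetical 3-component system
    # (component 1 in series with the parallel pair 2, 3).
    import itertools
    import numpy as np

    p_fail = np.array([0.01, 0.05, 0.05])     # component failure probabilities

    def system_fails(state):                  # state[i] = 1 if component i works
        return state[0] == 0 or (state[1] == 0 and state[2] == 0)

    p_sys = 0.0
    for state in itertools.product((0, 1), repeat=3):
        prob = np.prod([p_fail[i] if s == 0 else 1.0 - p_fail[i]
                        for i, s in enumerate(state)])
        if system_fails(state):
            p_sys += prob
    print(f"system failure probability: {p_sys:.6f}")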
Technical Paper

Modeling Dependence and Assessing the Effect of Uncertainty in Dependence in Probabilistic Analysis and Decision Under Uncertainty

2010-04-12
2010-01-0697
A complete probabilistic model of uncertainty in probabilistic analysis and design problems is the joint probability distribution of the random variables. Often, it is impractical to estimate this joint probability distribution because the mechanism of the dependence of the variables is not completely understood. This paper proposes modeling dependence by using copulas and demonstrates their representational power. It also compares this representation with a Monte-Carlo simulation using dispersive sampling.
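
A minimal sketch of copula-based dependence modeling, using a Gaussian copula with hypothetical lognormal and Weibull marginals (the paper treats copulas generally and compares against dispersive sampling):

    # Sketch: sampling two dependent inputs with a Gaussian copula.
    # Dependence enters only through the copula correlation; the lognormal
    # and Weibull marginals are hypothetical choices.
    import numpy as np
    from scipy.stats import norm, lognorm, weibull_min

    rng = np.random.default_rng(2)
    rho = 0.7
    L = np.linalg.cholesky([[1.0, rho], [rho, 1.0]])
    z = rng.standard_normal((10_000, 2)) @ L.T   # correlated standard normals
    u = norm.cdf(z)                              # uniforms: the copula sample
    x1 = lognorm.ppf(u[:, 0], s=0.4)             # map to target marginals
    x2 = weibull_min.ppf(u[:, 1], c=2.0, scale=3.0)
    print(f"sample correlation: {np.corrcoef(x1, x2)[0, 1]:.3f}")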
Technical Paper

Reliability Analysis of Composite Inflatable Space Structures Considering Fracture Failure

2014-04-01
2014-01-0715
Inflatable space structures can have lower launch cost and larger habitat volume than their conventional rigid counterparts. These structures are made of composite laminates, and they are flexible when folded and partially inflated. They contain light-activated resins and can be cured by sunlight after being inflated in space. A spacecraft can burst due to cracks caused by meteoroids or debris. Therefore, it is critical to identify the important fracture failure modes and assess their probability. This information helps a designer minimize the risk of failure while keeping the mass and cost low. This paper presents a probabilistic approach for finding the required thickness of an inflatable habitat shell for a prescribed reliability level, and demonstrates the superiority of probabilistic design to its deterministic counterpart.
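
One way to cast such a probabilistic design search, sketched with an invented thin-shell hoop-stress model and made-up distributions (the paper's habitat model and fracture failure modes are more involved):

    # Sketch: bisect on shell thickness until a Monte Carlo estimate meets a
    # target failure probability. The cylindrical hoop-stress model
    # sigma = p*r/t and all distributions are invented placeholders.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    pressure = rng.normal(101e3, 5e3, n)      # internal pressure [Pa]
    strength = rng.normal(400e6, 40e6, n)     # laminate strength [Pa]
    radius, target = 2.0, 1e-3                # shell radius [m], target p_f

    def p_fail(t):
        return np.mean(pressure * radius / t > strength)

    lo, hi = 1e-4, 1e-2                       # thickness bracket [m]
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if p_fail(mid) > target else (lo, mid)
    print(f"required thickness ~ {hi * 1000:.2f} mm")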
Technical Paper

Combined Approximation for Efficient Reliability Analysis of Linear Dynamic Systems

2015-04-14
2015-01-0424
The Combined Approximation (CA) method is an efficient reanalysis method that aims at reducing the cost of optimization problems. The CA uses the results of a single exact analysis, and it is suitable for different types of structures and design variables. The second author previously utilized CA to calculate the frequency response function of a system at a frequency of interest by using the results at a nearby frequency, and showed that the CA yields accurate results for small frequency perturbations. This work demonstrates a methodology that utilizes CA to reduce the cost of Monte Carlo simulation (MCS) of linear systems under random dynamic loads. The main idea is to divide the power spectral density function (PSD) of the input load into several frequency bins before calculating the load realizations.
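
The CA reanalysis idea itself can be sketched on a small static system: a factorization of the baseline matrix is reused to build a reduced basis for the modified system. Matrix sizes and values below are arbitrary stand-ins for FE matrices.

    # Sketch: the CA basis-and-reduction step on a dense toy system. Random
    # SPD matrices stand in for FE (dynamic) stiffness matrices.
    import numpy as np

    rng = np.random.default_rng(4)
    n, m = 200, 4                             # DOFs, number of basis vectors
    A = rng.standard_normal((n, n))
    K0 = A @ A.T + n * np.eye(n)              # baseline matrix (SPD)
    B = rng.standard_normal((n, n))
    dK = 0.05 * (B + B.T)                     # design modification
    f = rng.standard_normal(n)

    K0_inv = np.linalg.inv(K0)                # stands in for a reused factorization
    r = [K0_inv @ f]                          # binomial-series basis vectors
    for _ in range(m - 1):
        r.append(-K0_inv @ (dK @ r[-1]))
    R = np.column_stack(r)

    y = np.linalg.solve(R.T @ (K0 + dK) @ R, R.T @ f)   # small reduced solve
    u_ca, u_ex = R @ y, np.linalg.solve(K0 + dK, f)
    print(f"relative error: {np.linalg.norm(u_ca - u_ex) / np.linalg.norm(u_ex):.2e}")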
Technical Paper

Multi-Level Decoupled Optimization of Wind Turbine Structures

2015-04-14
2015-01-0434
This paper proposes a multi-level decoupled method for optimizing the structural design of a wind turbine blade. The proposed method reduces the design space by employing a two-level optimization process. At the high-level, the structural properties of each section are approximated by an exponential function of the distance of that section from the blade root. High-level design variables are the coefficients of this approximating function. Target values for the structural properties of the blade are determined at that level. At the low-level, sections are divided into small decoupled groups. For each section, the low-level optimizer finds the thickness of laminate layers with a minimum mass, whose structural properties meet the targets determined by the high-level optimizer. In the proposed method, each low-level optimizer only considers a small number of design variables for a particular section, while traditional, single-level methods consider all design variables simultaneously.
Technical Paper

Inverse Modeling: Theory and Engineering Examples

2016-04-05
2016-01-0267
Over the last two decades, inverse problems have become increasingly popular due to their widespread applications. This popularity continually demands methods for solving inverse problems that are both accurate and computationally efficient. This paper presents a method for solving inverse problems through Artificial Neural Network (ANN) theory. The paper also presents a method to apply the Grey Wolf Optimizer (GWO) algorithm to inverse problems. GWO is a recent metaheuristic optimization method that has produced superior results on benchmark problems. Both methods are then compared to traditional methods such as Particle Swarm Optimization (PSO) and Markov Chain Monte Carlo (MCMC). Four typical engineering design problems are used to compare the four methods. The results show that the GWO outperforms the other methods in terms of both efficiency and accuracy.
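
A minimal GWO implementation on a toy inverse problem (recovering two line parameters from noisy data); the test function and tuning are hypothetical, not the paper's four design problems:

    # Sketch: Grey Wolf Optimizer. Wolves move toward the three best
    # solutions (alpha, beta, delta) with a shrinking exploration radius.
    import numpy as np

    rng = np.random.default_rng(5)
    t = np.linspace(0.0, 1.0, 50)
    data = 2.0 * t - 1.0 + 0.01 * rng.standard_normal(50)

    def loss(theta):                                  # least-squares misfit
        return np.sum((theta[0] * t + theta[1] - data) ** 2)

    n_wolves, dim, iters = 20, 2, 200
    X = rng.uniform(-5.0, 5.0, (n_wolves, dim))
    for it in range(iters):
        fit = np.array([loss(x) for x in X])
        alpha, beta, delta = X[np.argsort(fit)[:3]]   # three best wolves
        a = 2.0 * (1.0 - it / iters)                  # decreases from 2 to 0
        new_X = np.zeros_like(X)
        for leader in (alpha, beta, delta):
            A = a * (2.0 * rng.random((n_wolves, dim)) - 1.0)
            C = 2.0 * rng.random((n_wolves, dim))
            new_X += leader - A * np.abs(C * leader - X)
        X = new_X / 3.0                               # average of leader pulls
    print(f"recovered parameters ~ {X[np.argmin([loss(x) for x in X])]}")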
Technical Paper

An Efficient Re-Analysis Methodology for Vibration of Large-Scale Structures

2007-05-15
2007-01-2326
Finite element analysis is a well-established methodology in structural dynamics. However, optimization and/or probabilistic studies can be prohibitively expensive because they require repeated FE analyses of large models. Various reanalysis methods have been proposed in order to calculate efficiently the dynamic response of a structure after a baseline design has been modified, without recalculating the new response from scratch. The parametric reduced-order modeling (PROM) and combined approximation (CA) methods are two re-analysis methods which can handle large model parameter changes in a relatively efficient manner. Although both methods are promising by themselves, they cannot handle large FE models with many DOF (e.g., 100,000) and many design parameters (e.g., 50), which are common in practice. In this paper, the advantages and disadvantages of the PROM and CA methods are first discussed in detail.
Technical Paper

System Reliability-Based Design using a Single-Loop Method

2007-04-16
2007-01-0555
An efficient approach for series-system reliability-based design optimization (RBDO) is presented. The key idea is to apportion optimally the system reliability among the failure modes by considering the target values of the failure probabilities of the modes as design variables. Critical failure modes that contribute the most to the overall system reliability are identified. This paper proposes a computationally efficient, system RBDO approach using a single-loop method in which the searches for the optimum design and for the most probable failure points proceed simultaneously. Specifically, at each iteration the optimizer uses the approximated most probable failure points from the previous iteration to search for the optimum. A second-order Ditlevsen upper bound is used for the joint probability of failure of the failure modes. Also, an easy-to-implement active set strategy is employed to improve algorithmic stability.
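
The second-order Ditlevsen upper bound mentioned above can be computed from the mode reliability indices and the pairwise correlations of the linearized limit states; the numbers below are hypothetical:

    # Sketch: Ditlevsen second-order upper bound for a series system,
    # P_f <= P_1 + sum_i ( P_i - max_{j<i} P_ij ).
    import numpy as np
    from scipy.stats import norm, multivariate_normal

    beta = np.array([3.0, 3.2, 3.5])          # mode reliability indices
    rho = np.array([[1.0, 0.6, 0.3],
                    [0.6, 1.0, 0.4],
                    [0.3, 0.4, 1.0]])         # correlations of linearized modes
    P = norm.sf(beta)                         # individual mode probabilities

    def p_joint(i, j):                        # joint failure of modes i and j
        cov = [[1.0, rho[i, j]], [rho[i, j], 1.0]]
        return multivariate_normal.cdf([-beta[i], -beta[j]], cov=cov)

    upper = P[0] + sum(P[i] - max(p_joint(i, j) for j in range(i))
                       for i in range(1, len(beta)))
    print(f"Ditlevsen upper bound: {upper:.3e}")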
Technical Paper

Assessment of Imprecise Reliability Using Efficient Probabilistic Reanalysis

2007-04-16
2007-01-0552
In reliability design, data for constructing probabilistic models is often scarce. Probabilistic models whose parameters vary in known intervals can be more suitable than Bayesian models, because the former do not require making assumptions that are not supported by the available evidence. If we use models whose parameters vary in intervals, we need to calculate upper and lower bounds of the failure probability (or reliability) of a system in order to make design decisions. Monte Carlo simulation can be used for this purpose, but it is too expensive for all but very simple systems. This paper proposes an efficient Monte Carlo simulation approach for estimating these upper and lower probabilities. The approach is based on two ideas: (a) an efficient approach for reliability reanalysis of a system, which is introduced in this paper, and (b) approximating the probability distributions of the minimum and maximum failure probabilities using extreme value statistics.
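
The reliability reanalysis idea can be sketched as likelihood-ratio reweighting: limit-state evaluations from a single Monte Carlo run are reused to estimate the failure probability under other distribution parameters. The toy limit state below is hypothetical.

    # Sketch: reliability reanalysis by likelihood-ratio reweighting. The
    # indicator values from one simulation under a sampling density are
    # reused for any new parameter values of the input distribution.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(6)
    x = rng.normal(0.0, 1.5, 100_000)         # one-time sample from N(0, 1.5)
    fail = x > 4.0                            # expensive g(x) evaluated once

    def p_fail(mu, sigma):                    # reanalysis for new parameters
        w = norm.pdf(x, mu, sigma) / norm.pdf(x, 0.0, 1.5)
        return np.mean(fail * w)

    for mu in (0.0, 0.2, 0.4):                # sweep an interval-valued mean
        print(f"mu = {mu:.1f}: p_f ~ {p_fail(mu, 1.0):.2e}")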
Technical Paper

Targeted Testing for Reliability Validation

2004-03-08
2004-01-0239
Methods for designing targeted tests for reliability validation of structures obtained from reliability-based design are presented. These methods optimize the test parameters to minimize the variance of the reliability (or, equivalently, the failure probability) estimated from the tests. The tests are designed using information from the analytical models used to design the structure. Either analytical tests, in which very detailed models are used as a reference, or physical tests can be designed using the presented methods. The methods are demonstrated on examples, and their robustness to errors in the analytical models used to design the tests is assessed.
Technical Paper

A New Approach for System Reliability-Based Design Optimization

2005-04-11
2005-01-0348
An efficient approach for Reliability-Based Design Optimization (RBDO) of series systems is presented. A modified formulation of the RBDO problem is employed in which the required reliabilities of the failure modes of a system are design variables. This allows for an optimal apportionment of the reliability of a system among its failure modes. A sequential optimization and reliability assessment method is used to efficiently determine the optimum design. Here, the constraints on the reliabilities of the failure modes of the RBDO problem are replaced with deterministic constraints. The method is demonstrated on an example problem that has been solved in a previous study that did not treat the required reliability levels of the failure modes as design variables. The new approach finds designs with lower mass than designs found in the previous study without reducing their system reliability.
Technical Paper

Evidence Theory Approach and Bayesian Approach for Modeling Uncertainty when Information is Imprecise

2003-03-03
2003-01-0144
This paper investigates the potential of Evidence Theory (ET) and Bayesian Theory (BT) for decision under uncertainty, when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced and the ways these theories model uncertainties, propagate them through systems and assess the safety of these systems are presented. ET and BT approaches are demonstrated and compared on examples involving an algebraic function when the evidence about the input variables consists of intervals provided by experts. It is recommended that a decision maker compute both the Bayesian probability of events and their lower and upper probabilities using ET when evidence from experts is imprecise. A large gap between the lower and upper probability suggests that more information should be collected before making a decision. If this is not feasible, then Bayesian probabilities can help make a decision.
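
For interval-valued evidence, the lower and upper probabilities are the belief and plausibility of the event; a minimal sketch with hypothetical expert intervals and masses:

    # Sketch: belief (lower) and plausibility (upper) probabilities from
    # expert intervals with basic probability masses.
    intervals = [((2.0, 5.0), 0.5),           # (focal interval, mass)
                 ((4.0, 8.0), 0.3),
                 ((1.0, 9.0), 0.2)]
    a_lo, a_hi = 0.0, 6.0                     # event A: quantity lies in [0, 6]

    bel = sum(m for (lo, hi), m in intervals if a_lo <= lo and hi <= a_hi)
    pl = sum(m for (lo, hi), m in intervals if hi >= a_lo and lo <= a_hi)
    print(f"Bel(A) = {bel:.2f}, Pl(A) = {pl:.2f}")   # P(A) lies in [Bel, Pl]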
Journal Article

Time-Dependent Reliability of Random Dynamic Systems Using Time-Series Modeling and Importance Sampling

2011-04-12
2011-01-0728
Reliability is an important engineering requirement for consistently delivering acceptable product performance through time. As time progresses, the product may fail due to time-dependent operating conditions and material properties, component degradation, etc. The reliability degradation with time may increase the lifecycle cost due to potential warranty costs, repairs, and loss of market share. Reliability is the probability that the system will perform its intended function successfully for a specified time interval. In this work, we consider the first-passage reliability, which accounts for the first-time failure of non-repairable systems. Methods available in the literature provide an upper bound to the true reliability, which may overestimate the true value considerably. Monte Carlo simulations are accurate but computationally expensive.
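
The brute-force baseline the paper seeks to avoid can be sketched directly: simulate response histories and count trajectories that cross a threshold anywhere in the interval. The AR(1) series below is a hypothetical stand-in for the dynamic response.

    # Sketch: first-passage probability by Monte Carlo over simulated
    # response histories (unit-variance AR(1) time series).
    import numpy as np

    rng = np.random.default_rng(7)
    n_traj, n_steps, phi, thresh = 20_000, 500, 0.95, 3.5
    y = np.zeros((n_traj, n_steps))
    for k in range(1, n_steps):               # AR(1) recursion
        y[:, k] = phi * y[:, k-1] + np.sqrt(1 - phi**2) * rng.standard_normal(n_traj)

    p_fp = np.mean(np.any(y > thresh, axis=1))     # crossing anywhere in [0, T]
    p_inst = np.mean(y[:, -1] > thresh)            # point-in-time probability
    print(f"first-passage p_f: {p_fp:.4f} (instantaneous: {p_inst:.1e})")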
Journal Article

Probability of Failure of Dynamic Systems by Importance Sampling

2013-04-08
2013-01-0607
Estimation of the probability of failure of mechanical systems under random loads is computationally expensive, especially for very reliable systems with low probabilities of failure. Importance Sampling can be an efficient tool for static problems if a proper sampling distribution is selected. This paper presents a methodology to apply Importance Sampling to dynamic systems in which both the load and response are stochastic processes. The method is applicable to problems for which the input loads are stationary and Gaussian and are represented by power spectral density functions. Shinozuka's method is used to generate random time histories of excitation. The method is demonstrated on a linear quarter car model. This approach is more efficient than standard Monte Carlo simulation by several orders of magnitude.
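
Shinozuka's method superposes cosines with random phases, with amplitudes set by the PSD; a minimal sketch with a hypothetical band-limited PSD:

    # Sketch: Shinozuka's spectral representation, x(t) = sum over k of
    # sqrt(2*S(w_k)*dw) * cos(w_k*t + phi_k), with independent random
    # phases phi_k and a one-sided PSD S(w).
    import numpy as np

    rng = np.random.default_rng(8)
    n_freq, w_max = 256, 50.0
    dw = w_max / n_freq
    w = (np.arange(n_freq) + 0.5) * dw               # frequency grid [rad/s]
    S = np.where((w > 5.0) & (w < 20.0), 0.1, 0.0)   # hypothetical PSD

    t = np.linspace(0.0, 10.0, 2000)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_freq)      # random phases
    x = np.sqrt(2.0 * S * dw) @ np.cos(np.outer(w, t) + phi[:, None])
    print(f"sample variance {x.var():.4f} vs PSD area {S.sum() * dw:.4f}")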
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation if a suitable sampling distribution is used, in many design problems it is too expensive. The authors have previously proposed a method to manage the computational cost in standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends their method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of the choice of the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.