Search Results

Journal Article

A Re-Analysis Methodology for System RBDO Using a Trust Region Approach with Local Metamodels

2010-04-12
2010-01-0645
A simulation-based, system reliability-based design optimization (RBDO) method is presented that can handle problems with multiple failure regions and correlated random variables. Copulas are used to represent the correlation. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with a trust-region optimization approach and local metamodels covering each trust region. PRRA calculates the system reliability of a design very efficiently by performing a single Monte Carlo (MC) simulation per trust region. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. The PRRA method is based on importance sampling. It provides accurate results if the support of the sampling PDF contains the support of the joint PDF of the input random variables. The sequential, trust-region optimization approach satisfies this requirement.
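
As background for the support condition stated above, the importance-sampling identity on which PRRA-type re-analysis rests can be written as follows (a standard result, in my notation rather than the paper's):

$$
P_F=\int I_F(\mathbf{x})\,f(\mathbf{x})\,d\mathbf{x}
=\int I_F(\mathbf{x})\,\frac{f(\mathbf{x})}{h(\mathbf{x})}\,h(\mathbf{x})\,d\mathbf{x}
\approx\frac{1}{N}\sum_{i=1}^{N} I_F(\mathbf{x}_i)\,\frac{f(\mathbf{x}_i)}{h(\mathbf{x}_i)},
\qquad \mathbf{x}_i\sim h,
$$

where $I_F$ is the failure indicator, $f$ is the joint PDF of the inputs, and $h$ is the sampling PDF. The estimator is unbiased only if the support of $h$ contains that of $f$, which is exactly the requirement the sequential trust-region scheme is said to satisfy.
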
Journal Article

Piston Design Using Multi-Objective Reliability-Based Design Optimization

2010-04-12
2010-01-0907
Piston design is a challenging engineering problem which involves complex physics and requires satisfying multiple performance objectives. Uncertainty in piston operating conditions and variability in piston design variables are inevitable and must be accounted for. The piston assembly can be a major source of engine mechanical friction and cold start noise, if not designed properly. In this paper, an analytical piston model is used in a deterministic and probabilistic (reliability-based) multi-objective design optimization process to obtain an optimal piston design. The model predicts piston performance in terms of scuffing, friction and noise. In order to keep the computational cost low, efficient and accurate metamodels of the piston performance metrics are used. The Pareto set of all optimal solutions is calculated, allowing the designer to choose the “best” solution according to trade-offs among the multiple objectives.
Journal Article

An Efficient Method to Calculate the Failure Rate of Dynamic Systems with Random Parameters Using the Total Probability Theorem

2015-04-14
2015-01-0425
Using the total probability theorem, we propose a method to calculate the failure rate of a linear vibratory system with random parameters excited by stationary Gaussian processes. The response of such a system is non-stationary because of the randomness of the input parameters. A space-filling design, such as optimal symmetric Latin hypercube sampling or maximin, is first used to sample the input parameter space. For each design point, the output process is stationary and Gaussian. We present two approaches to calculate the corresponding conditional probability of failure. A Kriging metamodel is then created between the input parameters and the output conditional probabilities, allowing us to estimate the conditional probabilities for any set of input parameters. The total probability theorem is finally applied to calculate the time-dependent probability of failure and the failure rate of the dynamic system. The proposed method is demonstrated using a vibratory system.
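
A minimal sketch of this workflow, assuming a placeholder conditional_pf for the per-design-point analysis and using scipy's Latin hypercube sampler with a scikit-learn Gaussian process as the Kriging metamodel (library choices are mine, not the paper's):

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical stand-in: conditional probability of failure given one
# realization of the random parameters (in the paper this comes from a
# stationary Gaussian analysis of the output process).
def conditional_pf(theta):
    return 1e-3 * np.exp(-((theta - 0.5) ** 2).sum())

# 1. Space-filling design over the (normalized) parameter space.
design = qmc.LatinHypercube(d=2, seed=0).random(n=30)

# 2. Conditional failure probability at each design point.
pf_train = np.array([conditional_pf(t) for t in design])

# 3. Kriging metamodel: parameters -> conditional probability.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(design, pf_train)

# 4. Total probability theorem: average the metamodel prediction over a
#    large Monte Carlo sample of the random parameters.
theta_mc = np.random.default_rng(1).uniform(size=(100_000, 2))
print(f"estimated probability of failure: {gp.predict(theta_mc).mean():.2e}")
```
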
Journal Article

Bootstrapping and Separable Monte Carlo Simulation Methods Tailored for Efficient Assessment of Probability of Failure of Structural Systems

2015-04-14
2015-01-0420
There is randomness in both the applied loads and the strength of systems. Therefore, to account for the uncertainty, the safety of the system must be quantified using its reliability. Monte Carlo Simulation (MCS) is widely used for probabilistic analysis because of its robustness. However, the high computational cost limits the accuracy of MCS. Smarslok et al. [2010] developed an improved sampling technique for reliability assessment called Separable Monte Carlo (SMC) that can significantly increase the accuracy of estimation without increasing the cost of sampling. However, this method was applied to time-invariant problems involving two random variables. This paper extends SMC to problems with multiple random variables and develops a novel method for estimation of the standard deviation of the probability of failure of a structure. The method is demonstrated and validated on reliability assessment of an offshore wind turbine under turbulent wind loads.
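
The separable reuse idea is easy to illustrate in the original two-variable setting; the sketch below uses hypothetical normal capacity and load distributions and is not the paper's multi-variable extension:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical separable limit state g = capacity - load. Because the
# two variables are sampled independently, EVERY pairing of a capacity
# sample with a load sample is a valid realization of g.
capacity = rng.normal(10.0, 1.0, size=500)   # M capacity samples
load = rng.normal(6.0, 2.0, size=2000)       # N load samples

# Crude MC would compare min(M, N) one-to-one pairs; separable MC reuses
# all M * N pairings, sharpening the estimate at no extra sampling cost.
pf_smc = (capacity[:, None] < load[None, :]).mean()
print(f"separable MC estimate of pf: {pf_smc:.4f}")
```
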
Journal Article

Value of Information for Comparing Dependent Repairable Assemblies and Systems

2018-04-03
2018-01-1103
This article presents an approach for comparing alternative repairable systems and for calculating the value of the information obtained by field testing a specified number of such systems in order to estimate the reliability metric associated with each system. Here the reliability of a repairable system is measured by its failure rate. In support of the decision-making effort, the failure rate is translated into an expected utility based on a utility curve that represents the risk tolerance of the decision-maker. The algorithm calculates how the expected value of the decision changes with the sample size; this change represents the value of information obtained from testing.
Journal Article

Efficient Re-Analysis Methodology for Probabilistic Vibration of Large-Scale Structures

2008-04-14
2008-01-0216
It is challenging to perform probabilistic analysis and design of large-scale structures because probabilistic analysis requires repeated finite element analyses of large models and each analysis is expensive. This paper presents a methodology for probabilistic analysis and reliability-based design optimization of large-scale structures that consists of two re-analysis methods: one for estimating the deterministic vibratory response and another for estimating the probability of the response exceeding a certain level. The deterministic re-analysis method can efficiently analyze large-scale finite element models consisting of tens or hundreds of thousands of degrees of freedom and large numbers of design variables that vary in a wide range. The probabilistic re-analysis method calculates the system reliability very efficiently for many probability distributions of the design variables by performing a single Monte Carlo simulation.
Journal Article

An RBDO Method for Multiple Failure Region Problems using Probabilistic Reanalysis and Approximate Metamodels

2009-04-20
2009-01-0204
A Reliability-Based Design Optimization (RBDO) method for multiple failure regions is presented. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with an approximate global metamodel with local refinements. The latter serves as an indicator to determine the failure and safe regions. PRRA calculates the system reliability of a design very efficiently by performing a single Monte Carlo (MC) simulation. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. An “accurate-on-demand” metamodel is used in the PRRA that allows us to handle problems with multiple disjoint failure regions and potentially multiple most-probable points (MPP). The multiple failure regions are identified by using a clustering technique. A maximin “space-filling” sampling technique is used to construct the metamodel. A vibration absorber example highlights the potential of the proposed method.
Journal Article

Design under Uncertainty using a Combination of Evidence Theory and a Bayesian Approach

2008-04-14
2008-01-0377
Early in the engineering design cycle, it is difficult to quantify product reliability due to insufficient data or information to model uncertainties. Probability theory, therefore, cannot be used. Design decisions are usually based on fuzzy information which is imprecise and incomplete. Various design methods such as Possibility-Based Design Optimization (PBDO) and Evidence-Based Design Optimization (EBDO) have been developed to systematically treat design with non-probabilistic uncertainties. In practical engineering applications, information regarding the uncertain variables and parameters may exist in the form of sample points, and uncertainties with sufficient and insufficient information may exist simultaneously. Most of the existing optimal design methods under uncertainty cannot handle this form of incomplete information. They have to either discard some valuable information or postulate the existence of additional information.
Journal Article

Probabilistic Reanalysis Using Monte Carlo Simulation

2008-04-14
2008-01-0215
An approach for Probabilistic Reanalysis (PRA) of a system is presented. PRA calculates, very efficiently, the system reliability or the average value of an attribute of a design for many probability distributions of the input variables by performing a single Monte Carlo simulation. In addition, PRA calculates the sensitivity derivatives of the reliability with respect to the parameters of the probability distributions. The approach is useful for analysis problems where reliability bounds need to be calculated because the probability distribution of the input variables is uncertain, or for design problems where the design variables are random. The accuracy and efficiency of PRA is demonstrated on vibration analysis of a car and on system reliability-based design optimization (RBDO) of an internal combustion engine.
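
A toy illustration of the single-simulation reweighting idea, assuming a one-dimensional hypothetical limit state; the sampling PDF is deliberately wide so that its support covers every re-analysis PDF:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# One sampling distribution, one Monte Carlo run.
sample_dist = stats.norm(0.0, 1.5)              # sampling PDF h(x)
x = sample_dist.rvs(100_000, random_state=rng)
fails = x > 3.0                                  # hypothetical failure event

# Re-analysis: the failure probability under any new input PDF f(x) is
# obtained by reweighting the SAME samples with the likelihood ratio
# f(x) / h(x) -- no new simulation is required.
for mu in (0.0, 0.3, 0.6):
    f = stats.norm(mu, 1.0)
    w = f.pdf(x) / sample_dist.pdf(x)
    print(f"input mean {mu:.1f}: pf ≈ {np.mean(fails * w):.2e}")
```
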
Journal Article

Reliability Estimation for Multiple Failure Region Problems using Importance Sampling and Approximate Metamodels

2008-04-14
2008-01-0217
An efficient reliability estimation method is presented for engineering systems with multiple failure regions and potentially multiple most probable points. The method can handle implicit, nonlinear limit-state functions with correlated or uncorrelated random variables that can follow any probability distribution. It uses a combination of approximate or “accurate-on-demand,” global and local metamodels which serve as indicators to determine the failure and safe regions. Samples close to limit states define transition regions between safe and failure domains. A clustering technique identifies all transition regions, which can in general be disjoint, and local metamodels of the actual limit states are generated for each transition region.
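
A rough sketch of identifying disjoint transition regions by clustering near-limit-state samples; the two-circle limit state and the use of DBSCAN are illustrative assumptions (the abstract does not name a specific clustering algorithm):

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

# Hypothetical limit state with two disjoint failure regions (interiors
# of two circles); failure corresponds to g(x) <= 0.
def g(x):
    return np.minimum((x ** 2).sum(axis=1) - 9.0,
                      ((x - 6.0) ** 2).sum(axis=1) - 4.0)

x = rng.uniform(-5.0, 9.0, size=(20_000, 2))
gx = g(x)

# Samples with |g| small straddle the limit state: these are the
# transition regions between the safe and failure domains.
near = x[np.abs(gx) < 0.5]

# Density-based clustering separates the disjoint transition regions;
# a local metamodel of g would then be fit inside each cluster.
labels = DBSCAN(eps=0.8, min_samples=10).fit_predict(near)
print(f"transition regions found: {labels.max() + 1}")
```
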
Journal Article

Optimal and Robust Design of the PEM Fuel Cell Cathode Gas Diffusion Layer

2008-04-14
2008-01-1217
The cathode gas diffusion layer (GDL) is an important component of a polymer electrolyte membrane (PEM) fuel cell. Its design parameters, including thickness, porosity and permeability, significantly affect the reactant transport and water management, thus impacting the fuel cell performance. This paper presents an optimization study of the GDL design parameters with the objective of maximizing the current density under a given voltage. A two-dimensional single-phase PEM fuel cell model is used. A multivariable optimization problem is formed to maximize the current density at the cathode under a given electrode voltage with respect to the GDL parameters. In order to reduce the computational effort and find the global optimum among the potential multiple optima, a global metamodel of the actual CFD-based fuel cell simulation is adaptively generated using radial basis function approximations.
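
A compact sketch of this kind of surrogate-assisted global search; the cfd_current_density stand-in, the thin-plate-spline kernel, and the differential-evolution optimizer are my illustrative choices, and the paper's adaptive refinement is only indicated in the comments:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

# Hypothetical stand-in for the expensive CFD fuel-cell simulation:
# current density as a function of two normalized GDL parameters.
def cfd_current_density(x):
    return np.sin(3.0 * x[..., 0]) * np.cos(2.0 * x[..., 1]) + x[..., 1]

# Fit a radial-basis-function metamodel on a small design of experiments.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(40, 2))
surrogate = RBFInterpolator(X, cfd_current_density(X), kernel="thin_plate_spline")

# Search the cheap surrogate globally instead of the CFD model. In an
# adaptive scheme the surrogate optimum would be checked against the true
# model and added to the design until the two agree.
res = differential_evolution(lambda x: -surrogate(x[None, :])[0],
                             bounds=[(0.0, 1.0), (0.0, 1.0)], seed=0)
print("surrogate optimum:", res.x, "predicted current density:", -res.fun)
```
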
Journal Article

Time-Dependent Reliability of Random Dynamic Systems Using Time-Series Modeling and Importance Sampling

2011-04-12
2011-01-0728
Reliability is an important engineering requirement for consistently delivering acceptable product performance through time. As time progresses, the product may fail due to time-dependent operating conditions and material properties, component degradation, etc. The reliability degradation with time may increase the lifecycle cost due to potential warranty costs, repairs and loss of market share. Reliability is the probability that the system will perform its intended function successfully for a specified time interval. In this work, we consider the first-passage reliability, which accounts for the first-time failure of non-repairable systems. Methods available in the literature provide an upper bound on the true probability of failure, which may overestimate the true value considerably. Monte Carlo simulations are accurate but computationally expensive.
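
For reference, the first-passage formulation and the classical out-crossing bound alluded to above can be stated as follows (standard results, not notation from the paper):

$$
P_f(0,T)=P\big(\exists\, t\in[0,T]:\ g(\mathbf{X},t)\le 0\big),
\qquad
P_f(0,T)\ \le\ P_f(0)+\int_0^T \nu^{+}(t)\,dt,
$$

where $\nu^{+}(t)$ is the up-crossing rate of the response through the failure threshold. The bound effectively treats crossings as independent events, which is why it can considerably overestimate the failure probability when crossings are correlated.
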
Journal Article

Estimation of High-Cycle Fatigue Life by using Re-analysis

2012-04-16
2012-01-0066
In design of real-life systems, such as the suspension of a car, an offshore platform or a wind turbine, there are significant uncertainties in the model of the inputs. For example, scarcity of data leads to inaccuracies in the power spectral density function of the waves and the probability distribution of the wind speed. Therefore, it is necessary to evaluate the performance and safety of a system for different probability distributions. This is computationally expensive or even impractical. This paper presents a methodology to assess efficiently the fatigue life of structures for different power spectra of the applied loads. We accomplish that by reweighting the incremental damage calculated in one simulation. We demonstrate the accuracy and efficiency of the proposed method on an example which involves a nonlinear quarter car under a random dynamic load. The fatigue life of the suspension spring under loads generated by a sampling spectrum is calculated.
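
The damage-reweighting step can be written down compactly; the Miner-rule notation below is my gloss on the abstract, not the paper's own formulation:

$$
\bar{D}_{\text{new}} \approx \sum_{i=1}^{N} \Delta D(\mathbf{x}_i)\,\frac{f_{\text{new}}(\mathbf{x}_i)}{f_{\text{sim}}(\mathbf{x}_i)},
\qquad \mathbf{x}_i \sim f_{\text{sim}},
$$

where $\Delta D(\mathbf{x}_i)$ is the incremental damage computed once in the simulation driven by the sampling spectrum (with PDF $f_{\text{sim}}$), and $f_{\text{new}}$ corresponds to a new load spectrum; the fatigue life is then estimated as the time at which the accumulated damage reaches one.
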
Journal Article

Efficient Random Vibration Analysis Using Markov Chain Monte Carlo Simulation

2012-04-16
2012-01-0067
Reliability assessment of dynamic systems with low failure probability can be very expensive. This paper presents and demonstrates a method that uses the Metropolis-Hastings algorithm to sample from an optimal probability density function (PDF) of the random variables. This function is the true PDF truncated over the failure region. For a system subjected to time-varying excitation, Shinozuka's method is employed to generate time histories of the excitation. Random values of the frequencies and the phase angles of the excitation are drawn from the optimal PDF. It is shown that running the subset simulation with the proposed approach, which uses Shinozuka's method, is more efficient than the original subset simulation. The main reasons are that the approach involves only 10 to 20 random variables, and it takes advantage of the symmetry of the expression of the displacement as a function of the inputs. The paper demonstrates the method on two examples.
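
A minimal sketch of a Shinozuka-style spectral representation of the excitation, assuming a placeholder power spectral density; both the frequencies (one draw per frequency band) and the phases are randomized, which is why only a few dozen random variables control the whole time history:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-sided power spectral density of the excitation.
def psd(w):
    return 1.0 / (1.0 + w ** 2)

# Spectral representation: the stationary Gaussian excitation is a sum
# of cosines with PSD-consistent amplitudes, random frequencies, and
# random phase angles -- 2 * n_freq random variables in total.
n_freq = 15
edges = np.linspace(0.0, 10.0, n_freq + 1)
dw = edges[1] - edges[0]
w = rng.uniform(edges[:-1], edges[1:])           # one frequency per band
phase = rng.uniform(0.0, 2.0 * np.pi, n_freq)    # random phase angles
amp = np.sqrt(2.0 * psd(w) * dw)

t = np.linspace(0.0, 30.0, 2000)
x = (amp * np.cos(np.outer(t, w) + phase)).sum(axis=1)  # one time history
```
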
Journal Article

Probability of Failure of Dynamic Systems by Importance Sampling

2013-04-08
2013-01-0607
Estimation of the probability of failure of mechanical systems under random loads is computationally expensive, especially for very reliable systems with low probabilities of failure. Importance Sampling can be an efficient tool for static problems if a proper sampling distribution is selected. This paper presents a methodology to apply Importance Sampling to dynamic systems in which both the load and response are stochastic processes. The method is applicable to problems for which the input loads are stationary and Gaussian and are represented by power spectral density functions. Shinozuka's method is used to generate random time histories of excitation. The method is demonstrated on a linear quarter car model. This approach is more efficient than standard Monte Carlo simulation by several orders of magnitude.
Journal Article

Multi-Objective Decision Making under Uncertainty and Incomplete Knowledge of Designer Preferences

2011-04-12
2011-01-1080
Multi-attribute decision making and multi-objective optimization complement each other. Often, while making design decisions involving multiple attributes, a Pareto front is generated using a multi-objective optimizer. The end user then chooses the optimal design from the Pareto front based on his/her preferences. This seemingly simple methodology requires significant modification if uncertainty is present. We explore two kinds of uncertainty in this paper: uncertainty in the decision variables, which we call inherent design problem (IDP) uncertainty, and uncertainty in our knowledge of the preferences of the decision maker, which we refer to as preference assessment (PA) uncertainty. From a purely utility-theoretic perspective, a rational decision maker maximizes his or her expected multi-attribute utility.
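
In symbols, the decision rule referenced in the last sentence is the standard expected-utility maximization (general background, in my notation rather than the authors'):

$$
d^{*}=\arg\max_{d}\ \mathbb{E}_{\mathbf{X}}\!\left[\,U\big(a_1(d,\mathbf{X}),\dots,a_m(d,\mathbf{X})\big)\right],
$$

where $d$ is the design, $\mathbf{X}$ collects the IDP uncertainty in the decision variables, and $a_1,\dots,a_m$ are the attributes. Under PA uncertainty the multi-attribute utility $U$ is itself uncertain, so the expectation must be taken over the utility parameters as well.
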
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation if a suitable sampling distribution is used, in many design problems it is too expensive. The authors have previously proposed a method to manage the computational cost in standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends their method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of the choice of the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
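
The underlying criterion is the standard preposterior value of information (general background, in my notation, not the authors'):

$$
\mathrm{VoI}=\mathbb{E}_{\text{data}}\!\left[\max_{d}\ \mathbb{E}\big[u(d)\mid \text{data}\big]\right]-\max_{d}\ \mathbb{E}\big[u(d)\big],
$$

where $d$ ranges over the alternative designs and $u(d)$ is the utility of choosing design $d$ given its uncertain reliability. Simulation data have positive value exactly when their outcome could change which design is chosen; further Importance Sampling is worthwhile only while the expected value of the additional information exceeds its cost.
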
Journal Article

Efficient Probabilistic Reanalysis and Optimization of a Discrete Event System

2011-04-12
2011-01-1081
This paper presents a methodology to evaluate and optimize discrete event systems, such as an assembly line or a call center. First, the methodology estimates the performance of a system for a single probability distribution of the inputs. Probabilistic Reanalysis (PRRA) uses this information to evaluate the effect of changes in the system configuration on its performance. PRRA is integrated with a program to optimize the system. The proposed methodology is dramatically more efficient than one requiring a new Monte Carlo simulation each time we change the system. We demonstrate the approach on a drilling center and an electronic parts factory.
Technical Paper

Optimal Engine Torque Management for Reducing Driveline Clunk Using Time-Dependent Metamodels

2007-05-15
2007-01-2236
Quality and performance are two important customer requirements in vehicle design. Driveline clunk negatively affects the perceived quality and must therefore be minimized. This is usually achieved using engine torque management, which is part of engine calibration. During a tip-in event, the engine torque rate of rise is limited until all the driveline lash is taken up. However, the engine torque rise and its rate can negatively affect the vehicle throttle response. Therefore, the engine torque management must be balanced against throttle response. In practice, the engine torque rate of rise is calibrated manually. This paper describes a methodology for calibrating the engine torque in order to minimize the clunk disturbance while still meeting throttle response constraints. A set of predetermined engine torque profiles is calibrated in a vehicle, and the transmission turbine speed is measured for each profile. The latter is used to quantify the clunk disturbance.
Technical Paper

An Efficient Re-Analysis Methodology for Vibration of Large-Scale Structures

2007-05-15
2007-01-2326
Finite element analysis is a well-established methodology in structural dynamics. However, optimization and/or probabilistic studies can be prohibitively expensive because they require repeated FE analyses of large models. Various re-analysis methods have been proposed in order to efficiently calculate the dynamic response of a structure after a baseline design has been modified, without solving the full finite element problem anew. The parametric reduced-order modeling (PROM) and the combined approximation (CA) methods are two re-analysis methods which can handle large model parameter changes in a relatively efficient manner. Although both methods are promising by themselves, they cannot handle large FE models with many degrees of freedom (e.g. 100,000) and a large number of design parameters (e.g. 50), which are common in practice. In this paper, the advantages and disadvantages of the PROM and CA methods are first discussed in detail.