Search Results

Viewing 1 to 20 of 20
Technical Paper

Imprecise Reliability Assessment When the Type of the Probability Distribution of the Random Variables is Unknown

2009-04-20
2009-01-0199
In reliability design, data for constructing probabilistic models are often scarce. It is particularly challenging to model uncertainty in variables whose probability distribution type is unknown. Moreover, it is expensive to estimate the upper and lower bounds of the reliability of a system involving such variables. A method for modeling uncertainty using Polynomial Chaos Expansion is presented. The method requires specifying bounds for statistical summaries such as the first four moments and credible intervals. A constrained optimization problem, in which the decision variables are the coefficients of the Polynomial Chaos Expansion approximation, is formulated and solved in order to estimate the minimum and maximum values of a system's reliability. This problem is solved efficiently by employing a probabilistic re-analysis approach to approximate the system reliability as a function of the moments of the random variables.
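The bounding idea in this abstract can be illustrated with a minimal Python sketch: a one-dimensional Hermite Polynomial Chaos Expansion whose coefficients are varied, under hypothetical bounds on the mean and variance, to find the smallest and largest failure probability over one fixed set of standard-normal samples (common random numbers). The threshold, bounds, and coarse grid search are illustrative stand-ins for the paper's constrained optimizer, not its actual formulation.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal(20_000)      # fixed standard-normal samples (common random numbers)

def pce_samples(a0, a1, a2):
    """Evaluate a 1-D Hermite PCE x = a0 + a1*He1(Z) + a2*He2(Z), with He2(z) = z^2 - 1."""
    return a0 + a1 * Z + a2 * (Z**2 - 1.0)

def failure_prob(a0, a1, a2, threshold=3.0):
    """MC estimate of P(x > threshold) on the fixed sample set."""
    return np.mean(pce_samples(a0, a1, a2) > threshold)

# Hypothetical bounds on the statistical summaries: mean = a0, variance = a1^2 + 2*a2^2
mean_lo, mean_hi = -0.1, 0.1
var_lo, var_hi = 0.8, 1.2

# Coarse grid search over the PCE coefficients: a gradient-free stand-in for the
# constrained optimizer in the paper (the MC failure indicator is non-smooth)
grid0 = np.linspace(mean_lo, mean_hi, 5)
grid1 = np.linspace(0.7, 1.1, 9)
grid2 = np.linspace(-0.3, 0.3, 13)

feasible = []
for a0, a1, a2 in itertools.product(grid0, grid1, grid2):
    if var_lo <= a1**2 + 2.0 * a2**2 <= var_hi:
        feasible.append(failure_prob(a0, a1, a2))

p_min, p_max = min(feasible), max(feasible)
print(f"failure probability bounds: [{p_min:.4f}, {p_max:.4f}]")
```

Because the same samples of Z are reused for every candidate coefficient set, each re-evaluation is a cheap reweighting of stored responses, which is the "re-analysis" flavor of the approach.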
Technical Paper

Reliability Based Design Optimization of Dynamic Vehicle Performance Using Bond Graphs and Time Dependent Metamodels

2006-04-03
2006-01-0109
A vehicle drivetrain is designed to meet specific vehicle performance criteria which usually involve trade-offs among conflicting performance measures. This paper describes a methodology to optimize the drivetrain design including the axle ratio, transmission shift points and transmission shift ratios considering uncertainty. A complete vehicle dynamic model is developed using the bond graph method. The model includes the vehicle, engine, transmission, torque converter, driveline, and transmission controller. An equivalent MATLAB Simulink model performs the nonlinear dynamic analysis. In order to reduce the computational effort, a time-dependent metamodel is developed based on principal component analysis using singular value decomposition. The optimization is performed using both the Simulink vehicle dynamic model and the metamodel. A deterministic optimization first determines the optimum design in terms of fuel economy, without considering variations or uncertainties.
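The time-dependent metamodel step described above can be sketched as follows, assuming (hypothetically) a single design variable and a toy damped-oscillator response standing in for the Simulink drivetrain histories: stack sampled response histories into a matrix, extract principal components by singular value decomposition, and interpolate the mode amplitudes as functions of the design variable.

```python
import numpy as np

# Training data: response histories y(t; d) for sampled values of one design
# variable d -- a toy damped oscillator, NOT the paper's drivetrain model
t = np.linspace(0.0, 5.0, 200)
d_train = np.linspace(0.5, 2.0, 25)
Y = np.array([np.exp(-0.3 * d * t) * np.sin(d * t) for d in d_train])

# Principal components of the centered histories via singular value decomposition
mean_y = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - mean_y, full_matrices=False)
k = 10                                   # retained modes
modes = Vt[:k]                           # (k, n_time) time bases
coeffs = (Y - mean_y) @ modes.T          # (n_train, k) mode amplitudes

def predict(d_new):
    """Time-dependent metamodel: interpolate the mode amplitudes in d and
    reconstruct the full response history without re-running the simulation."""
    a = np.array([np.interp(d_new, d_train, coeffs[:, j]) for j in range(k)])
    return mean_y + a @ modes

# Check against the true response at an untrained design value
y_true = np.exp(-0.3 * 1.23 * t) * np.sin(1.23 * t)
err = np.max(np.abs(predict(1.23) - y_true))
print(f"max metamodel error: {err:.2e}")
```

Once the modes are computed, each metamodel evaluation costs only a few interpolations and a small matrix product, which is what makes optimization over the metamodel cheap compared with repeated dynamic simulation.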
Technical Paper

Optimal Engine Torque Management for Reducing Driveline Clunk Using Time-Dependent Metamodels

2007-05-15
2007-01-2236
Quality and performance are two important customer requirements in vehicle design. Driveline clunk negatively affects perceived quality and must therefore be minimized. This is usually achieved through engine torque management, which is part of engine calibration. During a tip-in event, the engine torque rate of rise is limited until all the driveline lash is taken up. However, the engine torque rise and its rate can negatively affect the vehicle throttle response. Therefore, engine torque management must be balanced against throttle response. In practice, the engine torque rate of rise is calibrated manually. This paper describes a methodology for calibrating the engine torque in order to minimize the clunk disturbance while still meeting throttle response constraints. A set of predetermined engine torque profiles is calibrated in a vehicle, and the transmission turbine speed is measured for each profile. The latter is used to quantify the clunk disturbance.
Technical Paper

An Efficient Re-Analysis Methodology for Vibration of Large-Scale Structures

2007-05-15
2007-01-2326
Finite element analysis is a well-established methodology in structural dynamics. However, optimization and/or probabilistic studies can be prohibitively expensive because they require repeated FE analyses of large models. Various re-analysis methods have been proposed to calculate efficiently the dynamic response of a structure after a baseline design has been modified, without performing a full new analysis. The parametric reduced-order modeling (PROM) and combined approximation (CA) methods are two re-analysis methods that can handle large model parameter changes relatively efficiently. Although both methods are promising by themselves, they cannot handle large FE models with many degrees of freedom (e.g. 100,000) and a large number of design parameters (e.g. 50), which are common in practice. In this paper, the advantages and disadvantages of the PROM and CA methods are first discussed in detail.
Technical Paper

Design Optimization and Reliability Estimation with Incomplete Uncertainty Information

2006-04-03
2006-01-0962
Existing methods for design optimization under uncertainty assume that a high level of information is available, typically in the form of data. In reality, however, insufficient data prevents correct inference of probability distributions, membership functions, or interval ranges. In this article we use an engine design example to show that optimal design decisions and reliability estimates depend strongly on how uncertainty is characterized. We contrast the reliability-based optimal designs with those obtained using worst-case optimization, and ask how to obtain non-conservative designs with incomplete uncertainty information. We propose an answer to this question through the use of Bayesian statistics. We estimate a truck engine's reliability based only on the available samples, and demonstrate that the accuracy of our estimates increases as more samples become available.
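A generic example of the Bayesian sample-based idea (not necessarily the paper's exact formulation) is the Beta-Bernoulli model: pass/fail test data update a Beta prior over the failure probability, and the posterior spread shrinks as more samples arrive.

```python
import math

def reliability_posterior(n_trials, n_failures, a0=1.0, b0=1.0):
    """Beta posterior over the failure probability given Bernoulli test data,
    starting from a Beta(a0, b0) prior (uniform by default).
    Returns the posterior mean and standard deviation of the failure probability."""
    a = a0 + n_failures
    b = b0 + (n_trials - n_failures)
    mean_pf = a / (a + b)
    std_pf = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1.0)))
    return mean_pf, std_pf

# Posterior uncertainty shrinks as more samples become available
small = reliability_posterior(n_trials=10, n_failures=1)
large = reliability_posterior(n_trials=1000, n_failures=100)
print("10 trials:", small)
print("1000 trials:", large)
```

With 10 trials the posterior standard deviation is large, so any reliability claim must be hedged; with 1000 trials it tightens by an order of magnitude, matching the abstract's point that estimate accuracy grows with sample count.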
Technical Paper

Design Optimization Under Uncertainty Using Evidence Theory

2006-04-03
2006-01-0388
Early in the engineering design cycle, it is difficult to quantify product reliability due to insufficient data or information to model uncertainties. Probability theory cannot, therefore, be used. Design decisions are usually based on fuzzy information which is imprecise and incomplete. Recently, evidence theory has been proposed to handle uncertainty with limited information. In this paper, a computationally efficient design optimization method based on evidence theory is proposed, which can handle a mixture of epistemic and random uncertainties. It quickly identifies the vicinity of the optimal point and the active constraints by moving a hyper-ellipse in the original design space using a reliability-based design optimization (RBDO) algorithm. Subsequently, a derivative-free optimizer calculates the evidence-based optimum, starting from the nearby RBDO optimum and considering only the identified active constraints.
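The evidence-theory machinery underlying such methods reduces, in the simplest interval case, to computing belief and plausibility from a basic probability assignment (BPA). A small sketch with made-up focal intervals and masses:

```python
def belief_plausibility(focal_elements, event):
    """Belief and plausibility of an interval event A = [lo, hi] for a body of
    evidence given as (interval, mass) pairs (a basic probability assignment).
    Belief sums masses of focal elements contained in A; plausibility sums
    masses of focal elements that merely intersect A."""
    lo, hi = event
    bel = sum(m for (a, b), m in focal_elements if lo <= a and b <= hi)
    pl = sum(m for (a, b), m in focal_elements if b >= lo and a <= hi)
    return bel, pl

# Hypothetical expert-supplied BPA on an uncertain parameter
bpa = [((0.0, 1.0), 0.5), ((0.5, 1.5), 0.3), ((1.2, 2.0), 0.2)]
bel, pl = belief_plausibility(bpa, event=(0.0, 1.1))
print(bel, pl)
```

The resulting pair brackets the unknown probability of the event: Bel <= P(A) <= Pl, which is what lets an evidence-based optimizer bound constraint violation with limited information.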
Technical Paper

System Reliability-Based Design using a Single-Loop Method

2007-04-16
2007-01-0555
An efficient approach for series-system reliability-based design optimization (RBDO) is presented. The key idea is to apportion optimally the system reliability among the failure modes by considering the target values of the failure probabilities of the modes as design variables. Critical failure modes that contribute the most to the overall system reliability are identified. This paper proposes a computationally efficient system RBDO approach using a single-loop method, where the searches for the optimum design and for the most probable failure points proceed simultaneously. Specifically, at each iteration the optimizer uses the approximate most probable failure points from the previous iteration to search for the optimum. A second-order Ditlevsen upper bound is used for the joint failure probability of the failure modes. In addition, an easy-to-implement active-set strategy is employed to improve algorithmic stability.
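The second-order Ditlevsen bounds mentioned here have a compact closed form. A small sketch with illustrative (made-up) mode and pairwise joint failure probabilities:

```python
def ditlevsen_bounds(p, p_joint):
    """Second-order Ditlevsen bounds on the failure probability of a series
    system.  p[i] = P(F_i); p_joint[i][j] = P(F_i and F_j) for j < i.
    The bounds depend on the ordering of the modes; modes are usually sorted
    by decreasing p[i] before applying them."""
    n = len(p)
    lower = p[0]
    upper = p[0]
    for i in range(1, n):
        prior = [p_joint[i][j] for j in range(i)]
        lower += max(0.0, p[i] - sum(prior))   # lower bound term
        upper += p[i] - max(prior)             # upper bound term
    return lower, upper

# Illustrative numbers for a three-mode series system
p = [0.010, 0.020, 0.005]
pj = [[0.0, 0.0, 0.0],
      [0.0010, 0.0, 0.0],
      [0.0005, 0.0008, 0.0]]
lo, hi = ditlevsen_bounds(p, pj)
print(f"system P_f in [{lo:.4f}, {hi:.4f}]")
```

Because the bounds use only mode and pairwise probabilities, they avoid the high-dimensional integration a full joint failure probability would require, which is what makes them attractive inside an RBDO loop.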
Technical Paper

An Efficient Possibility-Based Design Optimization Method for a Combination of Interval and Random Variables

2007-04-16
2007-01-0553
Reliability-based design optimization accounts for variation. However, it assumes that statistical information is available in the form of fully defined probability distributions. This is not true for a variety of engineering problems where uncertainty is usually given in terms of interval ranges. In this case, interval analysis or possibility theory can be used instead of probability theory. This paper shows how possibility theory can be used in design and presents a computationally efficient sequential optimization algorithm. The algorithm handles problems with only uncertain (interval) design variables and parameters, or with a combination of random and uncertain ones. It consists of a sequence of cycles composed of a deterministic design optimization followed by a set of worst-case reliability evaluation loops. A crank-slider mechanism example demonstrates the accuracy and efficiency of the proposed sequential algorithm.
Technical Paper

A Time-Dependent Reliability Analysis Method using a Niching Genetic Algorithm

2007-04-16
2007-01-0548
A reliability analysis method is presented for time-dependent systems under uncertainty. A level-crossing problem is considered where the system fails if its maximum response exceeds a specified threshold. The proposed method uses a double-loop optimization algorithm. The inner loop calculates the maximum response in time for a given set of random variables, and transforms a time-dependent problem into a time-independent one. A time integration method is used to calculate the response at discrete times. For each sample function of the response random process, the maximum response is found using a global-local search method consisting of a genetic algorithm (GA) and a gradient-based optimizer. This dynamic response usually exhibits multiple peaks and crosses the allowable response level to form a set of complex limit states, which lead to multiple most probable points (MPPs).
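The double-loop structure can be sketched in a few lines, with simplifying assumptions: a closed-form damped-oscillator response stands in for the time integration, a dense time grid replaces the niching-GA global search of the inner loop, and the random-variable distributions and threshold are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 1_000)

def max_response(zeta, omega):
    """Inner loop: maximum of a damped-oscillator response over the time window,
    turning the time-dependent problem into a time-independent one.  A dense
    time grid stands in for the paper's niching-GA / gradient search."""
    y = np.exp(-zeta * omega * t) * np.sin(omega * t)
    return y.max()

# Outer loop: Monte Carlo over the random variables (hypothetical distributions)
threshold = 0.92
zetas = rng.normal(0.05, 0.01, 5_000).clip(min=1e-3)   # damping ratio
omegas = rng.normal(2.0, 0.2, 5_000)                   # natural frequency
exceed = [max_response(z, w) > threshold for z, w in zip(zetas, omegas)]
p_fail = float(np.mean(exceed))
print(f"P(max response > {threshold}) ~ {p_fail:.3f}")
```

In a realistic setting the inner maximization is the expensive part, because each response sample requires a time-integration run; that is exactly why the paper invests in an efficient global-local search for the response maximum.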
Technical Paper

Simulation of Tire-Snow Interfacial Forces for a Range of Snow Densities with Uncertainty

2006-04-03
2006-01-0497
The objective of this paper is to assess the effect of snow density on tire-snow interaction in the presence of uncertainty. The snow-depth dependent finite element analysis (FEA) and semi-analytical models we have developed recently can predict tire-snow interfacial forces at a given density under combined slip conditions. One drawback of the models is that they are only applicable to fresh, low-density snow due to the unavailability of a density-dependent snow model. In reality, the snow density on the ground can range from that of fresh snow to that of heavily compacted snow, which is similar to ice. Even for fresh snow on the ground, as a vehicle moves forward, the rear wheels experience higher snow densities than the front wheels. In addition, being a natural material, snow's physical properties vary significantly even for the same density.
Technical Paper

Improving Robust Design with Preference Aggregation Methods

2004-03-08
2004-01-1140
Robust design is a methodology for improving the quality of a product or process by minimizing the effect of variations in the inputs without eliminating the causes of those variations. In robust design, the putative best design is obtained by solving a multi-criteria optimization problem, trading off the nominal performance against the minimization of the variation of the performance measure. Because some existing methods combine the two criteria with a weighted sum or another fixed aggregation strategy, both of which are known to miss Pareto points, they may fail to obtain a desired design. To overcome this inadequacy, a more comprehensive preference aggregation method is implemented here in robust design. Three examples -- one simple mathematical example, one multi-criteria structure design example, and one automotive example -- are presented to illustrate the effectiveness of the proposed method.
Technical Paper

A Reliability-Based Robust Design Methodology

2005-04-11
2005-01-0811
Mathematical optimization plays an important role in engineering design, leading to greatly improved performance. Deterministic optimization, however, can lead to undesired choices because it neglects input and model uncertainty. Reliability-based design optimization (RBDO) and robust design improve optimization by considering uncertainty. A design is called reliable if it meets all performance targets in the presence of variation/uncertainty, and robust if it is insensitive to variation/uncertainty. Ultimately, a design should be optimal, reliable, and robust. Usually, some of the deterministic optimality is traded off in order for the design to be reliable and/or robust. This paper describes the state of the art in assessing reliability and robustness in engineering design and proposes a new unifying formulation. The principles of deterministic optimality, reliability and robustness are first defined.
Technical Paper

A Design Optimization Method Using Possibility Theory

2005-04-11
2005-01-0343
Early in the engineering design cycle, it is difficult to quantify product reliability or compliance with performance targets due to insufficient data or information for modeling the uncertainties. Design decisions are therefore based on fuzzy information that is vague, imprecise, qualitative, linguistic or incomplete. The uncertain information is usually available as intervals with lower and upper limits. In this work, possibility theory is used to assess design reliability with incomplete information. Possibility theory can be viewed as a variant of fuzzy set theory. A possibility-based design optimization method is proposed in which all design constraints are expressed possibilistically. It is shown that the method gives a conservative solution compared with all conventional reliability-based designs obtained with different probability distributions.
Technical Paper

Modeling and Optimization of Vehicle Drivetrain Dynamic Performance Considering Uncertainty

2005-05-16
2005-01-2371
A vehicle drivetrain is designed to meet specific vehicle performance criteria which usually involve trade-offs among conflicting performance measures. This paper describes a methodology to optimize the drivetrain design including the axle ratio, transmission shift points and transmission shift ratios considering uncertainty. A complete vehicle dynamic model is developed using the bond graph method. The model includes the vehicle, engine, transmission, torque converter, driveline, and transmission controller. An equivalent MATLAB Simulink model is also developed in order to carry out the nonlinear dynamic analysis efficiently. A deterministic optimization is first performed to determine the optimum design in terms of fuel economy, without considering variations or uncertainties. Subsequently, a Reliability-Based Design Optimization is carried out to find the optimum design in the presence of uncertainty.
Journal Article

Optimal and Robust Design of the PEM Fuel Cell Cathode Gas Diffusion Layer

2008-04-14
2008-01-1217
The cathode gas diffusion layer (GDL) is an important component of a polymer electrolyte membrane (PEM) fuel cell. Its design parameters, including thickness, porosity and permeability, significantly affect reactant transport and water management, and thus fuel cell performance. This paper presents an optimization study of the GDL design parameters with the objective of maximizing the current density under a given voltage. A two-dimensional, single-phase PEM fuel cell model is used. A multivariable optimization problem is formulated to maximize the current density at the cathode under a given electrode voltage with respect to the GDL parameters. In order to reduce the computational effort and find the global optimum among potential multiple optima, a global metamodel of the actual CFD-based fuel cell simulation is adaptively generated using radial basis function approximations.
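The metamodel-based search can be sketched with SciPy's radial basis function interpolator. Here a cheap analytic function stands in for the CFD-based fuel cell simulation, and the two inputs are hypothetical normalized GDL parameters; the point is only the pattern: fit an RBF surrogate to a modest number of expensive runs, then search the surrogate globally.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_model(x):
    """Hypothetical stand-in for the CFD fuel cell simulation: 'current density'
    as a smooth function of two normalized GDL parameters."""
    por, perm = x[..., 0], x[..., 1]
    return np.sin(3.0 * por) * np.cos(2.0 * perm) + 0.5 * por

rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, size=(60, 2))      # sampled designs ("expensive" runs)
y = expensive_model(X)

# Radial basis function metamodel of the sampled responses
meta = RBFInterpolator(X, y, kernel="thin_plate_spline")

# Cheap global search on the metamodel instead of the expensive simulation
cand = rng.uniform(0.0, 1.0, size=(20_000, 2))
best = cand[np.argmax(meta(cand))]
print("metamodel optimum:", best, "model value:", float(expensive_model(best)))
```

An adaptive scheme, as in the paper, would add the metamodel's predicted optimum to the training set, re-run the expensive model there, refit, and repeat until the surrogate and the simulation agree near the optimum.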
Journal Article

Efficient Re-Analysis Methodology for Probabilistic Vibration of Large-Scale Structures

2008-04-14
2008-01-0216
It is challenging to perform probabilistic analysis and design of large-scale structures because probabilistic analysis requires repeated finite element analyses of large models and each analysis is expensive. This paper presents a methodology for probabilistic analysis and reliability-based design optimization of large-scale structures that consists of two re-analysis methods: one for estimating the deterministic vibratory response and another for estimating the probability of the response exceeding a certain level. The deterministic re-analysis method can efficiently analyze large-scale finite element models consisting of tens or hundreds of thousands of degrees of freedom and large numbers of design variables that vary over a wide range. The probabilistic re-analysis method calculates very efficiently the system reliability for many probability distributions of the design variables by performing a single Monte Carlo simulation.
Journal Article

Probabilistic Reanalysis Using Monte Carlo Simulation

2008-04-14
2008-01-0215
An approach for Probabilistic Reanalysis (PRA) of a system is presented. PRA calculates very efficiently the system reliability or the average value of an attribute of a design for many probability distributions of the input variables, by performing a single Monte Carlo simulation. In addition, PRA calculates the sensitivity derivatives of the reliability with respect to the parameters of the probability distributions. The approach is useful for analysis problems where reliability bounds need to be calculated because the probability distribution of the input variables is uncertain, or for design problems where the design variables are random. The accuracy and efficiency of PRA is demonstrated on vibration analysis of a car and on system reliability-based design optimization (RBDO) of an internal combustion engine.
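The core reweighting trick behind this kind of reanalysis can be shown in a few lines: run one Monte Carlo simulation under a fixed sampling distribution, store the failure indicators, then recover the failure probability for any other input distribution by multiplying each sample by the ratio of the new density to the sampling density. The limit state and the distributions below are toy stand-ins, and a second, direct Monte Carlo run is included only to verify the reweighted estimate.

```python
import numpy as np

rng = np.random.default_rng(3)

def limit_state(x):
    """Toy limit state: failure when g(x) < 0."""
    return 6.0 - x[:, 0] - 2.0 * x[:, 1]

def normal_pdf(x, mu, sigma):
    z = (x - mu) / sigma
    return np.exp(-0.5 * z * z) / (sigma * np.sqrt(2.0 * np.pi))

# ONE Monte Carlo run under a fixed sampling distribution q = N(0,1) x N(0,1)
X = rng.standard_normal((200_000, 2))
fail = limit_state(X) < 0.0
q_pdf = normal_pdf(X[:, 0], 0.0, 1.0) * normal_pdf(X[:, 1], 0.0, 1.0)

def reanalyze(mu1, mu2):
    """Reweight the SAME samples to get P_f under new input distributions
    N(mu1, 1) x N(mu2, 1) -- no new limit-state evaluations required."""
    w = normal_pdf(X[:, 0], mu1, 1.0) * normal_pdf(X[:, 1], mu2, 1.0) / q_pdf
    return float(np.mean(fail * w))

p_reweighted = reanalyze(0.5, 0.5)
p_direct = float(np.mean(limit_state(rng.normal(0.5, 1.0, (200_000, 2))) < 0.0))
print(p_reweighted, p_direct)
```

Calling reanalyze for hundreds of candidate distributions costs only vector reweighting, not new simulations, which is what makes single-run reanalysis attractive for reliability bounds and for optimization over distribution parameters.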
Journal Article

Design under Uncertainty using a Combination of Evidence Theory and a Bayesian Approach

2008-04-14
2008-01-0377
Early in the engineering design cycle, it is difficult to quantify product reliability due to insufficient data or information to model uncertainties. Probability theory cannot, therefore, be used. Design decisions are usually based on fuzzy information which is imprecise and incomplete. Various design methods such as Possibility-Based Design Optimization (PBDO) and Evidence-Based Design Optimization (EBDO) have been developed to systematically treat design with non-probabilistic uncertainties. In practical engineering applications, information regarding the uncertain variables and parameters may exist in the form of sample points, and uncertainties with sufficient and insufficient information may exist simultaneously. Most existing optimal design methods under uncertainty cannot handle this form of incomplete information; they must either discard some valuable information or postulate the existence of additional information.
Journal Article

Piston Design Using Multi-Objective Reliability-Based Design Optimization

2010-04-12
2010-01-0907
Piston design is a challenging engineering problem which involves complex physics and requires satisfying multiple performance objectives. Uncertainty in piston operating conditions and variability in piston design variables are inevitable and must be accounted for. The piston assembly can be a major source of engine mechanical friction and cold start noise, if not designed properly. In this paper, an analytical piston model is used in a deterministic and probabilistic (reliability-based) multi-objective design optimization process to obtain an optimal piston design. The model predicts piston performance in terms of scuffing, friction and noise. In order to keep the computational cost low, efficient and accurate metamodels of the piston performance metrics are used. The Pareto set of all optimal solutions is calculated, allowing the designer to choose the "best" solution according to trade-offs among the multiple objectives.
Journal Article

An RBDO Method for Multiple Failure Region Problems using Probabilistic Reanalysis and Approximate Metamodels

2009-04-20
2009-01-0204
A Reliability-Based Design Optimization (RBDO) method for multiple failure regions is presented. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with an approximate global metamodel with local refinements. The latter serves as an indicator to determine the failure and safe regions. PRRA calculates very efficiently the system reliability of a design by performing a single Monte Carlo (MC) simulation. Although PRRA is based on MC simulation, it calculates "smooth" sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. An "accurate-on-demand" metamodel is used in the PRRA that allows us to handle problems with multiple disjoint failure regions and potentially multiple most probable points (MPPs). The multiple failure regions are identified by using a clustering technique. A maximin "space-filling" sampling technique is used to construct the metamodel. A vibration absorber example highlights the potential of the proposed method.
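The maximin "space-filling" sampling mentioned in this abstract is commonly implemented as a greedy farthest-point selection. A minimal sketch in the unit square (candidate count and dimensions are illustrative):

```python
import numpy as np

def maximin_sample(n_points, n_candidates=5_000, dim=2, seed=4):
    """Greedy maximin 'space-filling' design: starting from one candidate,
    repeatedly add the candidate that maximizes its minimum distance to the
    points already chosen."""
    rng = np.random.default_rng(seed)
    cand = rng.uniform(0.0, 1.0, size=(n_candidates, dim))
    chosen = [cand[0]]
    for _ in range(n_points - 1):
        # distance from every candidate to its nearest already-chosen point
        d = np.min(np.linalg.norm(cand[:, None, :] - np.array(chosen)[None, :, :],
                                  axis=-1), axis=1)
        chosen.append(cand[np.argmax(d)])
    return np.array(chosen)

pts = maximin_sample(20)
pairwise = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
min_dist = pairwise[~np.eye(len(pts), dtype=bool)].min()
print(f"minimum pairwise distance among 20 points: {min_dist:.3f}")
```

Spreading the training points this way avoids the clustering that plain random sampling produces, so a metamodel built on them covers all regions of the design space, including disjoint failure regions, more evenly.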