Search Results

Journal Article

Piston Design Using Multi-Objective Reliability-Based Design Optimization

2010-04-12
2010-01-0907
Piston design is a challenging engineering problem which involves complex physics and requires satisfying multiple performance objectives. Uncertainty in piston operating conditions and variability in piston design variables are inevitable and must be accounted for. The piston assembly can be a major source of engine mechanical friction and cold start noise if not designed properly. In this paper, an analytical piston model is used in a deterministic and probabilistic (reliability-based) multi-objective design optimization process to obtain an optimal piston design. The model predicts piston performance in terms of scuffing, friction and noise. In order to keep the computational cost low, efficient and accurate metamodels of the piston performance metrics are used. The Pareto set of all optimal solutions is calculated, allowing the designer to choose the “best” solution according to trade-offs among the multiple objectives.
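
As a rough illustration of the Pareto-set idea in the abstract above, the following Python sketch extracts the non-dominated designs from a set of candidates evaluated on two objectives to be minimized. The objective values are random placeholders, not outputs of the paper's piston model.

```python
import numpy as np

# Extract the Pareto set from candidate designs evaluated on two objectives to
# be minimized (for example, a friction metric and a noise metric). The
# objective values are random placeholders, not outputs of a piston model.

def pareto_mask(objs):
    """Boolean mask of non-dominated rows (all objectives minimized)."""
    n = objs.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        # dominated if some other point is <= in every objective and < in at least one
        dominated = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
        if dominated.any():
            mask[i] = False
    return mask

rng = np.random.default_rng(0)
candidates = rng.random((200, 2))                  # hypothetical (friction, noise) values
pareto_points = candidates[pareto_mask(candidates)]
print(f"{len(pareto_points)} non-dominated designs out of {len(candidates)}")
```

The surviving candidates, none of which is beaten in every objective by another candidate, form the front the designer would trade off over.
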
Journal Article

Efficient Re-Analysis Methodology for Probabilistic Vibration of Large-Scale Structures

2008-04-14
2008-01-0216
It is challenging to perform probabilistic analysis and design of large-scale structures because probabilistic analysis requires repeated finite element analyses of large models, and each analysis is expensive. This paper presents a methodology for probabilistic analysis and reliability-based design optimization of large-scale structures that consists of two re-analysis methods: one for estimating the deterministic vibratory response and another for estimating the probability of the response exceeding a certain level. The deterministic re-analysis method can efficiently analyze large-scale finite element models consisting of tens or hundreds of thousands of degrees of freedom and large numbers of design variables that vary over a wide range. The probabilistic re-analysis method calculates the system reliability very efficiently, for many probability distributions of the design variables, by performing a single Monte Carlo simulation.
Journal Article

An RBDO Method for Multiple Failure Region Problems using Probabilistic Reanalysis and Approximate Metamodels

2009-04-20
2009-01-0204
A Reliability-Based Design Optimization (RBDO) method for multiple failure regions is presented. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with an approximate global metamodel with local refinements. The latter serves as an indicator to determine the failure and safe regions. PRRA calculates the system reliability of a design very efficiently by performing a single Monte Carlo (MC) simulation. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. An “accurate-on-demand” metamodel is used in the PRRA that allows us to handle problems with multiple disjoint failure regions and potentially multiple most probable points (MPP). The multiple failure regions are identified using a clustering technique. A maximin “space-filling” sampling technique is used to construct the metamodel. A vibration absorber example highlights the potential of the proposed method.
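
The abstract mentions a maximin “space-filling” sampling technique for building the metamodel. A simple way to approximate such a design is a greedy farthest-point selection over a random candidate pool, sketched below in Python; this is an assumed stand-in, not necessarily the sampling scheme used in the paper.

```python
import numpy as np

# Greedy "maximin" space-filling sample: starting from a large random candidate
# pool, repeatedly add the candidate farthest from the points already chosen.
# A simple stand-in for the space-filling design mentioned in the abstract.

def maximin_sample(n_points, n_candidates=2000, dim=2, seed=0):
    rng = np.random.default_rng(seed)
    candidates = rng.random((n_candidates, dim))
    chosen = [candidates[0]]
    for _ in range(n_points - 1):
        dists = np.linalg.norm(candidates[:, None, :] - np.asarray(chosen)[None, :, :], axis=2)
        chosen.append(candidates[np.argmax(dists.min(axis=1))])   # farthest from the chosen set
    return np.asarray(chosen)

design = maximin_sample(30)
print(design.shape)   # (30, 2) points spread over the unit square
```
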
Journal Article

Design under Uncertainty using a Combination of Evidence Theory and a Bayesian Approach

2008-04-14
2008-01-0377
Early in the engineering design cycle, it is difficult to quantify product reliability due to insufficient data or information to model uncertainties. Probability theory therefore cannot be used. Design decisions are usually based on fuzzy information which is imprecise and incomplete. Various design methods, such as Possibility-Based Design Optimization (PBDO) and Evidence-Based Design Optimization (EBDO), have been developed to systematically treat design with non-probabilistic uncertainties. In practical engineering applications, information regarding the uncertain variables and parameters may exist in the form of sample points, and uncertainties with sufficient and insufficient information may exist simultaneously. Most of the existing optimal design methods under uncertainty cannot handle this form of incomplete information. They have to either discard some valuable information or postulate the existence of additional information.
Journal Article

Probabilistic Reanalysis Using Monte Carlo Simulation

2008-04-14
2008-01-0215
An approach for Probabilistic Reanalysis (PRA) of a system is presented. PRA calculates very efficiently the system reliability or the average value of an attribute of a design, for many probability distributions of the input variables, by performing a single Monte Carlo simulation. In addition, PRA calculates the sensitivity derivatives of the reliability with respect to the parameters of the probability distributions. The approach is useful for analysis problems where reliability bounds need to be calculated because the probability distribution of the input variables is uncertain, or for design problems where the design variables are random. The accuracy and efficiency of PRA is demonstrated on vibration analysis of a car and on system reliability-based design optimization (RBDO) of an internal combustion engine.
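
The core idea of PRA, re-using one Monte Carlo sample for many input distributions, can be sketched with likelihood-ratio re-weighting, as below. The limit-state function and the distributions are illustrative assumptions, not the car vibration or engine models from the paper.

```python
import numpy as np
from scipy import stats

# Sketch of the re-weighting idea behind probabilistic reanalysis (PRA): one
# Monte Carlo sample, drawn from a fixed "sampling" distribution, is re-used to
# estimate reliability under other input distributions by weighting each sample
# with the ratio of the two probability densities.
# The limit-state function g(x) < 0 below is a placeholder.

rng = np.random.default_rng(1)

def limit_state(x):
    return 6.0 - x[:, 0] ** 2 - x[:, 1]          # failure when g(x) < 0

sampling = stats.norm(loc=0.0, scale=1.5)        # distribution used for the single MC run
x = rng.normal(0.0, 1.5, size=(100_000, 2))
fail = limit_state(x) < 0.0

def reliability_for(mu, sigma):
    """Re-use the same samples for new N(mu, sigma^2) inputs via likelihood ratios."""
    target = stats.norm(loc=mu, scale=sigma)
    w = np.prod(target.pdf(x) / sampling.pdf(x), axis=1)
    return 1.0 - np.mean(w * fail)               # weighted failure fraction

for mu in (0.0, 0.5, 1.0):
    print(f"mean = {mu:.1f} -> estimated reliability = {reliability_for(mu, 1.0):.4f}")
```
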
Journal Article

Optimal and Robust Design of the PEM Fuel Cell Cathode Gas Diffusion Layer

2008-04-14
2008-01-1217
The cathode gas diffusion layer (GDL) is an important component of a polymer electrolyte membrane (PEM) fuel cell. Its design parameters, including thickness, porosity and permeability, significantly affect the reactant transport and water management, thus impacting the fuel cell performance. This paper presents an optimization study of the GDL design parameters with the objective of maximizing the current density under a given voltage. A two-dimensional, single-phase PEM fuel cell model is used. A multivariable optimization problem is formulated to maximize the current density at the cathode under a given electrode voltage with respect to the GDL parameters. In order to reduce the computational effort and find the global optimum among the potential multiple optima, a global metamodel of the actual CFD-based fuel cell simulation is adaptively generated using radial basis function approximations.
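
A minimal sketch of the surrogate-based optimization loop described above: fit a radial basis function metamodel to a handful of "simulation" runs and optimize it globally. The analytic placeholder response, the simple random sampling plan, and the use of scipy's RBFInterpolator and differential_evolution are assumptions for illustration, not the paper's CFD model or adaptive scheme.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

# Radial-basis-function metamodel standing in for an expensive simulation,
# then optimized globally over the design space.

def expensive_simulation(x):
    # hypothetical current-density response of two normalized GDL parameters
    return 1.0 - (x[:, 0] - 0.3) ** 2 - (x[:, 1] - 0.7) ** 2

rng = np.random.default_rng(2)
samples = rng.random((40, 2))                       # design-of-experiments points
responses = expensive_simulation(samples)

metamodel = RBFInterpolator(samples, responses)     # RBF surrogate of the simulation

# Maximize the surrogate (minimize its negative) over the unit design box
result = differential_evolution(lambda x: -metamodel(x[None, :])[0],
                                bounds=[(0.0, 1.0), (0.0, 1.0)], seed=2)
print("surrogate optimum near", np.round(result.x, 3))
```
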
Journal Article

A Variable-Size Local Domain Approach to Computer Model Validation in Design Optimization

2011-04-12
2011-01-0243
A common approach to the validation of simulation models focuses on validation throughout the entire design space. A more recent methodology validates designs as they are generated during a simulation-based optimization process. The latter method relies on validating the simulation model in a sequence of local domains. To improve its computational efficiency, this paper proposes an iterative process where the size and shape of the local domain at the current step are determined from a parametric bootstrap methodology involving maximum likelihood estimators of the unknown model parameters from the previous step. Validation is carried out in the local domain at each step. The iterative process continues until the local domain does not change from iteration to iteration during the optimization process, ensuring that a converged design optimum has been obtained.
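
The parametric-bootstrap step can be illustrated as follows: fit parameters by maximum likelihood, resample synthetic data sets from the fitted model, and examine the scatter of the re-estimated parameters, which could then size a local domain. The normal model and the synthetic "observations" below are assumptions for illustration only, not the paper's simulation model or test data.

```python
import numpy as np

# Parametric bootstrap sketch: MLE fit, resample from the fitted model,
# re-estimate the parameters, and summarize their scatter.

rng = np.random.default_rng(7)
data = rng.normal(10.0, 2.0, size=25)                 # observed responses

mu_hat, sigma_hat = data.mean(), data.std(ddof=0)     # maximum likelihood estimates

boot = np.array([
    (sample.mean(), sample.std(ddof=0))
    for sample in rng.normal(mu_hat, sigma_hat, size=(2000, data.size))
])
mu_lo, mu_hi = np.percentile(boot[:, 0], [2.5, 97.5])
print(f"MLE of the mean: {mu_hat:.2f}; bootstrap 95% range: [{mu_lo:.2f}, {mu_hi:.2f}]")
```
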
Journal Article

A Simulation and Optimization Methodology for Reliability of Vehicle Fleets

2011-04-12
2011-01-0725
Understanding reliability is critical in the design, maintenance, and durability analysis of engineering systems. This paper presents a reliability simulation methodology for vehicle fleets using limited data. The method can be used to estimate the reliability of non-repairable as well as repairable systems. It can optimally allocate individual component reliabilities, based on a target system reliability, using a multi-objective optimization algorithm. The algorithm establishes a Pareto front that can be used for an optimal trade-off between reliability and the associated cost. The method uses Monte Carlo simulation to estimate the system failure rate and reliability as a function of time. The probability density functions (PDF) of the time between failures for all components of the system are estimated using either limited data or a user-supplied MTBF (mean time between failures) and its coefficient of variation.
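
A minimal Monte Carlo sketch of the reliability simulation idea: sample component times to first failure from assumed Weibull distributions and take the earliest one as the series-system failure time. The component parameters are illustrative, not fleet data from the paper.

```python
import numpy as np

# Monte Carlo sketch of system reliability over time for a series arrangement
# of components with Weibull-distributed times to first failure.

rng = np.random.default_rng(3)
n_sim = 50_000

# hypothetical (shape, scale in hours) Weibull parameters for three components
components = [(1.5, 4000.0), (2.0, 6000.0), (1.2, 9000.0)]

ttf = np.column_stack([scale * rng.weibull(shape, n_sim) for shape, scale in components])
system_ttf = ttf.min(axis=1)        # series system fails at the first component failure

for t in (1000.0, 2000.0, 5000.0):
    print(f"R({t:.0f} h) is approximately {np.mean(system_ttf > t):.3f}")
```
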
Technical Paper

Optimal Engine Torque Management for Reducing Driveline Clunk Using Time-Dependent Metamodels

2007-05-15
2007-01-2236
Quality and performance are two important customer requirements in vehicle design. Driveline clunk negatively affects the perceived quality and must therefore be minimized. This is usually achieved using engine torque management, which is part of engine calibration. During a tip-in event, the engine torque rate of rise is limited until all the driveline lash is taken up. However, the engine torque rise and its rate can negatively affect the vehicle throttle response. Therefore, the engine torque management must be balanced against throttle response. In practice, the engine torque rate of rise is calibrated manually. This paper describes a methodology for calibrating the engine torque in order to minimize the clunk disturbance while still meeting throttle response constraints. A set of predetermined engine torque profiles is calibrated in a vehicle, and the transmission turbine speed is measured for each profile. The latter is used to quantify the clunk disturbance.
Technical Paper

An Efficient Re-Analysis Methodology for Vibration of Large-Scale Structures

2007-05-15
2007-01-2326
Finite element analysis is a well-established methodology in structural dynamics. However, optimization and/or probabilistic studies can be prohibitively expensive because they require repeated FE analyses of large models. Various re-analysis methods have been proposed in order to efficiently calculate the dynamic response of a structure after a baseline design has been modified, without performing a full analysis of the modified model. The parametric reduced-order modeling (PROM) and the combined approximation (CA) methods are two re-analysis methods which can handle large model parameter changes in a relatively efficient manner. Although both methods are promising by themselves, they cannot handle large FE models with large numbers of DOF (e.g. 100,000) and a large number of design parameters (e.g. 50), which are common in practice. In this paper, the advantages and disadvantages of the PROM and CA methods are first discussed in detail.
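
For readers unfamiliar with the combined approximation (CA) idea, the sketch below shows its static flavor: the baseline solution seeds a small basis built from the design modification, and a reduced system is solved for the modified design. This is a generic, textbook-style illustration with small random matrices, not the dynamic, large-scale formulation discussed in the paper; a real implementation would also reuse the factorization of the baseline matrix instead of re-solving.

```python
import numpy as np

# Static sketch of the combined approximation (CA) re-analysis idea: build a
# small basis from the baseline solution and the design modification, then
# solve a reduced (Galerkin) system for the modified design.

rng = np.random.default_rng(8)
n, n_basis = 200, 6

A = rng.standard_normal((n, n))
K0 = A @ A.T + n * np.eye(n)                         # baseline "stiffness" matrix
dK = np.diag(0.3 * rng.random(n) * np.diag(K0))      # hypothetical design modification
f = rng.standard_normal(n)

# CA basis: r1 = K0^{-1} f, r_{i+1} = -K0^{-1} dK r_i
basis = [np.linalg.solve(K0, f)]
for _ in range(n_basis - 1):
    basis.append(-np.linalg.solve(K0, dK @ basis[-1]))
R = np.column_stack(basis)

# Reduced solve for the modified system (K0 + dK) u = f
K = K0 + dK
u_ca = R @ np.linalg.solve(R.T @ K @ R, R.T @ f)

u_exact = np.linalg.solve(K, f)
rel_err = np.linalg.norm(u_ca - u_exact) / np.linalg.norm(u_exact)
print(f"relative error of the CA approximation: {rel_err:.2e}")
```
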
Technical Paper

A Time-Dependent Reliability Analysis Method using a Niching Genetic Algorithm

2007-04-16
2007-01-0548
A reliability analysis method is presented for time-dependent systems under uncertainty. A level-crossing problem is considered where the system fails if its maximum response exceeds a specified threshold. The proposed method uses a double-loop optimization algorithm. The inner loop calculates the maximum response in time for a given set of random variables, and transforms a time-dependent problem into a time-independent one. A time integration method is used to calculate the response at discrete times. For each sample function of the response random process, the maximum response is found using a global-local search method consisting of a genetic algorithm (GA) and a gradient-based optimizer. This dynamic response usually exhibits multiple peaks and crosses the allowable response level to form a set of complex limit states, which lead to multiple most probable points (MPPs).
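
The double-loop structure can be sketched as follows: the outer loop samples the random variables, and for each sample the maximum response over time is compared with the allowable threshold. The damped oscillatory response and the dense time sweep below are illustrative stand-ins for the paper's dynamic model and GA/gradient search.

```python
import numpy as np

# Double-loop sketch for a level-crossing problem: outer Monte Carlo loop over
# random variables; inner step finds the maximum response over the time window.

rng = np.random.default_rng(4)
t = np.linspace(0.0, 10.0, 2001)
threshold = 2.6

def response(t, a, omega, phi):
    # hypothetical damped oscillatory response
    return a * np.exp(-0.1 * t) * np.sin(omega * t + phi) + 0.3 * np.sin(3.0 * omega * t)

n_outer = 10_000
a = rng.normal(2.0, 0.3, n_outer)              # random amplitude
omega = rng.normal(1.5, 0.1, n_outer)          # random frequency
phi = rng.uniform(0.0, 2.0 * np.pi, n_outer)   # random phase

max_resp = np.array([response(t, a[i], omega[i], phi[i]).max() for i in range(n_outer)])
print(f"estimated probability of exceeding the threshold: {np.mean(max_resp > threshold):.4f}")
```
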
Technical Paper

System Reliability-Based Design using a Single-Loop Method

2007-04-16
2007-01-0555
An efficient approach for series system reliability-based design optimization (RBDO) is presented. The key idea is to apportion the system reliability optimally among the failure modes by considering the target values of the failure probabilities of the modes as design variables. Critical failure modes that contribute the most to the overall system reliability are identified. This paper proposes a computationally efficient system RBDO approach using a single-loop method, where the searches for the optimum design and for the most probable failure points proceed simultaneously. Specifically, at each iteration the optimizer uses approximated most probable failure points from the previous iteration to search for the optimum. A second-order Ditlevsen upper bound is used for the joint failure probability of the failure modes. Also, an easy-to-implement active set strategy is employed to improve algorithmic stability.
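
The second-order Ditlevsen upper bound mentioned above combines the individual mode probabilities with the largest pairwise joint probabilities. A small sketch, with purely illustrative numbers and the modes assumed sorted by decreasing probability:

```python
import numpy as np

# Second-order Ditlevsen upper bound on the failure probability of a series
# system, from individual mode probabilities p[i] and pairwise joint failure
# probabilities pij[i, j]. All numbers are illustrative.

def ditlevsen_upper_bound(p, pij):
    bound = p[0]
    for i in range(1, len(p)):
        bound += p[i] - np.max(pij[i, :i])   # subtract the largest pairwise intersection
    return bound

p = np.array([1.0e-3, 5.0e-4, 2.0e-4])
pij = np.array([[0.0,    1.0e-4, 2.0e-5],
                [1.0e-4, 0.0,    1.0e-5],
                [2.0e-5, 1.0e-5, 0.0]])
print(f"Ditlevsen upper bound on system failure probability: {ditlevsen_upper_bound(p, pij):.2e}")
```
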
Technical Paper

An Efficient Possibility-Based Design Optimization Method for a Combination of Interval and Random Variables

2007-04-16
2007-01-0553
Reliability-based design optimization accounts for variation. However, it assumes that statistical information is available in the form of fully defined probability distributions. This is not true for a variety of engineering problems where uncertainty is usually given in terms of interval ranges. In this case, interval analysis or possibility theory can be used instead of probability theory. This paper shows how possibility theory can be used in design and presents a computationally efficient sequential optimization algorithm. The algorithm handles problems with only uncertain design variables and parameters, or with a combination of random and uncertain ones. It consists of a sequence of cycles composed of a deterministic design optimization followed by a set of worst-case reliability evaluation loops. A crank-slider mechanism example demonstrates the accuracy and efficiency of the proposed sequential algorithm.
Technical Paper

Development of Intelligent Navigation Systems for Chinese Users

2008-04-14
2008-01-0198
Navigation systems have recently been introduced into vehicles in China, which is expected to become one of the world's largest in-vehicle system markets in the next few years. However, existing Chinese navigation systems either simply translate the system language from other languages to Chinese or do not have intelligent functions that consider the characteristics of Chinese users, their language, or the geographic features of mainland China. To address this problem, this study first reviewed the characteristics of the Chinese language (textual part), including its visual format (word orientation: horizontal vs. vertical), character simplification, and auditory format (e.g., dialects), which differ from Western languages.
Technical Paper

Imprecise Reliability Assessment When the Type of the Probability Distribution of the Random Variables is Unknown

2009-04-20
2009-01-0199
In reliability design, data for constructing probabilistic models are often scarce. It is particularly challenging to model uncertainty in variables when the type of their probability distribution is unknown. Moreover, it is expensive to estimate the upper and lower bounds of the reliability of a system involving such variables. A method for modeling uncertainty by using Polynomial Chaos Expansion is presented. The method requires specifying bounds for statistical summaries such as the first four moments and credible intervals. A constrained optimization problem, in which the decision variables are the coefficients of the Polynomial Chaos Expansion approximation, is formulated and solved in order to estimate the minimum and maximum values of a system’s reliability. This problem is solved efficiently by employing a probabilistic re-analysis approach to approximate the system reliability as a function of the moments of the random variables.
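
One reason moment constraints pair well with a Polynomial Chaos Expansion is that the moments of the expansion follow directly from its coefficients. A one-dimensional sketch in a standard normal variable is shown below; the coefficients are arbitrary examples, not fitted to any data.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermeval

# One-dimensional PCE u = sum_k c_k He_k(xi) in a standard normal variable xi
# (probabilists' Hermite basis): mean = c0, variance = sum_{k>=1} c_k^2 * k!.

coeffs = np.array([1.0, 0.5, 0.2, 0.05])     # c0 .. c3

mean_pce = coeffs[0]
var_pce = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k >= 1)

# Monte Carlo check of the same expansion
rng = np.random.default_rng(5)
xi = rng.standard_normal(1_000_000)
u = hermeval(xi, coeffs)
print(f"mean:     analytical {mean_pce:.4f}  vs  MC {u.mean():.4f}")
print(f"variance: analytical {var_pce:.4f}  vs  MC {u.var():.4f}")
```
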
Technical Paper

Improving Robust Design with Preference Aggregation Methods

2004-03-08
2004-01-1140
Robust design is a methodology for improving the quality of a product or process by minimizing the effect of variations in the inputs without eliminating the causes of those variations. In robust design, the putative best design is obtained by solving a multi-criteria optimization problem, trading off the nominal performance against the minimization of the variation of the performance measure. Because some existing methods combine the two criteria using a weighted sum or another fixed aggregation strategy, approaches that are known to miss Pareto points, they may fail to obtain a desired design. To overcome this inadequacy, a more comprehensive preference aggregation method is incorporated here into robust design. Three examples -- one simple mathematical example, one multi-criteria structure design example, and one automotive example -- are presented to illustrate the effectiveness of the proposed method.
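
The claim that a fixed weighted sum can miss Pareto points is easy to demonstrate on a non-convex trade-off curve: every weight selects one of the two extreme designs, so the intermediate Pareto-optimal points are never returned. The curve below is a toy example, not a robust-design problem.

```python
import numpy as np

# Weighted sum applied to a concave trade-off curve between two objectives to
# be minimized: the minimum always sits at one of the two end points.

theta = np.linspace(0.0, np.pi / 2.0, 201)
f1 = np.cos(theta)     # objective 1 (minimize)
f2 = np.sin(theta)     # objective 2 (minimize); together they form a concave front

for w in (0.1, 0.3, 0.5, 0.7, 0.9):
    idx = np.argmin(w * f1 + (1.0 - w) * f2)
    print(f"weight {w:.1f} selects point {idx} of {len(theta) - 1}")
# Only index 0 or 200 is ever selected; the interior of the front is missed.
```
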
Technical Paper

Hydraulic Pressure Control and Parameter Optimization of Integrated Electro-Hydraulic Brake System

2017-09-17
2017-01-2516
A general principle scheme of an IEHB (Integrated Electro-Hydraulic Brake system) is proposed, and the working principle of the system is briefly introduced in this paper. Considering the structural characteristics of the hydraulic control unit of the system, a time-sharing control strategy is adopted to achieve independent and precise hydraulic pressure regulation of each wheel brake cylinder under the various braking conditions of a vehicle. Because of the strongly nonlinear and time-varying characteristics of the dynamic brake pressure regulation processes of the IEHB, its overall braking performance is mainly affected by temperature, humidity, load change, the structure and control parameters of the IEHB, and so on.
Technical Paper

Reliability Based Design Optimization of Dynamic Vehicle Performance Using Bond Graphs and Time Dependent Metamodels

2006-04-03
2006-01-0109
A vehicle drivetrain is designed to meet specific vehicle performance criteria which usually involve trade-offs among conflicting performance measures. This paper describes a methodology to optimize the drivetrain design, including the axle ratio, transmission shift points, and transmission shift ratios, considering uncertainty. A complete vehicle dynamic model is developed using the bond graph method. The model includes the vehicle, engine, transmission, torque converter, driveline, and transmission controller. An equivalent MATLAB Simulink model performs the nonlinear dynamic analysis. In order to reduce the computational effort, a time-dependent metamodel is developed based on principal component analysis using singular value decomposition. The optimization is performed using both the Simulink vehicle dynamic model and the metamodel. A deterministic optimization first determines the optimum design in terms of fuel economy, without considering variations or uncertainties.
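
The principal-component metamodel idea can be sketched by compressing a family of time histories with a singular value decomposition: a few temporal basis vectors represent every run, and each run reduces to a handful of coefficients that a metamodel could predict from the design variables. The time histories below are synthetic, not drivetrain simulations from the paper.

```python
import numpy as np

# Compress a set of time-history responses with an SVD (principal components):
# keep a few temporal basis vectors and the per-run coefficients.

rng = np.random.default_rng(6)
t = np.linspace(0.0, 5.0, 500)

# 30 synthetic time histories driven by two hypothetical design parameters
params = rng.uniform(0.5, 2.0, size=(30, 2))
snapshots = np.array([p[0] * np.exp(-p[1] * t) * np.sin(4.0 * t) for p in params])

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
k = 3                            # number of principal components kept
basis = Vt[:k]                   # (k, n_time) temporal basis vectors
coeffs = snapshots @ basis.T     # (n_runs, k) coefficients per run

reconstruction = coeffs @ basis
err = np.linalg.norm(reconstruction - snapshots) / np.linalg.norm(snapshots)
print(f"relative reconstruction error with {k} components: {err:.2e}")
```
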
Technical Paper

Simulation of Tire-Snow Interfacial Forces for a Range of Snow Densities with Uncertainty

2006-04-03
2006-01-0497
The objective of this paper is to assess the effect of snow density on tire-snow interaction in the presence of uncertainty. The snow-depth dependent finite element analysis (FEA) and semi-analytical models we have developed recently can predict tire-snow interfacial forces at a given density under combined slip conditions. One drawback of the models is that they are only applicable for fresh, low-density snow due to the unavailability of a density-dependent snow model. In reality, the snow density on the ground can vary between that of fresh snow to heavily compacted snow that is similar to ice. Even for fresh snow on the ground, as a vehicle moves forward, the rear wheels experience higher snow densities than the front wheels. In addition, being a natural material, snow's physical properties vary significantly even for the same density.
Technical Paper

Design Optimization and Reliability Estimation with Incomplete Uncertainty Information

2006-04-03
2006-01-0962
Existing methods for design optimization under uncertainty assume that a high level of information is available, typically in the form of data. In reality, however, insufficient data prevents correct inference of probability distributions, membership functions, or interval ranges. In this article we use an engine design example to show that optimal design decisions and reliability estimations depend strongly on uncertainty characterization. We contrast the reliability-based optimal designs to the ones obtained using worst-case optimization, and ask the question of how to obtain non-conservative designs with incomplete uncertainty information. We propose an answer to this question through the use of Bayesian statistics. We estimate the truck's engine reliability based only on available samples, and demonstrate that the accuracy of our estimates increases as more samples become available.
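
The Bayesian idea in the last sentence can be sketched with a conjugate Beta-Binomial model: the posterior on the reliability tightens as more pass/fail samples become available. The sample counts are illustrative, not the engine data from the paper.

```python
from scipy import stats

# Conjugate Beta-Binomial sketch of a Bayesian reliability estimate from
# limited pass/fail data: the posterior narrows as samples accumulate.

prior_a, prior_b = 1.0, 1.0                       # uniform Beta(1, 1) prior

for n_tests, n_failures in [(10, 1), (50, 3), (200, 9)]:
    post = stats.beta(prior_a + n_tests - n_failures, prior_b + n_failures)
    lo, hi = post.ppf([0.05, 0.95])
    print(f"n = {n_tests:3d}: posterior mean {post.mean():.3f}, "
          f"90% credible interval [{lo:.3f}, {hi:.3f}]")
```
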