Search Results

Journal Article

A Methodology for Design Decisions using Block Diagrams

2013-04-08
2013-01-0947
Our recent work has shown that the representation of a system as a reliability block diagram can serve as a decision-making tool; we call these block diagrams decision topologies. In this paper, we generalize the results and show that decision topologies can be used to make many engineering decisions and can, in fact, replace decision analysis for most decisions. We also provide a meta-proof that the proposed method using decision topologies is entirely consistent with decision analysis in the limit. The main advantages of the method are that (1) it provides a visual representation of a decision situation, (2) it can easily model tradeoffs, (3) it can incorporate binary attributes, (4) it can model preferences with limited information, and (5) it can be used in a low-fidelity sense to make a decision quickly.
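
As a minimal illustration of evaluating such a block-diagram representation, the sketch below scores two hypothetical design alternatives by combining series and parallel blocks. The attributes, scores, and topology are assumptions made for the example, not the paper's decision-topology method itself.

```python
# Minimal sketch: evaluating series-parallel block diagrams to compare two
# hypothetical design alternatives. Component "reliabilities" here stand in
# for normalized attribute scores; the structure and numbers are illustrative.

def series(*r):
    """Series blocks: all must succeed."""
    p = 1.0
    for ri in r:
        p *= ri
    return p

def parallel(*r):
    """Parallel blocks: at least one must succeed."""
    q = 1.0
    for ri in r:
        q *= (1.0 - ri)
    return 1.0 - q

# Alternative A: strong cost score with redundant performance paths
score_A = series(0.90, parallel(0.70, 0.80))
# Alternative B: balanced scores in series
score_B = series(0.85, 0.88)

print(f"A: {score_A:.3f}  B: {score_B:.3f}")
print("Choose", "A" if score_A > score_B else "B")
```
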
Technical Paper

A Methodology of Design for Fatigue Using an Accelerated Life Testing Approach with Saddlepoint Approximation

2019-04-02
2019-01-0159
We present an Accelerated Life Testing (ALT) methodology along with a design for fatigue approach, using Gaussian or non-Gaussian excitations. The accuracy of fatigue life prediction at nominal loading conditions is affected by model and material uncertainty. This uncertainty is reduced by performing tests at a higher loading level, resulting in a reduction in test duration. Based on the data obtained from experiments, we formulate an optimization problem to calculate the Maximum Likelihood Estimator (MLE) values of the uncertain model parameters. In our proposed ALT method, we lift all the assumptions on the type of life distribution or the stress-life relationship and we use Saddlepoint Approximation (SPA) method to calculate the fatigue life Probability Density Functions (PDFs).
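
A minimal sketch of the MLE step is shown below. Because the paper deliberately lifts assumptions on the life distribution and the stress-life relationship, the lognormal life model, inverse-power stress-life form, and test data used here are purely illustrative.

```python
# Illustrative sketch only of the MLE step: a lognormal life model with an
# inverse-power stress-life relationship is assumed for the example, as are
# the hypothetical accelerated-test data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# hypothetical accelerated-test data: (stress level, cycles to failure)
stress = np.array([200.0, 200.0, 250.0, 250.0, 300.0, 300.0])
life = np.array([1.2e6, 9.5e5, 4.0e5, 3.2e5, 1.1e5, 1.4e5])

def neg_log_likelihood(theta):
    a, b, log_sigma = theta                  # mean log-life: a - b*log(stress)
    mu = a - b * np.log(stress)
    return -np.sum(norm.logpdf(np.log(life), loc=mu, scale=np.exp(log_sigma)))

res = minimize(neg_log_likelihood, x0=[25.0, 2.0, -1.0], method="Nelder-Mead")
a, b, log_sigma = res.x
print("MLE parameters:", a, b, np.exp(log_sigma))
# extrapolated median life at an assumed nominal stress level of 150
print("median life @ 150:", np.exp(a - b * np.log(150.0)))
```
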
Journal Article

A New Metamodeling Approach for Time-Dependent Reliability of Dynamic Systems with Random Parameters Excited by Input Random Processes

2014-04-01
2014-01-0717
We propose a new metamodeling method to characterize the output (response) random process of a dynamic system with random parameters, excited by input random processes. The metamodel can then be used to efficiently estimate the time-dependent reliability of a dynamic system using analytical or simulation-based methods. The metamodel is constructed by decomposing the input random processes using principal components or wavelets and then using a few simulations to estimate the distributions of the decomposition coefficients. A similar decomposition is also performed on the output random process. A kriging model is then established between the input and output decomposition coefficients and subsequently used to quantify the output random process corresponding to a realization of the input random parameters and random processes. What distinguishes our approach from others in metamodeling is that the system input is not deterministic but random.
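
The sketch below illustrates the general idea under simplifying assumptions: realizations of the input and output processes are decomposed with principal components, and Gaussian-process (Kriging) models map the input coefficients to the output coefficients. The toy "simulator", dimensions, and use of scikit-learn are assumptions, not the paper's implementation.

```python
# Rough sketch: PCA decomposition of input/output process realizations plus a
# Kriging (Gaussian-process) map between the decomposition coefficients.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
n_train, n_t = 40, 200
t = np.linspace(0.0, 10.0, n_t)

# toy input random-process realizations and a toy "dynamic system" response
X = rng.normal(size=(n_train, n_t)).cumsum(axis=1) * 0.1
Y = np.array([np.convolve(x, np.exp(-t), mode="same") for x in X])

pca_in, pca_out = PCA(n_components=5), PCA(n_components=5)
A = pca_in.fit_transform(X)       # input decomposition coefficients
B = pca_out.fit_transform(Y)      # output decomposition coefficients

# one Kriging model per output coefficient
gps = [GaussianProcessRegressor().fit(A, B[:, j]) for j in range(B.shape[1])]

# predict the output trajectory for a new input realization
x_new = rng.normal(size=(1, n_t)).cumsum(axis=1) * 0.1
a_new = pca_in.transform(x_new)
b_new = np.column_stack([gp.predict(a_new) for gp in gps])
y_new = pca_out.inverse_transform(b_new)
print(y_new.shape)   # (1, n_t): predicted response trajectory
```
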
Journal Article

A Re-Analysis Methodology for System RBDO Using a Trust Region Approach with Local Metamodels

2010-04-12
2010-01-0645
A simulation-based, system reliability-based design optimization (RBDO) method is presented that can handle problems with multiple failure regions and correlated random variables. Copulas are used to represent the correlation. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with a trust-region optimization approach and local metamodels covering each trust region. PRRA calculates very efficiently the system reliability of a design by performing a single Monte Carlo (MC) simulation per trust region. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. The PRRA method is based on importance sampling. It provides accurate results if the support of the sampling PDF contains the support of the joint PDF of the input random variables. The sequential, trust-region optimization approach satisfies this requirement.
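
A minimal sketch of the re-analysis idea is shown below: one set of Monte Carlo samples, drawn from a fixed sampling PDF, is reweighted to estimate the failure probability of different designs without re-evaluating the limit state. The limit state, the normal distributions, and treating the design variable as the input mean are illustrative assumptions.

```python
# Sketch of probabilistic re-analysis by importance-sampling reweighting.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
N = 100_000
x = rng.normal(0.0, 1.5, N)          # samples from the sampling PDF h(x) = N(0, 1.5)
fail = (3.0 - x) <= 0.0              # limit state evaluated once

def p_fail(mu_design, sigma_design=1.0):
    """Reweight the same samples for a design whose input has mean mu_design."""
    w = norm.pdf(x, mu_design, sigma_design) / norm.pdf(x, 0.0, 1.5)
    return float(np.mean(w * fail))

for mu in (0.0, 0.5, 1.0):
    print(f"design mean {mu}: P(failure) ≈ {p_fail(mu):.4e}")
```
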
Journal Article

A Simulation and Optimization Methodology for Reliability of Vehicle Fleets

2011-04-12
2011-01-0725
Understanding reliability is critical in design, maintenance and durability analysis of engineering systems. A reliability simulation methodology is presented in this paper for vehicle fleets using limited data. The method can be used to estimate the reliability of non-repairable as well as repairable systems. It can optimally allocate, based on a target system reliability, individual component reliabilities using a multi-objective optimization algorithm. The algorithm establishes a Pareto front that can be used for optimal tradeoff between reliability and the associated cost. The method uses Monte Carlo simulation to estimate the system failure rate and reliability as a function of time. The probability density functions (PDF) of the time between failures for all components of the system are estimated using either limited data or a user-supplied MTBF (mean time between failures) and its coefficient of variation.
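
A minimal sketch of the simulation idea for a non-repairable series system is given below; the Weibull time-to-failure parameters are placeholders standing in for values derived from limited data or a user-supplied MTBF and coefficient of variation.

```python
# Minimal sketch: Monte Carlo estimate of system reliability versus time for a
# series system with assumed Weibull component times to failure.
import numpy as np

rng = np.random.default_rng(2)
n_sim = 50_000
t_grid = np.linspace(0.0, 5000.0, 100)

# (shape k, scale lambda) for three components -- illustrative values
components = [(1.5, 4000.0), (2.0, 6000.0), (1.2, 8000.0)]

ttf = np.column_stack([lam * rng.weibull(k, n_sim) for k, lam in components])
system_ttf = ttf.min(axis=1)      # series system fails at the first component failure

reliability = [(system_ttf > t).mean() for t in t_grid]
print("R(2000 h) ≈", (system_ttf > 2000.0).mean())
```
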
Journal Article

A Subdomain Approach for Uncertainty Quantification of Long Time Horizon Random Processes

2023-04-11
2023-01-0083
This paper addresses the uncertainty quantification of time-dependent problems excited by random processes represented by a Karhunen-Loeve (KL) expansion. The latter expresses a random process as a series of terms involving the dominant eigenvalues and eigenfunctions of the process covariance matrix, weighted by samples of uncorrelated standard normal random variables. For many engineering applications, such as random vibrations, durability, or fatigue, a long time horizon is required for meaningful results. In this case, however, a large number of KL terms is needed, resulting in a very high computational effort for uncertainty propagation. This paper presents a new approach to generate time trajectories (sample functions) of a random process using the KL expansion when the time horizon (duration) is much larger than the process correlation length.
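
For reference, a minimal sketch of a KL-based trajectory generator is shown below, using an assumed exponential covariance and unit correlation length; it is not the paper's subdomain approach, but it shows the baseline expansion whose term count grows with the time horizon.

```python
# Sketch of a Karhunen-Loeve expansion: trajectories of a stationary Gaussian
# process generated from the eigen-decomposition of its covariance matrix.
# The exponential covariance and correlation length are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_t = 500
t = np.linspace(0.0, 10.0, n_t)
corr_len = 1.0

# covariance matrix of the process (unit variance, exponential kernel)
C = np.exp(-np.abs(t[:, None] - t[None, :]) / corr_len)

eigvals, eigvecs = np.linalg.eigh(C)
idx = np.argsort(eigvals)[::-1]              # dominant terms first
eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]

n_kl = np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95) + 1
z = rng.standard_normal(n_kl)                # uncorrelated standard normals
trajectory = eigvecs[:, :n_kl] @ (np.sqrt(eigvals[:n_kl]) * z)
print(f"{n_kl} KL terms capture 95% of the variance over this horizon")
```
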
Technical Paper

An Optimization Study of Manufacturing Variation Effects on Diesel Injector Design with Emphasis on Emissions

2004-03-08
2004-01-1560
This paper investigates the effects of manufacturing variations in fuel injectors on the engine performance with emphasis on emissions. The variations are taken into consideration within a Reliability-Based Design Optimization (RBDO) framework. A reduced version of Multi-Zone Diesel engine Simulation (MZDS), MZDS-lite, is used to enable the optimization study. The numerical noise of MZDS-lite prohibits the use of gradient-based optimization methods. Therefore, surrogate models are developed to filter out the noise and to reduce computational cost. Three multi-objective optimization problems are formulated, solved and compared: deterministic optimization using MZDS-lite, deterministic optimization using surrogate models and RBDO using surrogate models. The obtained results confirm that manufacturing variation effects must be taken into account in the early product development stages.
Technical Paper

Design Optimization and Reliability Estimation with Incomplete Uncertainty Information

2006-04-03
2006-01-0962
Existing methods for design optimization under uncertainty assume that a high level of information is available, typically in the form of data. In reality, however, insufficient data prevents correct inference of probability distributions, membership functions, or interval ranges. In this article we use an engine design example to show that optimal design decisions and reliability estimations depend strongly on uncertainty characterization. We contrast the reliability-based optimal designs to the ones obtained using worst-case optimization, and ask the question of how to obtain non-conservative designs with incomplete uncertainty information. We propose an answer to this question through the use of Bayesian statistics. We estimate the truck's engine reliability based only on available samples, and demonstrate that the accuracy of our estimates increases as more samples become available.
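
As a minimal sketch of the Bayesian idea with incomplete information, the example below updates a Beta posterior on reliability from pass/fail samples and shows the estimate tightening as more samples arrive; the uniform prior and sample counts are assumptions, not the engine data.

```python
# Minimal sketch: Bayesian reliability estimation from limited pass/fail data
# with a conjugate Beta prior. Prior and batch sizes are illustrative.
from scipy.stats import beta

prior_a, prior_b = 1.0, 1.0          # uniform prior on reliability
successes, failures = 0, 0

for batch_pass, batch_fail in [(8, 1), (18, 1), (47, 2)]:
    successes += batch_pass
    failures += batch_fail
    post = beta(prior_a + successes, prior_b + failures)
    lo, hi = post.interval(0.90)
    print(f"n={successes + failures:3d}  mean R={post.mean():.3f}  "
          f"90% interval=({lo:.3f}, {hi:.3f})")
```
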
Technical Paper

Design Under Uncertainty and Assessment of Performance Reliability of a Dual-Use Medium Truck with Hydraulic-Hybrid Powertrain and Fuel Cell Auxiliary Power Unit

2005-04-11
2005-01-1396
Medium trucks constitute a large market segment of the commercial transportation sector, and are also used widely for military tactical operations. Recent technological advances in hybrid powertrains and fuel cell auxiliary power units have enabled design alternatives that can improve fuel economy and reduce emissions dramatically. However, deterministic design optimization of these configurations may yield designs that are optimal with respect to performance but raise concerns regarding the reliability of achieving that performance over lifetime. In this article we identify and quantify uncertainties due to modeling approximations or incomplete information. We then model their propagation using Monte Carlo simulation and perform sensitivity analysis to isolate statistically significant uncertainties. Finally, we formulate and solve a series of reliability-based optimization problems and quantify tradeoffs between optimality and reliability.
Journal Article

Flexible Design and Operation of a Smart Charging Microgrid

2014-04-01
2014-01-0716
The reliability theory of repairable systems is vastly different from that of non-repairable systems. The authors have recently proposed a ‘decision-based’ framework to design and maintain repairable systems for optimal performance and reliability using a set of metrics such as minimum failure free period, number of failures in planning horizon (lifecycle), and cost. The optimal solution includes the initial design, the system maintenance throughout the planning horizon, and the protocol to operate the system. In this work, we extend this idea by incorporating flexibility and demonstrate our approach using a smart charging electric microgrid architecture. The flexibility is realized by allowing the architecture to change with time. Our approach “learns” the working characteristics of the microgrid. We use actual load and supply data over a short time to quantify the load and supply random processes and also establish the correlation between them.
Technical Paper

Managing the Computational Cost in a Monte Carlo Simulation by Considering the Value of Information

2012-04-16
2012-01-0915
Monte Carlo simulation is a popular tool for reliability assessment because of its robustness and ease of implementation. A major concern with this method is its computational cost; standard Monte Carlo simulation requires quadrupling the number of replications for halving the standard deviation of the estimated failure probability. Efforts to increase efficiency focus on intelligent sampling procedures and methods for efficient calculation of the performance function of a system. This paper proposes a new method to manage cost that views design as a decision among alternatives with uncertain reliabilities. Information from a simulation has value only if it enables the designer to make a better choice among the alternative options. Consequently, the value of information from the simulation is equal to the gain from using this information to improve the decision. A designer can determine the number of replications that are worth performing by using the method.
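
The cost scaling quoted above follows from the standard error of the estimated failure probability, sqrt(p(1-p)/N); the short check below uses an assumed failure probability purely for illustration.

```python
# Quick check of the quoted scaling: halving the standard deviation of the
# Monte Carlo failure-probability estimate requires four times as many
# replications. The "true" failure probability is an assumed value.
import math

p = 1e-3                              # assumed true failure probability
for N in (10_000, 40_000, 160_000):
    se = math.sqrt(p * (1.0 - p) / N)
    print(f"N={N:>7}  std dev of estimate ≈ {se:.2e}")
```
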
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation if a suitable sampling distribution is used, in many design problems it is too expensive. The authors have previously proposed a method to manage the computational cost in standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends their method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of the choice of the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
Journal Article

Mean-Value Second-Order Saddlepoint Approximation for Reliability Analysis

2017-03-28
2017-01-0207
A new second-order Saddlepoint Approximation (SA) method for structural reliability analysis is introduced. The Mean-value Second-order Saddlepoint Approximation (MVSOSA) is presented as an extension to the Mean-value First-order Saddlepoint Approximation (MVFOSA). The proposed method is based on a second-order Taylor expansion of the limit state function around the mean value of the input random variables. It requires not only the first but also the second-order sensitivity derivatives of the limit state function. If sensitivity analysis must be avoided because of computational cost, a quadrature integration approach, based on sparse grids, is also presented and linked to the saddlepoint approximation (SGSA - Sparse Grid Saddlepoint Approximation). The SGSA method is compared with the first and second-order SA methods in terms of accuracy and efficiency. The proposed MVSOSA and SGSA methods are used in the reliability analysis of two examples.
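
The sketch below shows only the expansion step behind a mean-value second-order method: the limit state is replaced by a second-order Taylor expansion about the input means. The example limit state, the finite-difference derivatives, and the closing normal approximation (used here in place of the saddlepoint step) are all assumptions for illustration.

```python
# Sketch of a second-order Taylor expansion of a limit state about the mean.
import numpy as np
from scipy.stats import norm

def g(x):                               # illustrative limit state, failure if g <= 0
    return 10.0 - x[0] ** 2 - 3.0 * x[1]

mu = np.array([2.0, 1.0])
sigma = np.array([0.2, 0.3])
h = 1e-4

# finite-difference gradient and Hessian at the mean value
grad = np.zeros(2)
hess = np.zeros((2, 2))
for i in range(2):
    e = np.zeros(2); e[i] = h
    grad[i] = (g(mu + e) - g(mu - e)) / (2 * h)
    hess[i, i] = (g(mu + e) - 2 * g(mu) + g(mu - e)) / h ** 2
hess[0, 1] = hess[1, 0] = (
    g(mu + [h, h]) - g(mu + [h, -h]) - g(mu + [-h, h]) + g(mu + [-h, -h])
) / (4 * h ** 2)

# second-order moments of the expansion for independent normal inputs
var = np.diag(sigma ** 2)
mean_g = g(mu) + 0.5 * np.trace(hess @ var)
var_g = grad @ var @ grad + 0.5 * np.trace(hess @ var @ hess @ var)
print("P(failure) ≈", norm.cdf(-mean_g / np.sqrt(var_g)))
```
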
Technical Paper

Modeling the Stiffness and Damping Properties of Styrene-Butadiene Rubber

2011-05-17
2011-01-1628
Styrene-Butadiene Rubber (SBR), a copolymer of butadiene and styrene, is widely used in the automotive industry due to its high durability and resistance to abrasion, oils and oxidation. Some of the common applications include tires, vibration isolators, and gaskets, among others. This paper characterizes the dynamic behavior of SBR and discusses the suitability of a visco-elastic model of elastomers, known as the Kelvin model, from a mathematical and physical point of view. An optimization algorithm is used to estimate the parameters of the Kelvin model. The resulting model was shown to produce reasonable approximations of measured dynamic stiffness. The model was also used to calculate the self heating of the elastomer due to energy dissipation by the viscous damping components in the model. Developing such a predictive capability is essential in understanding the dynamic behavior of elastomers considering that their dynamic stiffness can in general depend on temperature.
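
For reference, a minimal sketch of the Kelvin (Kelvin-Voigt) idealization, a spring in parallel with a viscous damper, is given below; the stiffness, damping, and amplitude values are illustrative, not fitted SBR properties.

```python
# Sketch of the Kelvin model: dynamic stiffness, loss angle, and the energy
# dissipated per cycle under harmonic displacement. Parameter values are
# illustrative placeholders.
import numpy as np

k = 2.0e5          # N/m, spring stiffness
c = 150.0          # N*s/m, viscous damping coefficient
X0 = 1.0e-3        # m, displacement amplitude

freq = np.array([5.0, 10.0, 20.0, 50.0])          # Hz
omega = 2.0 * np.pi * freq

K_dyn = np.sqrt(k**2 + (c * omega) ** 2)          # dynamic stiffness magnitude
phase = np.degrees(np.arctan2(c * omega, k))      # loss angle
E_cycle = np.pi * c * omega * X0**2               # energy dissipated per cycle (J)

for f, kd, ph, e in zip(freq, K_dyn, phase, E_cycle):
    print(f"{f:5.1f} Hz  |K*|={kd:9.0f} N/m  phase={ph:4.1f} deg  E/cycle={e:.3e} J")
```
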
Journal Article

Optimal Preventive Maintenance Schedule Based on Lifecycle Cost and Time-Dependent Reliability

2012-04-16
2012-01-0070
Reliability is an important engineering requirement for consistently delivering acceptable product performance through time. It also affects the scheduling of preventive maintenance. Reliability usually degrades with time, thereby increasing the lifecycle cost because more frequent failures result in increased warranty costs, costly repairs, and loss of market share. In a lifecycle-cost-based design, we must account for product quality and preventive maintenance using time-dependent reliability. Quality is a measure of our confidence that the product conforms to specifications as it leaves the factory. For a repairable system, preventive maintenance is scheduled to avoid failures, unnecessary production loss, and safety violations. This article proposes a methodology to obtain the optimal schedule for preventive maintenance using time-dependent reliability principles.
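
As a minimal sketch of the cost/reliability trade-off, the example below selects an age-based preventive maintenance interval that minimizes expected cost per unit time under an assumed Weibull reliability model; the parameters and costs are illustrative, and the policy is simpler than the paper's lifecycle formulation.

```python
# Sketch of an age-based preventive maintenance optimization: choose the
# interval T minimizing expected cost per hour. Weibull parameters and cost
# figures are assumptions.
import numpy as np
from scipy.integrate import quad

beta_, eta = 2.5, 4000.0                 # Weibull shape/scale (hours)
c_pm, c_fail = 1_000.0, 12_000.0         # preventive vs. failure cost

R = lambda t: np.exp(-(t / eta) ** beta_)

def cost_rate(T):
    """Expected cost per hour for replacement at age T or at failure."""
    expected_cycle_len = quad(R, 0.0, T)[0]
    p_fail = 1.0 - R(T)
    return (c_pm * R(T) + c_fail * p_fail) / expected_cycle_len

intervals = np.linspace(500.0, 6000.0, 100)
best = min(intervals, key=cost_rate)
print(f"optimal PM interval ≈ {best:.0f} h, cost rate ≈ {cost_rate(best):.2f} /h")
```
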
Technical Paper

Optimal Water Jacket Flow Distribution Using a New Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1017
The availability of computational resources has enabled an increased utilization of Design of Experiments (DoE) and metamodeling (response surface generation) for large-scale optimization problems. Despite algorithmic advances, however, the analysis of systems such as the water jacket of an automotive engine can be computationally demanding, in part because of the required accuracy of the metamodels. Because the metamodels may have many inputs, their accuracy depends on the number of training points and how well they cover the entire design (input) space. For this reason, the space-filling properties of the DoE are very important. This paper utilizes a new group-based DoE algorithm with space-filling groups of points to construct a metamodel. Points are added sequentially so that the space-filling properties of the entire group of points are preserved. Points continue to be added until a specified metamodel accuracy is met.
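
A minimal sketch of sequential space-filling selection is shown below; it uses a simple maximin criterion over random candidates and is not the paper's group-based algorithm.

```python
# Sketch of a sequential space-filling DoE: each new point maximizes its
# minimum distance to the points already selected (maximin criterion).
import numpy as np

rng = np.random.default_rng(4)
dim, n_candidates, n_points = 3, 2000, 20

candidates = rng.random((n_candidates, dim))        # unit design space
design = [candidates[0]]

while len(design) < n_points:
    d = np.min(
        np.linalg.norm(candidates[:, None, :] - np.array(design)[None, :, :], axis=2),
        axis=1,
    )
    design.append(candidates[np.argmax(d)])          # most isolated candidate

design = np.array(design)
print("selected points:", design.shape)
```
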
Journal Article

Prediction of Fuel Maps in Variable Valve Timing Spark Ignited Gasoline Engines Using Kriging Metamodels

2020-04-14
2020-01-0744
Creating a fuel map for simulation of an engine with Variable Valve Actuation (VVA) can be computationally demanding. Design of Experiments (DOE) combined with metamodeling is one way to address this issue. In this paper, we introduce a sequential process to generate an engine fuel map using Kriging metamodels, which account for different engine characteristics such as load and fuel consumption at different operating conditions. The generated map predicts engine output parameters such as fuel rate and load. We first create metamodels to accurately predict the Brake Mean Effective Pressure (BMEP), fuel rate, Residual Gas Fraction (RGF), and CA50 (Crank Angle for 50% Heat Release after top dead center). The last two quantities are used to ensure acceptable combustion. The metamodels are created sequentially to ensure acceptable accuracy is achieved with a small number of simulations.
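
The sketch below illustrates the Kriging step under simplifying assumptions: a Gaussian-process metamodel of a toy "fuel rate" over two operating inputs, refined sequentially at the points of largest predictive uncertainty. scikit-learn and the toy response are assumptions, not the engine simulation.

```python
# Sketch of sequential Kriging metamodeling of a toy engine response.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(5)
toy_fuel = lambda X: 2.0 + 0.03 * X[:, 0] * X[:, 1] + 0.5 * np.sin(X[:, 1])

X = rng.random((10, 2)) * [60.0, 10.0]        # initial DOE over (speed, load) inputs
cand = rng.random((500, 2)) * [60.0, 10.0]    # candidate operating points

for _ in range(15):                           # sequential refinement
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=[10.0, 2.0]),
                                  normalize_y=True).fit(X, toy_fuel(X))
    _, std = gp.predict(cand, return_std=True)
    X = np.vstack([X, cand[np.argmax(std)]])  # add the most uncertain point

gp.fit(X, toy_fuel(X))                        # final metamodel on all points
print("training points:", X.shape[0])
print("predicted fuel rate at (30, 5):", gp.predict([[30.0, 5.0]])[0])
```
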
Journal Article

Prediction of Spark Timing to Achieve a Specified Torque Profile in Spark-Ignition Engines Using Time-Dependent Metamodeling

2021-04-06
2021-01-0238
The internal combustion engine is a source of unwanted vibration on the vehicle body. The unwanted vibration comes from forces on the engine mounts which depend on the engine torque during a transient maneuver. In particular, during a tip-in or a tip-out maneuver, different torque profiles result in different magnitudes of vibration. A desired engine torque shape can be thus obtained to minimize the unwanted vibration. The desired torque shape can be achieved by controlling a set of engine calibration parameters. This paper provides a methodology to determine the spark timing profile to achieve a desired engine torque profile during a tip-out maneuver. The spark timing profiles are described by a third-order polynomial as a function of time. A set of coefficients to define a third-order polynomial (design sites) are first generated using design of experiments (DOE).
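
A minimal sketch of the parameterization is given below: spark-timing trajectories expressed as third-order polynomials of time, sampled at a handful of coefficient vectors (design sites). The coefficient bounds are illustrative assumptions.

```python
# Sketch of the polynomial spark-timing parameterization and a small DOE over
# its coefficients. Bounds and sample size are illustrative.
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0.0, 1.0, 50)                     # s, tip-out window

def spark_profile(c, t):
    """Spark advance (deg BTDC) as a cubic polynomial of time."""
    return c[0] + c[1] * t + c[2] * t**2 + c[3] * t**3

# design sites: random sample of coefficient vectors within assumed bounds
lower = np.array([10.0, -20.0, -15.0, -10.0])
upper = np.array([30.0,  20.0,  15.0,  10.0])
design_sites = lower + rng.random((8, 4)) * (upper - lower)

profiles = np.array([spark_profile(c, t) for c in design_sites])
print("profiles for metamodel training:", profiles.shape)   # (8, 50)
```
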
Technical Paper

Propagation of Uncertainty in Optimal Design of Multilevel Systems: Piston-Ring/Cylinder-Liner Case Study

2004-03-08
2004-01-1559
This paper proposes an approach for optimal design of multilevel systems under uncertainty. The approach utilizes the stochastic extension of the analytical target cascading formulation. The reliability of satisfying the probabilistic constraints is computed by means of the most probable point method using the hybrid mean value algorithm. A linearization technique is employed for estimating the propagation of uncertainties throughout the problem hierarchy. The proposed methodology is applied to a piston-ring/cylinder-liner engine subassembly design problem. Specifically, we assess the impact of variations in manufacturing-related properties such as surface roughness on engine attributes such as brake-specific fuel consumption. Results are compared to the ones obtained using Monte Carlo simulation.
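
The sketch below illustrates a first-order (linearized) propagation of input variation through a response function, checked against Monte Carlo; the response function and variation levels are placeholders, not the piston-ring/cylinder-liner model.

```python
# Sketch of linearized uncertainty propagation vs. a Monte Carlo check.
import numpy as np

def response(x):                         # placeholder response function
    return 0.4 * x[0] ** 1.5 + 2.0 * x[0] * x[1]

mu = np.array([1.0, 0.5])                # assumed input means
sigma = np.array([0.10, 0.05])           # assumed input standard deviations
h = 1e-6

grad = np.array([
    (response(mu + h * np.eye(2)[i]) - response(mu - h * np.eye(2)[i])) / (2 * h)
    for i in range(2)
])
sigma_lin = np.sqrt(np.sum((grad * sigma) ** 2))   # first-order output std dev

rng = np.random.default_rng(7)
samples = rng.normal(mu, sigma, size=(100_000, 2))
sigma_mc = np.std(response(samples.T))
print(f"linearized std: {sigma_lin:.4f}   Monte Carlo std: {sigma_mc:.4f}")
```
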
Journal Article

Reanalysis of Linear Dynamic Systems using Modified Combined Approximations with Frequency Shifts

2016-04-05
2016-01-1338
Weight reduction is very important in automotive design because of stringent demand on fuel economy. Structural optimization of dynamic systems using finite element (FE) analysis plays an important role in reducing weight while simultaneously delivering a product that meets all functional requirements for durability, crash and NVH. With advancing computer technology, the demand for solving large FE models has grown. Optimization is however costly due to repeated full-order analyses. Reanalysis methods can be used in structural vibrations to reduce the analysis cost from repeated eigenvalue analyses for both deterministic and probabilistic problems. Several reanalysis techniques have been introduced over the years including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA) and the Epsilon algorithm, among others.
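
For reference, a rough sketch of the Combined Approximations idea for a static reanalysis is shown below: the response of a modified stiffness matrix is approximated in a small basis generated from the original matrix, avoiding a full new solve. The random matrices are placeholders, and the dynamic (eigenvalue) reanalysis discussed in the paper involves further steps such as frequency shifts.

```python
# Sketch of Combined Approximations (CA) for static reanalysis with a small
# reduced basis built from the original stiffness matrix.
import numpy as np

rng = np.random.default_rng(8)
n, n_basis = 200, 4

A = rng.standard_normal((n, n))
K0 = A @ A.T + n * np.eye(n)                 # original (SPD) stiffness matrix
B = rng.standard_normal((n, n))
dK = 0.05 * (B + B.T)                        # modification from a design change
f = rng.standard_normal(n)

K0_inv = np.linalg.inv(K0)                   # stands in for a reused factorization
r = [K0_inv @ f]                             # first basis vector: original solution
for _ in range(n_basis - 1):
    r.append(-K0_inv @ (dK @ r[-1]))         # binomial-series terms
R = np.column_stack(r)

y = np.linalg.solve(R.T @ (K0 + dK) @ R, R.T @ f)   # reduced-order solve
u_ca = R @ y
u_exact = np.linalg.solve(K0 + dK, f)
print("relative error:", np.linalg.norm(u_ca - u_exact) / np.linalg.norm(u_exact))
```
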