Search Results

Journal Article

A Methodology for Fatigue Life Estimation of Linear Vibratory Systems under Non-Gaussian Loads

2017-03-28
2017-01-0197
Fatigue life estimation, reliability and durability are important in acquisition, maintenance and operation of vehicle systems. Fatigue life is random because of the stochastic load, the inherent variability of material properties, and the uncertainty in the definition of the S-N curve. The commonly used fatigue life estimation methods calculate the mean (not the distribution) of fatigue life under Gaussian loads using the potentially restrictive narrow-band assumption. In this paper, a general methodology is presented to calculate the statistics of fatigue life for a linear vibratory system under stationary, non-Gaussian loads considering the effects of skewness and kurtosis. The input loads are first characterized using their first four moments (mean, standard deviation, skewness and kurtosis) and a correlation structure equivalent to a given Power Spectral Density (PSD).
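
As a rough illustration of the load-characterization step described above (not code from the paper), the sketch below computes the first four moments of a sampled load history with SciPy and estimates its PSD with Welch's method; the sampling rate and the placeholder signal are assumptions.

```python
# Illustrative sketch: characterize a measured load history by its first four
# moments and an estimate of its Power Spectral Density (PSD).
import numpy as np
from scipy import signal, stats

fs = 1000.0                                               # sampling rate [Hz] (assumed)
t = np.arange(0, 60, 1 / fs)
load = np.random.default_rng(0).standard_normal(t.size)  # placeholder load signal

moments = {
    "mean": np.mean(load),
    "std": np.std(load, ddof=1),
    "skewness": stats.skew(load),
    "kurtosis": stats.kurtosis(load, fisher=False),       # raw kurtosis; 3 for Gaussian
}

# Welch estimate of the one-sided PSD, which fixes the correlation structure
freq, psd = signal.welch(load, fs=fs, nperseg=4096)
print(moments)
```
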
Journal Article

Durability Test Time Reduction Methods

2017-03-28
2017-01-0258
Laboratory-based durability simulation has become an increasingly important component of vehicle system design validation and production release. It offers several advantages over field testing, which has driven its adoption in the automotive and military sectors. Among these advantages are 1) repeatability, 2) earlier testing, 3) isolation of subsystems or components and 4) the ability to compress and/or accelerate the testing. In this paper we present time-domain methods and techniques adapted, implemented and used at TARDEC to reduce the time required to perform a laboratory durability test of a full vehicle system, subsystem or component. Specifically, these methods approach a durability schedule holistically by considering all events/surfaces, repeats and channels of interest. They employ the standard Generic Stress Life (GSL) approach, utilizing rainflow cycle counting and a minimum-average method of identifying segments of the events which are less severe.
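
The stress-life step referenced above can be illustrated with a minimal Miner's-rule damage sum over rainflow-counted stress ranges; the one-slope Basquin S-N constants C and b and the example bins below are assumptions, not values from the TARDEC schedule.

```python
# Illustrative sketch (assumed S-N curve, not the GSL implementation):
# linear (Miner) damage accumulation from rainflow-counted stress ranges
# using a one-slope Basquin S-N curve N = (S / C) ** (-b).
import numpy as np

def miner_damage(stress_ranges, cycle_counts, C=1000.0, b=5.0):
    """Sum of n_i / N_i over the rainflow bins (units and S-N constants assumed)."""
    stress_ranges = np.asarray(stress_ranges, dtype=float)
    cycle_counts = np.asarray(cycle_counts, dtype=float)
    cycles_to_failure = (stress_ranges / C) ** (-b)       # Basquin S-N curve
    return float(np.sum(cycle_counts / cycles_to_failure))

# e.g. three rainflow bins: (stress range in MPa, counted cycles)
print(miner_damage([50.0, 120.0, 300.0], [1e5, 2e3, 10]))
```
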
Journal Article

An Efficient Method to Calculate the Failure Rate of Dynamic Systems with Random Parameters Using the Total Probability Theorem

2015-04-14
2015-01-0425
Using the total probability theorem, we propose a method to calculate the failure rate of a linear vibratory system with random parameters excited by stationary Gaussian processes. The response of such a system is non-stationary because of the randomness of the input parameters. A space-filling design, such as optimal symmetric Latin hypercube sampling or maximin, is first used to sample the input parameter space. For each design point, the output process is stationary and Gaussian. We present two approaches to calculate the corresponding conditional probability of failure. A Kriging metamodel is then created between the input parameters and the output conditional probabilities allowing us to estimate the conditional probabilities for any set of input parameters. The total probability theorem is finally applied to calculate the time-dependent probability of failure and the failure rate of the dynamic system. The proposed method is demonstrated using a vibratory system.
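
A minimal sketch of the total-probability idea, with placeholder stand-ins rather than the paper's vibratory model: conditional failure probabilities at sampled parameter values are interpolated by a Kriging (Gaussian process) metamodel and then averaged over the parameter distribution.

```python
# Illustrative sketch: P_f = E_theta[ P_f | theta ] via a Kriging metamodel of
# the conditional failure probabilities (all numbers below are assumptions).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)

# 1) space-filling sample of the random parameter (here a single stiffness k)
k_train = np.linspace(0.8, 1.2, 12).reshape(-1, 1)

# 2) conditional probability of failure at each design point
#    (placeholder closed form; in practice each value would come from a
#     stationary-response calculation at that parameter value)
pf_cond = 1e-3 * np.exp(-5.0 * (k_train.ravel() - 0.8))

# 3) Kriging metamodel between parameters and conditional probabilities
gp = GaussianProcessRegressor(normalize_y=True).fit(k_train, pf_cond)

# 4) total probability theorem: average the prediction over the k distribution
k_samples = rng.normal(1.0, 0.05, size=(20000, 1))
pf_total = gp.predict(k_samples).mean()
print(pf_total)
```
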
Technical Paper

A Cost-Driven Method for Design Optimization Using Validated Local Domains

2013-04-08
2013-01-1385
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models in the entire design space can be costly, we have previously proposed an approach where design optimization and model validation are concurrently performed using a sequential approach with variable-size local domains. We used test data and statistical bootstrap methods to size each local domain where the prediction model is considered validated and where design optimization is performed. The method proceeds iteratively until the optimum design is obtained. This method, however, requires test data to be available in each local domain along the optimization path. In this paper, we refine our methodology by using polynomial regression to predict the size and shape of a local domain at some steps along the optimization process without using test data.
Journal Article

Piston Design Using Multi-Objective Reliability-Based Design Optimization

2010-04-12
2010-01-0907
Piston design is a challenging engineering problem which involves complex physics and requires satisfying multiple performance objectives. Uncertainty in piston operating conditions and variability in piston design variables are inevitable and must be accounted for. The piston assembly can be a major source of engine mechanical friction and cold start noise, if not designed properly. In this paper, an analytical piston model is used in a deterministic and probabilistic (reliability-based) multi-objective design optimization process to obtain an optimal piston design. The model predicts piston performance in terms of scuffing, friction and noise. In order to keep the computational cost low, efficient and accurate metamodels of the piston performance metrics are used. The Pareto set of all optimal solutions is calculated, allowing the designer to choose the “best” solution according to trade-offs among the multiple objectives.
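
A small illustration of the Pareto-set step (the piston metamodels themselves are not reproduced): a non-dominated filter over candidate designs scored on two objectives to be minimized; the toy objective values are assumptions.

```python
# Illustrative sketch: extract the Pareto set (non-dominated designs) from a
# table of candidate designs scored on two minimized objectives.
import numpy as np

def pareto_mask(objectives):
    """Boolean mask of non-dominated rows; objectives is (n_designs, n_obj), all minimized."""
    f = np.asarray(objectives, dtype=float)
    mask = np.ones(f.shape[0], dtype=bool)
    for i in range(f.shape[0]):
        if mask[i]:
            # rows dominated by design i: no better in any objective, worse in at least one
            dominated = np.all(f[i] <= f, axis=1) & np.any(f[i] < f, axis=1)
            mask[dominated] = False
    return mask

# toy example: columns = [friction metric, noise metric]
designs = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.5], [4.0, 1.0]])
print(designs[pareto_mask(designs)])    # the dominated design [3.0, 3.5] is filtered out
```
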
Technical Paper

Modeling Dependence and Assessing the Effect of Uncertainty in Dependence in Probabilistic Analysis and Decision Under Uncertainty

2010-04-12
2010-01-0697
A complete probabilistic model of uncertainty in probabilistic analysis and design problems is the joint probability distribution of the random variables. Often, it is impractical to estimate this joint probability distribution because the mechanism of the dependence of the variables is not completely understood. This paper proposes modeling dependence by using copulas and demonstrates their representational power. It also compares this representation with a Monte Carlo simulation using dispersive sampling.
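
As a hedged illustration of the copula idea (a Gaussian copula is only one of the families that can be used, and the marginals and correlation below are assumptions), the sketch couples two arbitrary marginals through correlated normals.

```python
# Illustrative sketch: sample two dependent variables with arbitrary marginals
# by pushing correlated normals through the standard normal CDF (Gaussian copula).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
rho = 0.7                                        # assumed dependence parameter
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
u = stats.norm.cdf(z)                            # uniforms coupled by the Gaussian copula

# arbitrary marginals, e.g. a lognormal load and a Weibull strength
x1 = stats.lognorm(s=0.25, scale=10.0).ppf(u[:, 0])
x2 = stats.weibull_min(c=2.0, scale=12.0).ppf(u[:, 1])
print(np.corrcoef(x1, x2)[0, 1])                 # induced linear correlation
```
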
Journal Article

Time-Dependent Reliability Estimation for Dynamic Systems Using a Random Process Approach

2010-04-12
2010-01-0644
Reliability is an important engineering requirement for consistently delivering acceptable product performance through time. As time progresses, the product may fail due to time-dependent operating conditions and material properties, component degradation, etc. The reliability degradation with time may increase the lifecycle cost due to potential warranty costs, repairs and loss of market share. Reliability is the probability that the system will perform its intended function successfully for a specified time interval. In this work, we consider the first-passage reliability, which accounts for the first-time failure of non-repairable systems. Methods available in the literature provide an upper bound on the true reliability, which may overestimate the true value considerably. This paper proposes a methodology to calculate the cumulative probability of failure (probability of first passage or upcrossing) of a dynamic system driven by an ergodic input random process.
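
A brute-force Monte Carlo sketch of the first-passage probability referred to above, using an assumed toy stationary process and threshold rather than the paper's system.

```python
# Illustrative sketch: cumulative (first-passage) probability that a response
# sample path exceeds a threshold anywhere in [0, T], estimated by Monte Carlo.
import numpy as np

rng = np.random.default_rng(3)
T, dt, threshold = 10.0, 0.01, 3.0               # assumed interval and threshold
t = np.arange(0.0, T, dt)
n_paths = 5000

# placeholder stationary response: a sum of random sinusoids per sample path
phases = rng.uniform(0, 2 * np.pi, size=(n_paths, 3, 1))
amps = rng.normal(0.0, 1.0, size=(n_paths, 3, 1))
freqs = np.array([1.0, 1.2, 1.4]).reshape(1, 3, 1)
paths = np.sum(amps * np.sin(2 * np.pi * freqs * t + phases), axis=1)

first_passage_pf = np.mean(paths.max(axis=1) > threshold)
print(first_passage_pf)
```
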
Journal Article

On the Time-Dependent Reliability of Non-Monotonic, Non-Repairable Systems

2010-04-12
2010-01-0696
The system response of many engineering systems depends on time. A random process approach is therefore needed to quantify variation or uncertainty. The system input may consist of a combination of random variables and random processes. In this case, a time-dependent reliability analysis must be performed to calculate the probability of failure within a specified time interval. This is known as the cumulative probability of failure, which is, in general, different from the instantaneous probability of failure. Failure occurs if the limit state function becomes negative at least at one instance within a specified time interval. Time-dependent reliability problems appear if, for example, the material properties deteriorate in time or if random loading, modeled by a random process, is involved. Existing methods to calculate the cumulative probability of failure provide an upper bound which may grossly overestimate the true value.
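
For contrast with the paper's contribution, the classical upcrossing-rate upper bound mentioned above can be written in a few lines for a stationary Gaussian response; the inputs below (mean, standard deviations of the response and its derivative, threshold, interval) are assumed values.

```python
# Illustrative sketch of the classical bound: P_f(0, T) <= P[X(0) > a] + nu_plus(a) * T,
# with nu_plus from Rice's formula for a stationary Gaussian process X(t).
import numpy as np
from scipy import stats

def upcrossing_bound(mu, sigma_x, sigma_xdot, threshold, T):
    """Upper bound on the cumulative probability of failure over [0, T]."""
    p0 = stats.norm.sf(threshold, loc=mu, scale=sigma_x)
    nu_plus = (sigma_xdot / (2.0 * np.pi * sigma_x)) * np.exp(
        -((threshold - mu) ** 2) / (2.0 * sigma_x ** 2)
    )
    return p0 + nu_plus * T

print(upcrossing_bound(mu=0.0, sigma_x=1.0, sigma_xdot=6.0, threshold=3.0, T=100.0))
```
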
Journal Article

An RBDO Method for Multiple Failure Region Problems using Probabilistic Reanalysis and Approximate Metamodels

2009-04-20
2009-01-0204
A Reliability-Based Design Optimization (RBDO) method for multiple failure regions is presented. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with an approximate global metamodel with local refinements. The latter serves as an indicator to determine the failure and safe regions. PRRA calculates very efficiently the system reliability of a design by performing a single Monte Carlo (MC) simulation. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. An “accurate-on-demand” metamodel is used in the PRRA that allows us to handle problems with multiple disjoint failure regions and potentially multiple most-probable points (MPP). The multiple failure regions are identified by using a clustering technique. A maximin “space-filling” sampling technique is used to construct the metamodel. A vibration absorber example highlights the potential of the proposed method.
Technical Paper

Reliability Estimation of Large-Scale Dynamic Systems by using Re-analysis and Tail Modeling

2009-04-20
2009-01-0200
Probabilistic studies can be prohibitively expensive because they require repeated finite element analyses of large models. Re-analysis methods have been proposed with the premise to estimate accurately the dynamic response of a structure after a baseline design has been modified, without recalculating the new response. Although these methods increase computational efficiency, they are still not efficient enough for probabilistic analysis of large-scale dynamic systems with low failure probabilities (less than or equal to 10⁻³). This paper presents a methodology that uses deterministic and probabilistic re-analysis methods to generate sample points of the response. Subsequently, tail modeling is used to estimate the right tail of the response PDF and the probability of failure of a highly reliable system. The methodology is demonstrated on probabilistic vibration analysis of a realistic vehicle FE model.
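
A minimal peaks-over-threshold sketch of the tail-modeling step, assuming the response samples have already been generated by re-analysis; the threshold choice and the placeholder samples are assumptions.

```python
# Illustrative sketch: fit a generalized Pareto distribution to exceedances over
# a high threshold and extrapolate the right tail to a low failure probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
response = rng.gumbel(loc=0.0, scale=1.0, size=20_000)   # placeholder re-analysis samples

u = np.quantile(response, 0.95)                          # tail threshold (assumed)
exceedances = response[response > u] - u
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

def tail_pf(level):
    """P[response > level] for a level above the threshold u (peaks over threshold)."""
    zeta_u = exceedances.size / response.size            # fraction of samples above u
    return zeta_u * stats.genpareto.sf(level - u, shape, loc=0.0, scale=scale)

print(tail_pf(u + 5.0))
```
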
Technical Paper

Imprecise Reliability Assessment When the Type of the Probability Distribution of the Random Variables is Unknown

2009-04-20
2009-01-0199
In reliability design, often, there is scarce data for constructing probabilistic models. It is particularly challenging to model uncertainty in variables when the type of their probability distribution is unknown. Moreover, it is expensive to estimate the upper and lower bounds of the reliability of a system involving such variables. A method for modeling uncertainty by using Polynomial Chaos Expansion is presented. The method requires specifying bounds for statistical summaries such as the first four moments and credible intervals. A constrained optimization problem, in which decision variables are the coefficients of the Polynomial Chaos Expansion approximation, is formulated and solved in order to estimate the minimum and maximum values of a system’s reliability. This problem is solved efficiently by employing a probabilistic re-analysis approach to approximate the system reliability as a function of the moments of the random variables.
Journal Article

Prediction of Automotive Side Swing Door Closing Effort

2009-04-20
2009-01-0084
The door closing effort is a quality issue concerning both automobile designers and customers. This paper describes an Excel-based mathematical model for predicting the side door closing effort in terms of the minimum energy or velocity required to close the door from a small open position when the check-link ceases to function. A simplified but comprehensive model is developed which includes the cabin pressure (air bind), seal compression, door weight, latch effort, and hinge friction effects. The flexibility of the door and car body is ignored. Because the model simplification introduces errors, we calibrate it using measured data. Calibration is also necessary because some input parameters are difficult to obtain directly. In this work, we provide the option to calibrate the hinge model, the latch model, the seal compression model, and the air bind model. The door weight effect is geometrically exact, and does not need calibration.
Journal Article

Efficient Re-Analysis Methodology for Probabilistic Vibration of Large-Scale Structures

2008-04-14
2008-01-0216
It is challenging to perform probabilistic analysis and design of large-scale structures because probabilistic analysis requires repeated finite element analyses of large models and each analysis is expensive. This paper presents a methodology for probabilistic analysis and reliability-based design optimization of large-scale structures that consists of two re-analysis methods: one for estimating the deterministic vibratory response and another for estimating the probability of the response exceeding a certain level. The deterministic re-analysis method can efficiently analyze large-scale finite element models consisting of tens or hundreds of thousands of degrees of freedom and large numbers of design variables that vary in a wide range. The probabilistic re-analysis method calculates very efficiently the system reliability for many probability distributions of the design variables by performing a single Monte Carlo simulation.
Journal Article

Design under Uncertainty using a Combination of Evidence Theory and a Bayesian Approach

2008-04-14
2008-01-0377
Early in the engineering design cycle, it is difficult to quantify product reliability due to insufficient data or information to model uncertainties. Probability theory cannot, therefore, be used. Design decisions are usually based on fuzzy information which is imprecise and incomplete. Various design methods such as Possibility-Based Design Optimization (PBDO) and Evidence-Based Design Optimization (EBDO) have been developed to systematically treat design with non-probabilistic uncertainties. In practical engineering applications, information regarding the uncertain variables and parameters may exist in the form of sample points, and uncertainties with sufficient and insufficient information may exist simultaneously. Most of the existing optimal design methods under uncertainty cannot handle this form of incomplete information. They have to either discard some valuable information or postulate the existence of additional information.
Journal Article

Probabilistic Reanalysis Using Monte Carlo Simulation

2008-04-14
2008-01-0215
An approach for Probabilistic Reanalysis (PRA) of a system is presented. PRA calculates very efficiently the system reliability or the average value of an attribute of a design for many probability distributions of the input variables, by performing a single Monte Carlo simulation. In addition, PRA calculates the sensitivity derivatives of the reliability to the parameters of the probability distributions. The approach is useful for analysis problems where reliability bounds need to be calculated because the probability distribution of the input variables is uncertain, or for design problems where the design variables are random. The accuracy and efficiency of PRA is demonstrated on vibration analysis of a car and on system reliability-based design optimization (RBDO) of an internal combustion engine.
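
A minimal sketch of the single-simulation reweighting idea behind probabilistic re-analysis, under assumed distributions and a placeholder limit state: samples drawn once from a sampling PDF are reweighted by the ratio of each candidate input PDF to the sampling PDF, so no new simulation is needed per distribution.

```python
# Illustrative sketch: one Monte Carlo sample, many input distributions via
# likelihood-ratio weights (all distributions and the limit state are assumptions).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 200_000

# single Monte Carlo run from the (deliberately wide) sampling distribution
sampling = stats.norm(loc=0.0, scale=2.0)
x = sampling.rvs(size=n, random_state=rng)
failed = x > 4.0                                  # placeholder limit state g(x) <= 0

def reliability(mu, sigma):
    """Reliability under a N(mu, sigma) input, without re-running the simulation."""
    weights = stats.norm.pdf(x, loc=mu, scale=sigma) / sampling.pdf(x)
    return 1.0 - np.mean(weights * failed)

print(reliability(0.0, 1.0), reliability(0.5, 1.2))
```
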
Journal Article

Optimal and Robust Design of the PEM Fuel Cell Cathode Gas Diffusion Layer

2008-04-14
2008-01-1217
The cathode gas diffusion layer (GDL) is an important component of a polymer electrolyte membrane (PEM) fuel cell. Its design parameters, including thickness, porosity and permeability, significantly affect the reactant transport and water management, thus impacting the fuel cell performance. This paper presents an optimization study of the GDL design parameters with the objective of maximizing the current density under a given voltage. A two-dimensional single-phase PEM fuel cell model is used. A multivariable optimization problem is formed to maximize the current density at the cathode under a given electrode voltage with respect to the GDL parameters. In order to reduce the computational effort and find the global optimum among the potential multiple optima, a global metamodel of the actual CFD-based fuel cell simulation is adaptively generated using radial basis function approximations.
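
A hedged sketch of the metamodel-based search, with a toy analytic stand-in for the CFD model and assumed parameter bounds: a radial basis function surrogate is fit to a handful of sampled designs and then searched globally.

```python
# Illustrative sketch: RBF surrogate of an expensive model over (porosity, thickness),
# then a global search on the cheap surrogate.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

rng = np.random.default_rng(6)

def expensive_model(x):
    """Stand-in for the CFD simulation: current density vs. (porosity, thickness)."""
    porosity, thickness = x[..., 0], x[..., 1]
    return 1.2 - (porosity - 0.6) ** 2 - 4.0 * (thickness - 0.25) ** 2

bounds = np.array([[0.3, 0.8], [0.1, 0.4]])                 # assumed design ranges
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(30, 2))   # sampled designs
y = expensive_model(X)

surrogate = RBFInterpolator(X, y, smoothing=1e-8)

# global search on the surrogate (maximize current density = minimize its negative)
res = differential_evolution(lambda x: -surrogate(x.reshape(1, 2))[0], bounds=bounds)
print(res.x, -res.fun)
```
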
Technical Paper

Optimal Engine Torque Management for Reducing Driveline Clunk Using Time-Dependent Metamodels

2007-05-15
2007-01-2236
Quality and performance are two important customer requirements in vehicle design. Driveline clunk negatively affects the perceived quality and must therefore be minimized. This is usually achieved using engine torque management, which is part of engine calibration. During a tip-in event, the engine torque rate of rise is limited until all the driveline lash is taken up. However, the engine torque rise and its rate can negatively affect the vehicle throttle response. Therefore, the engine torque management must be balanced against throttle response. In practice, the engine torque rate of rise is calibrated manually. This paper describes a methodology for calibrating the engine torque in order to minimize the clunk disturbance, while still meeting throttle response constraints. A set of predetermined engine torque profiles is calibrated in a vehicle, and the transmission turbine speed is measured for each profile. The latter is used to quantify the clunk disturbance.
Technical Paper

An Efficient Re-Analysis Methodology for Vibration of Large-Scale Structures

2007-05-15
2007-01-2326
Finite element analysis is a well-established methodology in structural dynamics. However, optimization and/or probabilistic studies can be prohibitively expensive because they require repeated FE analyses of large models. Various re-analysis methods have been proposed in order to calculate efficiently the dynamic response of a structure after a baseline design has been modified, without recalculating the new response. The parametric reduced-order modeling (PROM) and the combined approximation (CA) methods are two re-analysis methods which can handle large model parameter changes in a relatively efficient manner. Although both methods are promising by themselves, they cannot handle large FE models with large numbers of DOF (e.g., 100,000) and a large number of design parameters (e.g., 50), which are common in practice. In this paper, the advantages and disadvantages of the PROM and CA methods are first discussed in detail.
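
As a rough, static illustration of the combined approximation (CA) idea discussed above (the paper itself addresses vibration of much larger models), the sketch below reuses the baseline operator to build a small basis and solves the modified system in that basis; the matrices are random stand-ins.

```python
# Illustrative sketch of CA-style re-analysis for a modified static system (K0 + dK) u = f.
import numpy as np

def ca_reanalysis(K0, dK, f, n_basis=3):
    """Approximate the solution of (K0 + dK) u = f using a small CA basis."""
    K0_inv = np.linalg.inv(K0)                    # stands in for reusing the K0 factorization
    r = K0_inv @ f
    basis = [r]
    for _ in range(n_basis - 1):
        r = -K0_inv @ (dK @ r)                    # binomial-series recurrence
        basis.append(r)
    R, _ = np.linalg.qr(np.column_stack(basis))   # orthonormalized reduced basis
    y = np.linalg.solve(R.T @ (K0 + dK) @ R, R.T @ f)
    return R @ y

rng = np.random.default_rng(7)
A = rng.standard_normal((5, 5))
K0 = A @ A.T + 5.0 * np.eye(5)                    # baseline "stiffness" (SPD), assumed
B = rng.standard_normal((5, 5))
dK = 0.15 * (B + B.T)                             # moderate symmetric modification
f = rng.standard_normal(5)

print(ca_reanalysis(K0, dK, f))
print(np.linalg.solve(K0 + dK, f))                # exact answer for comparison
```
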
Technical Paper

A Time-Dependent Reliability Analysis Method using a Niching Genetic Algorithm

2007-04-16
2007-01-0548
A reliability analysis method is presented for time-dependent systems under uncertainty. A level-crossing problem is considered where the system fails if its maximum response exceeds a specified threshold. The proposed method uses a double-loop optimization algorithm. The inner loop calculates the maximum response in time for a given set of random variables, and transforms a time-dependent problem into a time-independent one. A time integration method is used to calculate the response at discrete times. For each sample function of the response random process, the maximum response is found using a global-local search method consisting of a genetic algorithm (GA) and a gradient-based optimizer. This dynamic response usually exhibits multiple peaks and crosses the allowable response level to form a set of complex limit states, which lead to multiple most probable points (MPPs).
Technical Paper

System Reliability-Based Design using a Single-Loop Method

2007-04-16
2007-01-0555
An efficient approach for series system reliability-based design optimization (RBDO) is presented. The key idea is to apportion optimally the system reliability among the failure modes by considering the target values of the failure probabilities of the modes as design variables. Critical failure modes that contribute the most to the overall system reliability are identified. This paper proposes a computationally efficient system RBDO approach using a single-loop method where the searches for the optimum design and for the most probable failure points proceed simultaneously. Specifically, at each iteration the optimizer uses approximated most probable failure points from the previous iteration to search for the optimum. A second-order Ditlevsen upper bound is used for the joint failure probability of failure modes. Also, an easy-to-implement active set strategy is employed to improve algorithmic stability.
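
The Ditlevsen bound mentioned above can be illustrated in a few lines, with assumed individual and pairwise failure probabilities supplied as inputs.

```python
# Illustrative sketch: second-order Ditlevsen upper bound for a series system,
# from individual mode probabilities p[i] and pairwise joint probabilities p2[i, j].
import numpy as np

def ditlevsen_upper_bound(p, p2):
    """Upper bound: sum_i p_i - sum_{i>=2} max_{j<i} p_ij (modes ordered by importance)."""
    p = np.asarray(p, dtype=float)
    bound = p[0]
    for i in range(1, p.size):
        bound += p[i] - np.max(p2[i, :i])
    return bound

p = np.array([1e-3, 8e-4, 5e-4])                  # individual mode failure probabilities (assumed)
p2 = np.array([[0.0,  2e-4, 1e-4],
               [2e-4, 0.0,  5e-5],
               [1e-4, 5e-5, 0.0]])                # pairwise joint failure probabilities (assumed)
print(ditlevsen_upper_bound(p, p2))
```
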