Search Results

Technical Paper

A 3-D Joint Model for Automotive Structures

1992-06-01
921088
A simple, design-oriented model of joints in vehicle structures is developed. This model accounts for the joint flexibility, the offsets of the rotation centers of the joint branches from the geometric center, and the coupling between rotations of a joint branch in different planes. A family of joint models with different levels of complexity is also defined. A probabilistic system identification method is used to estimate the joint model parameters from measured displacements. Statistical tools that identify the important parameters are also presented. The identification methodology is applied to the estimation of the parameters of a B-pillar to rocker joint.
Technical Paper

A Cost-Driven Method for Design Optimization Using Validated Local Domains

2013-04-08
2013-01-1385
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models over the entire design space can be costly, we have previously proposed an approach in which design optimization and model validation are performed concurrently, using a sequential approach with variable-size local domains. We used test data and statistical bootstrap methods to size each local domain, where the prediction model is considered validated and where design optimization is performed. The method proceeds iteratively until the optimum design is obtained. This method, however, requires test data to be available in each local domain along the optimization path. In this paper, we refine our methodology by using polynomial regression to predict the size and shape of a local domain at some steps along the optimization process without using test data.
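As an illustration of the regression idea, the hedged Python sketch below fits a low-order polynomial to a hypothetical history of local-domain sizes from earlier optimization steps and extrapolates the size of the next domain; the step indices, radii, and polynomial degree are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical history of validated local-domain radii observed at
# earlier optimization steps (illustrative values, not from the paper).
steps = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
radii = np.array([0.80, 0.55, 0.42, 0.33, 0.28])

# Fit a low-order polynomial to the size history and extrapolate the
# radius of the next local domain without collecting new test data.
coeffs = np.polyfit(steps, radii, deg=2)
next_radius = np.polyval(coeffs, 6.0)
print(f"predicted radius at step 6: {next_radius:.3f}")
```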
Journal Article

A Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1102
Computer-aided engineering (CAE) is an important tool routinely used to simulate complex engineering systems. Virtual simulations enhance engineering insight into prospective designs and potential design issues and can limit the need for expensive engineering prototypes. For complex engineering systems, however, the effectiveness of virtual simulations is often hindered by excessive computational cost. To minimize the cost of running expensive computer simulations, approximations of the original model (often called surrogate models or metamodels) can provide sufficient accuracy at a lower computing overhead than repeated runs of a full simulation. Metamodel accuracy improves if the metamodel is constructed using space-filling designs of experiments (DOEs). The latter provide a collection of sample points in the design space, preferably covering the entire space.
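A minimal sketch of one way to generate such a space-filling design, assuming a greedy maximin selection from a random candidate pool in the unit hypercube (the paper's group-based algorithm is more elaborate; this shows only the underlying maximin idea):

```python
import numpy as np

def maximin_doe(n_points, n_dims, n_candidates=2000, seed=0):
    """Greedy maximin design: from a random candidate pool in the unit
    hypercube, repeatedly pick the point farthest from those selected."""
    rng = np.random.default_rng(seed)
    pool = rng.random((n_candidates, n_dims))
    design = [pool[0]]
    for _ in range(n_points - 1):
        dists = np.linalg.norm(pool[:, None, :] - np.array(design)[None, :, :],
                               axis=2).min(axis=1)
        design.append(pool[np.argmax(dists)])   # maximize the minimum distance
    return np.array(design)

samples = maximin_doe(n_points=20, n_dims=3)
```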
Journal Article

A Nonparametric Bootstrap Approach to Variable-size Local-domain Design Optimization and Computer Model Validation

2012-04-16
2012-01-0226
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models over the entire design space can be costly, a recent approach was proposed in which design optimization and model validation are performed concurrently, using a sequential approach with both fixed- and variable-size local domains. The variable-size approach used parametric distributions, such as Gaussian, to quantify the variability in test data and model predictions, and a maximum likelihood estimation to calibrate the prediction model. A parametric bootstrap method was also used to size each local domain. In this article, we generalize the variable-size approach by not assuming any particular distribution, such as Gaussian. A nonparametric bootstrap methodology is instead used to size the local domains. We expect this generality to be useful in applications where distributional assumptions are difficult to verify or are not met at all.
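A minimal sketch of the nonparametric bootstrap idea in Python: resample the test data with replacement, recompute the statistic, and take percentile bounds, with no Gaussian assumption. The data values and the use of the mean as the statistic are illustrative assumptions.

```python
import numpy as np

def bootstrap_bounds(data, stat=np.mean, n_boot=5000, alpha=0.05, seed=0):
    """Nonparametric bootstrap: resample the test data with replacement,
    recompute the statistic, and return percentile bounds, with no
    parametric (e.g., Gaussian) assumption."""
    rng = np.random.default_rng(seed)
    reps = np.array([stat(rng.choice(data, size=data.size, replace=True))
                     for _ in range(n_boot)])
    return np.quantile(reps, [alpha / 2.0, 1.0 - alpha / 2.0])

test_data = np.array([1.02, 0.97, 1.10, 0.95, 1.04, 1.01, 0.99])
lo, hi = bootstrap_bounds(test_data)
```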
Journal Article

A Re-Analysis Methodology for System RBDO Using a Trust Region Approach with Local Metamodels

2010-04-12
2010-01-0645
A simulation-based, system reliability-based design optimization (RBDO) method is presented that can handle problems with multiple failure regions and correlated random variables. Copulas are used to represent the correlation. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with a trust-region optimization approach and local metamodels covering each trust region. PRRA calculates very efficiently the system reliability of a design by performing a single Monte Carlo (MC) simulation per trust region. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. The PRRA method is based on importance sampling. It provides accurate results if the support of the sampling PDF contains the support of the joint PDF of the input random variables. The sequential, trust-region optimization approach satisfies this requirement.
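A hedged, one-dimensional sketch of the importance-sampling reweighting behind a PRRA-style reanalysis: a single MC run drawn from a wide sampling PDF h(x) is reweighted by f(x; design)/h(x) to estimate the reliability of any candidate design, so the estimate varies smoothly with the design variables. The limit state, distributions, and numbers are illustrative assumptions.

```python
import numpy as np

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)

# One MC run from a wide sampling PDF h whose support covers the input
# PDFs f of all candidate designs (hypothetical 1-D limit state: x > 4).
mu_h, sig_h = 0.0, 3.0
x = rng.normal(mu_h, sig_h, size=200_000)
fail = x > 4.0

def reliability(mu_d, sig_d):
    """Reweight the SAME samples by f(x; design)/h(x); no new simulation."""
    w = norm_pdf(x, mu_d, sig_d) / norm_pdf(x, mu_h, sig_h)
    return 1.0 - np.mean(w * fail)

# Reliability varies smoothly with the design variables, so
# finite-difference sensitivities are well behaved.
print(reliability(0.0, 1.0), reliability(0.1, 1.0))
```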
Journal Article

A Simulation and Optimization Methodology for Reliability of Vehicle Fleets

2011-04-12
2011-01-0725
Understanding reliability is critical in the design, maintenance, and durability analysis of engineering systems. This paper presents a reliability simulation methodology for vehicle fleets using limited data. The method can be used to estimate the reliability of non-repairable as well as repairable systems. Based on a target system reliability, it can optimally allocate individual component reliabilities using a multi-objective optimization algorithm. The algorithm establishes a Pareto front that can be used for an optimal tradeoff between reliability and the associated cost. The method uses Monte Carlo simulation to estimate the system failure rate and reliability as a function of time. The probability density functions (PDFs) of the time between failures for all components of the system are estimated using either limited data or a user-supplied MTBF (mean time between failures) and its coefficient of variation.
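A minimal sketch of the MTBF-and-COV-driven Monte Carlo idea for a hypothetical three-component series system, with lognormal time-between-failure models built from user-supplied MTBFs and coefficients of variation (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-component series system: lognormal time-between-failure
# models built from user-supplied MTBFs and coefficients of variation.
mtbf = np.array([800.0, 1200.0, 1500.0])        # hours, illustrative
cov = np.array([0.5, 0.4, 0.6])
sigma = np.sqrt(np.log(1.0 + cov**2))           # lognormal parameters
mu = np.log(mtbf) - 0.5 * sigma**2              # matching the given MTBFs

n = 100_000                                     # simulated vehicles
first_failure = rng.lognormal(mu, sigma, size=(n, 3)).min(axis=1)

t_grid = np.linspace(0.0, 2000.0, 21)
reliability = np.array([(first_failure > t).mean() for t in t_grid])
```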
Journal Article

A Variable-Size Local Domain Approach to Computer Model Validation in Design Optimization

2011-04-12
2011-01-0243
A common approach to the validation of simulation models focuses on validation throughout the entire design space. A more recent methodology validates designs as they are generated during a simulation-based optimization process. The latter method relies on validating the simulation model in a sequence of local domains. To improve its computational efficiency, this paper proposes an iterative process where the size and shape of the local domain at the current step are determined from a parametric bootstrap methodology involving maximum likelihood estimators of the unknown model parameters from the previous step. Validation is carried out in the local domain at each step. The iterative process continues until the local domain does not change from iteration to iteration during the optimization process, ensuring that a converged design optimum has been obtained.
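A minimal sketch of the parametric bootstrap step under an assumed Gaussian model: estimate the parameters by maximum likelihood, resample from the fitted model, and use the spread of the re-estimated parameters to size the next local domain. The data and the two-sigma rule for the half-width are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
test_data = rng.normal(10.0, 2.0, size=12)      # stand-in for measured responses

# Maximum likelihood estimates of the assumed Gaussian model
mu_hat, sig_hat = test_data.mean(), test_data.std(ddof=0)

# Parametric bootstrap: resample from the FITTED model and re-estimate;
# the spread of the estimates sizes the next local domain.
boot = rng.normal(mu_hat, sig_hat, size=(5000, test_data.size))
mu_boot = boot.mean(axis=1)
half_width = 2.0 * mu_boot.std()                # illustrative domain half-width
```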
Technical Paper

An Efficient Re-Analysis Methodology for Vibration of Large-Scale Structures

2007-05-15
2007-01-2326
Finite element analysis is a well-established methodology in structural dynamics. However, optimization and/or probabilistic studies can be prohibitively expensive because they require repeated FE analyses of large models. Various reanalysis methods have been proposed to efficiently calculate the dynamic response of a structure after a baseline design has been modified, without repeating the full analysis. The parametric reduced-order modeling (PROM) and combined approximation (CA) methods are two reanalysis methods that can handle large model parameter changes in a relatively efficient manner. Although both methods are promising by themselves, they cannot handle large FE models with many DOF (e.g., 100,000) and a large number of design parameters (e.g., 50), which are common in practice. In this paper, the advantages and disadvantages of the PROM and CA methods are first discussed in detail.
Journal Article

An Improved Reanalysis Method Using Parametric Reduced Order Modeling for Linear Dynamic Systems

2016-04-05
2016-01-1318
Finite element analysis is a standard tool for deterministic or probabilistic design optimization of dynamic systems. The optimization process requires repeated eigenvalue analyses, which can be computationally expensive. Several reanalysis techniques have been proposed to reduce the computational cost, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Modified Combined Approximations (MCA) method. Although the cost of reanalysis is substantially reduced, it can still be high for models with a large number of degrees of freedom and a large number of design variables. Reanalysis methods use a basis composed of eigenvectors from both the baseline and the modified designs, which are in general linearly dependent. To eliminate the linear dependency and improve accuracy, Gram-Schmidt orthonormalization is employed, which is itself costly.
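A minimal sketch of the Gram-Schmidt step on a reanalysis basis, dropping near-linearly-dependent columns (classical projection form; the matrix here is a random stand-in for baseline and modified-design eigenvectors):

```python
import numpy as np

def gram_schmidt(basis, tol=1e-10):
    """Orthonormalize basis vectors (columns), dropping near-linearly-
    dependent ones that would degrade the reduced-order model."""
    q = []
    for v in basis.T:
        v = v.copy()
        for u in q:
            v -= (u @ v) * u            # remove the component along u
        norm = np.linalg.norm(v)
        if norm > tol:                  # keep only independent directions
            q.append(v / norm)
    return np.column_stack(q)

# Random stand-in for baseline and modified-design eigenvectors
B = np.random.default_rng(3).random((100, 8))
Q = gram_schmidt(B)                     # Q.T @ Q is (near) identity
```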
Journal Article

An RBDO Method for Multiple Failure Region Problems using Probabilistic Reanalysis and Approximate Metamodels

2009-04-20
2009-01-0204
A Reliability-Based Design Optimization (RBDO) method for multiple failure regions is presented. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with an approximate global metamodel with local refinements. The latter serves as an indicator to determine the failure and safe regions. PRRA calculates very efficiently the system reliability of a design by performing a single Monte Carlo (MC) simulation. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. An “accurate-on-demand” metamodel is used in the PRRA, allowing us to handle problems with multiple disjoint failure regions and potentially multiple most-probable points (MPPs). The multiple failure regions are identified using a clustering technique. A maximin “space-filling” sampling technique is used to construct the metamodel. A vibration absorber example highlights the potential of the proposed method.
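A minimal sketch of the clustering step: a plain k-means pass (written out in NumPy rather than any particular library) separating hypothetical Monte Carlo failure samples into two disjoint failure regions. The sample clouds and the choice of k are illustrative assumptions.

```python
import numpy as np

def kmeans(points, k, n_iter=50, seed=0):
    """Plain k-means, used here to split Monte Carlo failure samples
    into disjoint failure regions (one cluster per region)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# Hypothetical failure samples concentrated in two disjoint regions
rng = np.random.default_rng(4)
fails = np.vstack([rng.normal([-3.0, 0.0], 0.3, size=(200, 2)),
                   rng.normal([3.0, 1.0], 0.3, size=(200, 2))])
labels, centers = kmeans(fails, k=2)
```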
Technical Paper

Assessment of Imprecise Reliability Using Efficient Probabilistic Reanalysis

2007-04-16
2007-01-0552
In reliability design, the data available for constructing probabilistic models are often scarce. Probabilistic models whose parameters vary in known intervals can be more suitable than Bayesian models because they do not require assumptions that are not supported by the available evidence. If we use models whose parameters vary in intervals, we need to calculate upper and lower bounds of the failure probability (or reliability) of a system in order to make design decisions. Monte Carlo simulation can be used for this purpose, but it is too expensive for all but very simple systems. This paper proposes an efficient Monte Carlo simulation approach for estimating these upper and lower probabilities. The approach is based on two ideas: a) use an efficient approach for reliability reanalysis of a system, which is introduced in this paper, and b) approximate the probability distributions of the minimum and maximum failure probabilities using extreme value statistics.
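For contrast with the paper's efficient reanalysis, the sketch below shows the naive brute-force baseline it improves on: sweep the interval-valued parameters over a grid and run a Monte Carlo estimate at each point to bound the failure probability. The limit state and intervals are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Distribution parameters known only as expert intervals (illustrative)
mu_lo, mu_hi = 9.0, 11.0
sig_lo, sig_hi = 0.8, 1.2
capacity = 12.0                 # hypothetical limit state: load > capacity

def p_fail(mu, sigma, n=100_000):
    return (rng.normal(mu, sigma, size=n) > capacity).mean()

# Brute-force sweep of the parameter box to bound the failure probability
pfs = [p_fail(m, s) for m in np.linspace(mu_lo, mu_hi, 5)
                    for s in np.linspace(sig_lo, sig_hi, 5)]
print(f"failure probability in [{min(pfs):.2e}, {max(pfs):.2e}]")
```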
Journal Article

Bootstrapping and Separable Monte Carlo Simulation Methods Tailored for Efficient Assessment of Probability of Failure of Structural Systems

2015-04-14
2015-01-0420
There is randomness in both the applied loads and the strength of systems. Therefore, to account for the uncertainty, the safety of the system must be quantified using its reliability. Monte Carlo Simulation (MCS) is widely used for probabilistic analysis because of its robustness, but its high computational cost limits the achievable accuracy. Smarslok et al. [2010] developed an improved sampling technique for reliability assessment called Separable Monte Carlo (SMC) that can significantly increase the accuracy of estimation without increasing the cost of sampling. That method, however, was applied to time-invariant problems involving two random variables. This paper extends SMC to problems with multiple random variables and develops a novel method for estimating the standard deviation of the probability of failure of a structure. The method is demonstrated and validated on the reliability assessment of an offshore wind turbine under turbulent wind loads.
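A minimal sketch of the separable MC idea for a limit state of the form response > capacity with independent inputs: instead of pairing samples one-to-one, every expensive response sample is compared against every cheap capacity sample, reusing the costly runs. Distributions and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Independent samples: a few expensive response runs, many cheap
# capacity samples (distributions and sizes are illustrative).
response = rng.lognormal(mean=0.0, sigma=0.3, size=500)
capacity = rng.normal(loc=2.2, scale=0.2, size=20_000)

# Crude MC pairs samples one-to-one; separable MC compares EVERY
# response sample with EVERY capacity sample, reusing the costly runs.
pf_smc = (response[:, None] > capacity[None, :]).mean()
print(f"estimated probability of failure: {pf_smc:.3e}")
```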
Technical Paper

Combined Approximation for Efficient Reliability Analysis of Linear Dynamic Systems

2015-04-14
2015-01-0424
The Combined Approximation (CA) method is an efficient reanalysis method that aims at reducing the cost of optimization problems. CA uses the results of a single exact analysis, and it is suitable for different types of structures and design variables. The second author has utilized CA to calculate the frequency response function of a system at a frequency of interest from results at a nearby frequency, showing that CA yields accurate results for small frequency perturbations. This work demonstrates a methodology that utilizes CA to reduce the cost of Monte Carlo simulation (MCS) of linear systems under random dynamic loads. The main idea is to divide the power spectral density function (PSD) of the input load into several frequency bins before calculating the load realizations.
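A minimal sketch of generating one load realization from a binned PSD in the spirit of the spectral representation method: superpose cosines whose amplitudes come from the PSD bins and whose phase angles are independent uniform random variables. The PSD shape and discretization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Discretize a one-sided PSD S(w) into frequency bins (illustrative PSD)
w = np.linspace(0.1, 20.0, 200)
dw = w[1] - w[0]
S = 1.0 / (1.0 + w**2)

# Spectral representation: superpose cosines with amplitudes set by the
# PSD bins and independent uniform random phase angles.
t = np.linspace(0.0, 10.0, 2000)
phases = rng.uniform(0.0, 2.0 * np.pi, size=w.size)
amps = np.sqrt(2.0 * S * dw)
load = (amps[:, None] * np.cos(w[:, None] * t[None, :] + phases[:, None])).sum(axis=0)
```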
Journal Article

Computational Efficiency Improvements in Topography Optimization Using Reanalysis

2016-04-05
2016-01-1395
To improve fuel economy, there is a trend in the automotive industry to use lightweight, high-strength materials. Automotive body structures are composed of several panels which must be downsized to reduce weight. Because this affects NVH (Noise, Vibration, and Harshness) performance, engineers are challenged to recover the panel stiffness lost to down-gaging in order to improve the structure-borne noise transmitted through the lightweight panels in the 100-300 Hz frequency range, where most of the booming and low-to-medium frequency noise occurs. The loss in performance can be recovered by optimizing panel geometry using beading or by damping treatment. Topography optimization is a special class of shape optimization for changing sheet metal shapes by introducing beads. A large number of design variables can be handled, and the process is easy to set up in commercial codes. However, optimization methods are computationally intensive because of repeated full-order analyses.
Technical Paper

Design Under Uncertainty and Assessment of Performance Reliability of a Dual-Use Medium Truck with Hydraulic-Hybrid Powertrain and Fuel Cell Auxiliary Power Unit

2005-04-11
2005-01-1396
Medium trucks constitute a large market segment of the commercial transportation sector, and are also used widely for military tactical operations. Recent technological advances in hybrid powertrains and fuel cell auxiliary power units have enabled design alternatives that can improve fuel economy and reduce emissions dramatically. However, deterministic design optimization of these configurations may yield designs that are optimal with respect to performance but raise concerns regarding the reliability of achieving that performance over lifetime. In this article we identify and quantify uncertainties due to modeling approximations or incomplete information. We then model their propagation using Monte Carlo simulation and perform sensitivity analysis to isolate statistically significant uncertainties. Finally, we formulate and solve a series of reliability-based optimization problems and quantify tradeoffs between optimality and reliability.
Journal Article

Efficient Global Surrogate Modeling Based on Multi-Layer Sampling

2018-04-03
2018-01-0616
Global surrogate modeling aims to build a surrogate model with high accuracy over the whole design domain. A major challenge in achieving this objective is reducing the number of evaluations of the original computer simulation model. To date, the most widely used approach for global surrogate modeling is the adaptive surrogate modeling method. It starts with an initial surrogate model, which is then refined adaptively using criteria such as the mean square error (MSE) or the maximin distance. Current methods, however, may not be able to effectively construct a global surrogate model when the underlying black-box function is highly nonlinear in only certain regions. A surrogate modeling method that can allocate more training points in regions of high nonlinearity is needed to overcome this challenge. This article proposes an efficient global surrogate modeling method based on a multi-layer sampling scheme.
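A hedged one-dimensional sketch of the generic adaptive-refinement loop the paragraph describes (not the paper's multi-layer scheme): fit an RBF surrogate, locate the largest error on a check grid, and add a training point there. A cheap analytic stand-in plays the expensive simulation so the error can be checked directly; in practice a cross-validation error estimate would replace it.

```python
import numpy as np

def f(x):
    """Cheap analytic stand-in for the expensive simulation."""
    return np.sin(5.0 * x) + 0.3 * x

def rbf_fit(x, y, eps=3.0):
    K = np.exp(-(eps * (x[:, None] - x[None, :])) ** 2)
    return np.linalg.solve(K + 1e-10 * np.eye(len(x)), y)

def rbf_eval(xq, x, w, eps=3.0):
    return np.exp(-(eps * (xq[:, None] - x[None, :])) ** 2) @ w

# Adaptive loop: refine where the surrogate disagrees most with the
# check grid, concentrating points in the most nonlinear regions.
x = np.linspace(0.0, 1.0, 5)
grid = np.linspace(0.0, 1.0, 400)
for _ in range(10):
    w = rbf_fit(x, f(x))
    err = np.abs(rbf_eval(grid, x, w) - f(grid))
    x = np.append(x, grid[err.argmax()])    # add a point in the worst region
```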
Journal Article

Efficient Probabilistic Reanalysis and Optimization of a Discrete Event System

2011-04-12
2011-01-1081
This paper presents a methodology to evaluate and optimize discrete event systems, such as an assembly line or a call center. First, the methodology estimates the performance of a system for a single probability distribution of the inputs. Probabilistic Reanalysis (PRRA) uses this information to evaluate the effect of changes in the system configuration on its performance. PRRA is integrated with a program to optimize the system. The proposed methodology is dramatically more efficient than one requiring a new Monte Carlo simulation each time we change the system. We demonstrate the approach on a drilling center and an electronic parts factory.
Journal Article

Efficient Random Vibration Analysis Using Markov Chain Monte Carlo Simulation

2012-04-16
2012-01-0067
Reliability assessment of dynamic systems with low failure probability can be very expensive. This paper presents and demonstrates a method that uses the Metropolis-Hastings algorithm to sample from an optimal probability density function (PDF) of the random variables. This function is the true PDF truncated over the failure region. For a system subjected to time-varying excitation, Shinozuka's method is employed to generate time histories of the excitation. Random values of the frequencies and phase angles of the excitation are drawn from the optimal PDF. It is shown that running subset simulation with the proposed approach, which uses Shinozuka's method, is more efficient than the original subset simulation. The main reasons are that the approach involves only 10 to 20 random variables and that it takes advantage of the symmetry of the expression for the displacement as a function of the inputs. The paper demonstrates the method on two examples.
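A minimal sketch of Metropolis-Hastings sampling from a truncated "optimal" PDF, here a standard normal restricted to a hypothetical failure region x > 3, using a Gaussian random-walk proposal (dimension, threshold, and step size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(8)

def target(x):
    """Standard normal PDF truncated to the failure region x > 3
    (unnormalized); normalization is not needed for Metropolis-Hastings."""
    return np.exp(-0.5 * x**2) if x > 3.0 else 0.0

# Gaussian random-walk Metropolis-Hastings
x, chain = 3.5, []
for _ in range(20_000):
    proposal = x + rng.normal(0.0, 0.5)
    if rng.random() < target(proposal) / target(x):   # accept/reject
        x = proposal
    chain.append(x)
samples = np.array(chain[2_000:])                     # discard burn-in
```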
Journal Article

Estimation of High-Cycle Fatigue Life by using Re-analysis

2012-04-16
2012-01-0066
In the design of real-life systems, such as a car suspension, an offshore platform, or a wind turbine, there are significant uncertainties in the models of the inputs. For example, scarcity of data leads to inaccuracies in the power spectral density function of the waves and the probability distribution of the wind speed. Therefore, it is necessary to evaluate the performance and safety of a system for different probability distributions, which is computationally expensive or even impractical. This paper presents a methodology to efficiently assess the fatigue life of structures for different power spectra of the applied loads. We accomplish this by reweighting the incremental damage calculated in a single simulation. We demonstrate the accuracy and efficiency of the proposed method on an example involving a nonlinear quarter car under a random dynamic load. The fatigue life of the suspension spring under loads generated by a sampling spectrum is calculated.
Technical Paper

Evidence Theory Approach and Bayesian Approach for Modeling Uncertainty when Information is Imprecise

2003-03-03
2003-01-0144
This paper investigates the potential of Evidence Theory (ET) and Bayesian Theory (BT) for decision-making under uncertainty when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced, and the ways these theories model uncertainties, propagate them through systems, and assess system safety are presented. The ET and BT approaches are demonstrated and compared on examples involving an algebraic function whose input variables are described by intervals provided by experts. It is recommended that a decision maker compute both the Bayesian probability of events and their lower and upper probabilities using ET when evidence from experts is imprecise. A large gap between the lower and upper probabilities suggests that more information should be collected before making a decision. If this is not feasible, then the Bayesian probabilities can help make a decision.
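A minimal sketch of how interval evidence yields lower and upper probabilities in ET: each expert interval carries a basic belief mass, and for a monotone response the belief (Bel) sums the masses whose interval image lies entirely inside the event, while the plausibility (Pl) sums those that merely intersect it. The focal intervals, masses, response function, and threshold below are illustrative assumptions.

```python
# Expert evidence: intervals for an input with basic belief masses
focal = [((1.0, 2.0), 0.5), ((1.5, 3.0), 0.3), ((2.5, 4.0), 0.2)]

def g(x):
    """Monotonically increasing response (illustrative)."""
    return 2.0 * x + 1.0

# Event of interest: g(x) <= 6. For increasing g, an interval [lo, hi]
# maps to [g(lo), g(hi)]; Bel counts masses fully inside the event,
# Pl counts those that merely intersect it.
threshold = 6.0
bel = sum(m for (lo, hi), m in focal if g(hi) <= threshold)
pl = sum(m for (lo, hi), m in focal if g(lo) <= threshold)
print(f"Bel = {bel:.2f}, Pl = {pl:.2f}")   # lower and upper probabilities
```

With these illustrative numbers, Bel = 0.50 and Pl = 1.00; such a wide gap is exactly the situation where the paper recommends collecting more information before deciding.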