Search Results

Technical Paper

A Cost-Driven Method for Design Optimization Using Validated Local Domains

2013-04-08
2013-01-1385
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models in the entire design space can be costly, we have previously proposed an approach where design optimization and model validation are concurrently performed using a sequential approach with variable-size local domains. We used test data and statistical bootstrap methods to size each local domain where the prediction model is considered validated and where design optimization is performed. The method proceeds iteratively until the optimum design is obtained. This method, however, requires test data to be available in each local domain along the optimization path. In this paper, we refine our methodology by using polynomial regression to predict the size and shape of a local domain at some steps along the optimization process without using test data.
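A minimal sketch of the extrapolation idea, assuming hypothetical domain-size data and ordinary polynomial regression (the paper's exact regression setup is not reproduced here):

```python
# Sketch: fit a polynomial to the history of validated local-domain sizes and
# predict the size at the next optimization step, avoiding new test data there.
# The iteration/size values below are hypothetical.
import numpy as np

steps = np.array([0, 1, 2, 3, 4], dtype=float)   # optimization iterations
radii = np.array([2.0, 1.6, 1.3, 1.1, 1.0])      # bootstrap-derived domain sizes

coeffs = np.polyfit(steps, radii, deg=2)         # quadratic trend in domain size
next_radius = np.polyval(coeffs, 5.0)            # predicted size at step 5
print(f"predicted local-domain size at next step: {next_radius:.3f}")
```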
Journal Article

A Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1102
Computer-aided engineering (CAE) is an important tool routinely used to simulate complex engineering systems. Virtual simulations enhance engineering insight into prospective designs and potential design issues and can limit the need for expensive engineering prototypes. For complex engineering systems, however, the effectiveness of virtual simulations is often hindered by excessive computational cost. To minimize the cost of running expensive computer simulations, approximate models of the original model (often called surrogate models or metamodels) can provide sufficient accuracy at a lower computing overhead compared to repeated runs of a full simulation. Metamodel accuracy improves if the metamodel is constructed using space-filling designs of experiments (DOEs). The latter provide a collection of sample points in the design space, preferably covering the entire space.
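One common way to build such a space-filling DOE is a maximin Latin hypercube; the sketch below is a generic illustration, not the group-based algorithm of the paper:

```python
# Sketch: among random Latin hypercube candidates, keep the design whose
# minimum pairwise point distance is largest (maximin space-filling criterion).
import numpy as np
from scipy.spatial.distance import pdist

def latin_hypercube(n, d, rng):
    # One stratified sample per level in each dimension, randomly permuted.
    levels = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (levels + rng.random((n, d))) / n

rng = np.random.default_rng(0)
best, best_score = None, -np.inf
for _ in range(200):                       # candidate designs
    cand = latin_hypercube(20, 3, rng)     # 20 points in a 3-D unit cube
    score = pdist(cand).min()              # maximin criterion
    if score > best_score:
        best, best_score = cand, score
print(f"selected design: min pairwise distance = {best_score:.3f}")
```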
Journal Article

A Methodology for Fatigue Life Estimation of Linear Vibratory Systems under Non-Gaussian Loads

2017-03-28
2017-01-0197
Fatigue life estimation, reliability and durability are important in acquisition, maintenance and operation of vehicle systems. Fatigue life is random because of the stochastic load, the inherent variability of material properties, and the uncertainty in the definition of the S-N curve. The commonly used fatigue life estimation methods calculate the mean (not the distribution) of fatigue life under Gaussian loads using the potentially restrictive narrow-band assumption. In this paper, a general methodology is presented to calculate the statistics of fatigue life for a linear vibratory system under stationary, non-Gaussian loads considering the effects of skewness and kurtosis. The input loads are first characterized using their first four moments (mean, standard deviation, skewness and kurtosis) and a correlation structure equivalent to a given Power Spectral Density (PSD).
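The moment-based load characterization the methodology starts from can be illustrated with a short sketch (the load record below is synthetic, not the paper's data):

```python
# Sketch: characterize a stationary load record by its first four moments.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
load = rng.gumbel(loc=0.0, scale=1.0, size=100_000)  # hypothetical non-Gaussian load

print(f"mean     = {load.mean():.3f}")
print(f"std      = {load.std(ddof=1):.3f}")
print(f"skewness = {stats.skew(load):.3f}")
print(f"kurtosis = {stats.kurtosis(load, fisher=False):.3f}")  # 3.0 for Gaussian
```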
Technical Paper

A Methodology of Design for Fatigue Using an Accelerated Life Testing Approach with Saddlepoint Approximation

2019-04-02
2019-01-0159
We present an Accelerated Life Testing (ALT) methodology along with a design-for-fatigue approach, using Gaussian or non-Gaussian excitations. The accuracy of fatigue life prediction at nominal loading conditions is affected by model and material uncertainty. This uncertainty is reduced by performing tests at a higher loading level, resulting in a reduction in test duration. Based on the data obtained from experiments, we formulate an optimization problem to calculate the Maximum Likelihood Estimator (MLE) values of the uncertain model parameters. In our proposed ALT method, we lift all the assumptions on the type of life distribution or the stress-life relationship and use the Saddlepoint Approximation (SPA) method to calculate the fatigue life Probability Density Functions (PDFs).
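The MLE step can be sketched as minimizing a negative log-likelihood; a lognormal life model is assumed below purely for illustration (the paper deliberately avoids fixing the life distribution):

```python
# Sketch: maximum likelihood estimation of life-distribution parameters from
# accelerated-test lives, via numerical minimization of the negative
# log-likelihood. The data and the lognormal model are illustrative assumptions.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
lives = rng.lognormal(mean=10.0, sigma=0.5, size=50)   # hypothetical test lives

def neg_log_likelihood(theta):
    mu, log_sigma = theta                              # log-sigma keeps sigma > 0
    return -stats.lognorm.logpdf(lives, s=np.exp(log_sigma),
                                 scale=np.exp(mu)).sum()

res = optimize.minimize(neg_log_likelihood, x0=[np.log(lives).mean(), 0.0])
print(f"MLE: mu = {res.x[0]:.3f}, sigma = {np.exp(res.x[1]):.3f}")
```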
Journal Article

A Nonparametric Bootstrap Approach to Variable-size Local-domain Design Optimization and Computer Model Validation

2012-04-16
2012-01-0226
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models in the entire design space can be costly, a recent approach was proposed where design optimization and model validation were concurrently performed using a sequential approach with both fixed and variable-size local domains. The variable-size approach used parametric distributions such as Gaussian to quantify the variability in test data and model predictions, and a maximum likelihood estimation to calibrate the prediction model. Also, a parametric bootstrap method was used to size each local domain. In this article, we generalize the variable-size approach by not assuming any distribution such as Gaussian. A nonparametric bootstrap methodology is instead used to size the local domains. We expect its generality to be useful in applications where distributional assumptions are difficult to verify or are not met at all.
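A minimal sketch of the nonparametric bootstrap itself (a generic illustration: the data and statistic are hypothetical, and the paper applies the resampling to local-domain sizing rather than a simple mean):

```python
# Sketch: nonparametric bootstrap -- resample the test data with replacement,
# with no Gaussian assumption, and read variability off the resampled statistic.
import numpy as np

rng = np.random.default_rng(3)
test_data = rng.weibull(1.5, size=30)            # hypothetical non-Gaussian data

boot_means = np.array([
    rng.choice(test_data, size=test_data.size, replace=True).mean()
    for _ in range(5000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])  # 95% percentile interval
print(f"bootstrap 95% interval for the mean: [{lo:.3f}, {hi:.3f}]")
```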
Journal Article

A Re-Analysis Methodology for System RBDO Using a Trust Region Approach with Local Metamodels

2010-04-12
2010-01-0645
A simulation-based, system reliability-based design optimization (RBDO) method is presented that can handle problems with multiple failure regions and correlated random variables. Copulas are used to represent the correlation. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with a trust-region optimization approach and local metamodels covering each trust region. PRRA calculates very efficiently the system reliability of a design by performing a single Monte Carlo (MC) simulation per trust region. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. The PRRA method is based on importance sampling. It provides accurate results if the support of the sampling PDF contains the support of the joint PDF of the input random variables. The sequential, trust-region optimization approach satisfies this requirement.
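The importance-sampling requirement stated above (sampling support covering the input support) can be illustrated with a one-dimensional sketch; the limit state and distributions are hypothetical:

```python
# Sketch: importance sampling -- draw from a wider sampling PDF and reweight
# each sample by the density ratio to estimate a small failure probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
f = stats.norm(0.0, 1.0)            # input PDF
h = stats.norm(0.0, 2.0)            # sampling PDF; support covers that of f

x = h.rvs(size=100_000, random_state=rng)
fails = x > 3.5                     # hypothetical failure region
weights = f.pdf(x) / h.pdf(x)       # likelihood ratio
print(f"estimated failure probability: {np.mean(fails * weights):.2e}")
# exact value for comparison: 1 - Phi(3.5) ~= 2.33e-4
```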
Journal Article

A Simulation and Optimization Methodology for Reliability of Vehicle Fleets

2011-04-12
2011-01-0725
Understanding reliability is critical in design, maintenance and durability analysis of engineering systems. A reliability simulation methodology is presented in this paper for vehicle fleets using limited data. The method can be used to estimate the reliability of non-repairable as well as repairable systems. It can optimally allocate, based on a target system reliability, individual component reliabilities using a multi-objective optimization algorithm. The algorithm establishes a Pareto front that can be used for optimal tradeoff between reliability and the associated cost. The method uses Monte Carlo simulation to estimate the system failure rate and reliability as a function of time. The probability density functions (PDFs) of the time between failures for all components of the system are estimated using either limited data or a user-supplied MTBF (mean time between failures) and its coefficient of variation.
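A stripped-down version of the simulation idea, assuming a single component with a user-supplied MTBF and coefficient of variation (the lognormal time-between-failures model here is an illustrative choice, not the paper's):

```python
# Sketch: Monte Carlo count of failures over a planning horizon, with
# time-between-failures drawn from a lognormal matched to an assumed MTBF/COV.
import numpy as np

rng = np.random.default_rng(5)
mtbf, cov = 1000.0, 0.5                     # hypothetical component inputs
sigma = np.sqrt(np.log(1 + cov**2))         # lognormal parameters matching
mu = np.log(mtbf) - 0.5 * sigma**2          # the given mean and COV

horizon, n_sims = 5000.0, 10_000
failures = np.zeros(n_sims)
for i in range(n_sims):
    t = 0.0
    while True:
        t += rng.lognormal(mu, sigma)       # next time between failures
        if t > horizon:
            break
        failures[i] += 1
print(f"mean failures over the horizon: {failures.mean():.2f}")
```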
Journal Article

A Variable-Size Local Domain Approach to Computer Model Validation in Design Optimization

2011-04-12
2011-01-0243
A common approach to the validation of simulation models focuses on validation throughout the entire design space. A more recent methodology validates designs as they are generated during a simulation-based optimization process. The latter method relies on validating the simulation model in a sequence of local domains. To improve its computational efficiency, this paper proposes an iterative process, where the size and shape of local domains at the current step are determined from a parametric bootstrap methodology involving maximum likelihood estimators of unknown model parameters from the previous step. Validation is carried out in the local domain at each step. The iterative process continues until the local domain does not change from iteration to iteration during the optimization process, ensuring that a converged design optimum has been obtained.
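For contrast with the nonparametric variant above, the parametric bootstrap step can be sketched as resampling from a fitted (here Gaussian) model; the data and statistic are hypothetical:

```python
# Sketch: parametric bootstrap -- fit a Gaussian by maximum likelihood, then
# resample from the fitted model to quantify variability of an estimator.
import numpy as np

rng = np.random.default_rng(6)
data = rng.normal(5.0, 1.2, size=25)                # hypothetical test data

mu_hat, sigma_hat = data.mean(), data.std(ddof=0)   # Gaussian MLEs
boot_mu = np.array([
    rng.normal(mu_hat, sigma_hat, size=data.size).mean()
    for _ in range(5000)
])
print(f"parametric-bootstrap std of the mean: {boot_mu.std():.3f}")
```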
Journal Article

An Improved Reanalysis Method Using Parametric Reduced Order Modeling for Linear Dynamic Systems

2016-04-05
2016-01-1318
Finite element analysis is a standard tool for deterministic or probabilistic design optimization of dynamic systems. The optimization process requires repeated eigenvalue analyses which can be computationally expensive. Several reanalysis techniques have been proposed to reduce the computational cost, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Modified Combined Approximations (MCA) method. Although the cost of reanalysis is substantially reduced, it can still be high for models with a large number of degrees of freedom and a large number of design variables. Reanalysis methods use a basis composed of eigenvectors from both the baseline and the modified designs, which are in general linearly dependent. To eliminate the linear dependency and improve accuracy, Gram-Schmidt orthonormalization is employed, which is itself costly.
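The orthonormalization step referred to above can be sketched with modified Gram-Schmidt on a combined eigenvector basis; the matrix sizes and drop tolerance are illustrative:

```python
# Sketch: modified Gram-Schmidt on a basis of baseline plus modified-design
# eigenvectors; nearly linearly dependent columns are dropped below a tolerance.
import numpy as np

def modified_gram_schmidt(V, tol=1e-8):
    basis = []
    for v in V.T:
        w = v.astype(float).copy()
        for q in basis:
            w -= (q @ w) * q             # remove components along kept vectors
        norm = np.linalg.norm(w)
        if norm > tol:                   # keep only independent directions
            basis.append(w / norm)
    return np.column_stack(basis)

rng = np.random.default_rng(7)
phi_base = rng.random((100, 5))                              # baseline modes
phi_mod = phi_base + 1e-10 * rng.standard_normal((100, 5))   # nearly dependent
Q = modified_gram_schmidt(np.hstack([phi_base, phi_mod]))
print(f"kept {Q.shape[1]} of 10 candidate basis vectors")
```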
Journal Article

Computational Efficiency Improvements in Topography Optimization Using Reanalysis

2016-04-05
2016-01-1395
To improve fuel economy, there is a trend in the automotive industry to use lightweight, high-strength materials. Automotive body structures are composed of several panels which must be downsized to reduce weight. Because this affects NVH (Noise, Vibration and Harshness) performance, engineers are challenged to recover the panel stiffness lost to down-gaging in order to improve the structure-borne noise transmitted through the lightweight panels in the frequency range of 100-300 Hz, where most of the booming and low-to-medium frequency noise occurs. The loss in performance can be recovered by optimizing panel geometry using beading or damping treatment. Topography optimization is a special class of shape optimization for changing sheet metal shapes by introducing beads. A large number of design variables can be handled and the process is easy to set up in commercial codes. However, optimization methods are computationally intensive because of repeated full-order analyses.
Technical Paper

Design Under Uncertainty and Assessment of Performance Reliability of a Dual-Use Medium Truck with Hydraulic-Hybrid Powertrain and Fuel Cell Auxiliary Power Unit

2005-04-11
2005-01-1396
Medium trucks constitute a large market segment of the commercial transportation sector, and are also used widely for military tactical operations. Recent technological advances in hybrid powertrains and fuel cell auxiliary power units have enabled design alternatives that can improve fuel economy and reduce emissions dramatically. However, deterministic design optimization of these configurations may yield designs that are optimal with respect to performance but raise concerns regarding the reliability of achieving that performance over the lifetime. In this article, we identify and quantify uncertainties due to modeling approximations or incomplete information. We then model their propagation using Monte Carlo simulation and perform sensitivity analysis to isolate statistically significant uncertainties. Finally, we formulate and solve a series of reliability-based optimization problems and quantify tradeoffs between optimality and reliability.
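The propagation-and-screening workflow can be illustrated in miniature; the response model, input distributions, and correlation-based sensitivity measure below are all assumptions, not the paper's truck model:

```python
# Sketch: Monte Carlo propagation of input uncertainty through a toy response,
# with a crude correlation-based ranking of which inputs matter statistically.
import numpy as np

rng = np.random.default_rng(8)
n = 50_000
mass = rng.normal(9000.0, 300.0, n)    # hypothetical uncertain inputs
drag = rng.normal(0.6, 0.05, n)
eff = rng.normal(0.45, 0.02, n)

fuel = 0.002 * mass + 40.0 * drag - 30.0 * eff   # hypothetical response model
for name, x in [("mass", mass), ("drag", drag), ("efficiency", eff)]:
    r = np.corrcoef(x, fuel)[0, 1]
    print(f"sensitivity of fuel use to {name}: r = {r:+.2f}")
```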
Journal Article

Efficient Global Surrogate Modeling Based on Multi-Layer Sampling

2018-04-03
2018-01-0616
Global surrogate modeling aims to build a surrogate model with high accuracy over the whole design domain. A major challenge in achieving this objective is reducing the number of function evaluations of the original computer simulation model. To date, the most widely used approach for global surrogate modeling is adaptive surrogate modeling. It starts with an initial surrogate model, which is then refined adaptively using criteria such as the mean square error (MSE) or maximizing the minimum distance. It is observed that current methods may not be able to effectively construct a global surrogate model when the underlying black-box function is highly nonlinear in only certain regions. A new surrogate modeling method that can allocate more training points in regions with high nonlinearity is needed to overcome this challenge. This article proposes an efficient global surrogate modeling method based on a multi-layer sampling scheme.
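A generic adaptive-refinement loop (MSE-style point selection with a Gaussian-process surrogate, not the paper's multi-layer scheme) looks like this:

```python
# Sketch: adaptive surrogate refinement -- repeatedly fit a GP surrogate and
# add the candidate point with the largest predictive standard deviation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def f(x):                                    # stand-in for an expensive model
    return np.sin(3 * x) + 0.5 * x**2

X = np.linspace(0, 2, 5).reshape(-1, 1)      # initial training points
for _ in range(10):                          # refinement iterations
    gp = GaussianProcessRegressor(alpha=1e-8).fit(X, f(X).ravel())
    cand = np.linspace(0, 2, 201).reshape(-1, 1)
    _, std = gp.predict(cand, return_std=True)
    x_new = cand[np.argmax(std)]             # most uncertain candidate
    X = np.vstack([X, x_new.reshape(1, 1)])
print(f"refined surrogate uses {X.shape[0]} training points")
```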
Journal Article

Efficient Probabilistic Reanalysis and Optimization of a Discrete Event System

2011-04-12
2011-01-1081
This paper presents a methodology to evaluate and optimize discrete event systems, such as an assembly line or a call center. First, the methodology estimates the performance of a system for a single probability distribution of the inputs. Probabilistic Reanalysis (PRRA) uses this information to evaluate the effect of changes in the system configuration on its performance. PRRA is integrated with a program to optimize the system. The proposed methodology is dramatically more efficient than one requiring a new Monte Carlo simulation each time we change the system. We demonstrate the approach on a drilling center and an electronic parts factory.
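The reanalysis trick can be sketched with a single reweighted run; the queueing-style input and performance measure are hypothetical:

```python
# Sketch: probabilistic reanalysis -- reuse one Monte Carlo run by reweighting
# each replication with the density ratio of a changed input distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
service = rng.exponential(scale=2.0, size=100_000)  # original input samples
slow = service > 5.0                                # hypothetical performance metric

old_pdf = stats.expon(scale=2.0)
new_pdf = stats.expon(scale=1.8)                    # proposed reconfiguration
w = new_pdf.pdf(service) / old_pdf.pdf(service)     # likelihood ratios
print(f"reweighted fraction of slow jobs: {np.mean(slow * w):.4f}")
# exact under the new scale: exp(-5/1.8) ~= 0.062; no new simulation needed
```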
Journal Article

Enhancing Decision Topology Assessment in Engineering Design

2014-04-01
2014-01-0719
Implications of decision analysis (DA) on engineering design are important and well-documented. However, widespread adoption has not occurred. To that end, the authors recently proposed decision topologies (DT) as a visual method for representing decision situations and proved that they are entirely consistent with normative decision analysis. This paper addresses the practical issue of assessing the DTs of a designer using their responses. As in classical DA, this step is critical to encoding the decision maker's (DM's) preferences so that further analysis and mathematical optimization can be performed on the correct set of preferences. We show how multi-attribute DTs can be directly assessed from DM responses. Furthermore, we show that preferences under uncertainty can be trivially incorporated and that topologies can be constructed from single-attribute topologies, similarly to multi-linear functions in utility analysis. This incremental construction simplifies the process of topology construction.
Journal Article

Flexible Design and Operation of a Smart Charging Microgrid

2014-04-01
2014-01-0716
The reliability theory of repairable systems is vastly different from that of non-repairable systems. The authors have recently proposed a ‘decision-based’ framework to design and maintain repairable systems for optimal performance and reliability using a set of metrics such as minimum failure-free period, number of failures in the planning horizon (lifecycle), and cost. The optimal solution includes the initial design, the system maintenance throughout the planning horizon, and the protocol to operate the system. In this work, we extend this idea by incorporating flexibility and demonstrate our approach using a smart charging electric microgrid architecture. The flexibility is realized by allowing the architecture to change with time. Our approach “learns” the working characteristics of the microgrid. We use actual load and supply data over a short time to quantify the load and supply random processes and also establish the correlation between them.
Technical Paper

Managing the Computational Cost in a Monte Carlo Simulation by Considering the Value of Information

2012-04-16
2012-01-0915
Monte Carlo simulation is a popular tool for reliability assessment because of its robustness and ease of implementation. A major concern with this method is its computational cost; standard Monte Carlo simulation requires quadrupling the number of replications to halve the standard deviation of the estimated failure probability. Efforts to increase efficiency focus on intelligent sampling procedures and methods for efficient calculation of the performance function of a system. This paper proposes a new method to manage cost that views design as a decision among alternatives with uncertain reliabilities. Information from a simulation has value only if it enables the designer to make a better choice among the alternative options. Consequently, the value of information from the simulation is equal to the gain from using this information to improve the decision. Using the method, a designer can determine the number of replications that are worth performing.
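The cost concern is easy to make concrete: the standard error of an estimated failure probability scales as 1/sqrt(N), as in this sketch (the probability value is hypothetical):

```python
# Sketch: halving the standard error of a Monte Carlo failure-probability
# estimate requires quadrupling the number of replications N.
import numpy as np

p = 1e-3                                  # hypothetical failure probability
for n in [10_000, 40_000, 160_000]:
    se = np.sqrt(p * (1 - p) / n)         # binomial standard error
    print(f"N = {n:>7,d}: standard error = {se:.2e}")
```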
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation if a suitable sampling distribution is used, in many design problems it is too expensive. The authors have previously proposed a method to manage the computational cost in standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends their method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of the choice of the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
Technical Paper

Modeling the Stiffness and Damping Properties of Styrene-Butadiene Rubber

2011-05-17
2011-01-1628
Styrene-Butadiene Rubber (SBR), a copolymer of butadiene and styrene, is widely used in the automotive industry due to its high durability and resistance to abrasion, oils and oxidation. Some of the common applications include tires, vibration isolators, and gaskets, among others. This paper characterizes the dynamic behavior of SBR and discusses the suitability of a viscoelastic model of elastomers, known as the Kelvin model, from a mathematical and physical point of view. An optimization algorithm is used to estimate the parameters of the Kelvin model. The resulting model was shown to produce reasonable approximations of measured dynamic stiffness. The model was also used to calculate the self-heating of the elastomer due to energy dissipation by the viscous damping components in the model. Developing such a predictive capability is essential in understanding the dynamic behavior of elastomers, considering that their dynamic stiffness can in general depend on temperature.
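The Kelvin model discussed above is a spring and viscous damper in parallel, giving a complex dynamic stiffness k + iωc; the parameter values in this sketch are hypothetical:

```python
# Sketch: Kelvin (spring-damper in parallel) dynamic stiffness k + i*omega*c;
# magnitude and loss angle both grow with excitation frequency.
import numpy as np

k, c = 2.0e5, 150.0                   # assumed stiffness [N/m], damping [N*s/m]
freq = np.array([10.0, 50.0, 100.0])  # excitation frequencies [Hz]
K_dyn = k + 1j * (2 * np.pi * freq) * c

for f_hz, K in zip(freq, K_dyn):
    print(f"{f_hz:5.0f} Hz: |K*| = {abs(K):.3e} N/m, "
          f"loss angle = {np.degrees(np.angle(K)):.1f} deg")
```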
Journal Article

Multi-Objective Decision Making under Uncertainty and Incomplete Knowledge of Designer Preferences

2011-04-12
2011-01-1080
Multi-attribute decision making and multi-objective optimization complement each other. Often, while making design decisions involving multiple attributes, a Pareto front is generated using a multi-objective optimizer. The end user then chooses the optimal design from the Pareto front based on his/her preferences. This seemingly simple methodology requires substantial modification if uncertainty is present. We explore two kinds of uncertainty in this paper: uncertainty in the decision variables, which we call inherent design problem (IDP) uncertainty, and uncertainty in knowledge of the preferences of the decision maker, which we refer to as preference assessment (PA) uncertainty. From a purely utility-theoretic perspective, a rational decision maker maximizes his or her expected multi-attribute utility.
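Choosing from a Pareto front by expected utility under uncertainty can be sketched as follows; the additive utility, its weights, and the attribute noise (standing in for IDP uncertainty) are all assumptions:

```python
# Sketch: rank Pareto designs by Monte Carlo expected utility when the
# attribute values themselves are uncertain.
import numpy as np

rng = np.random.default_rng(10)
pareto = [(0.8, 0.30), (0.6, 0.20), (0.4, 0.10)]  # hypothetical (perf, cost)

def expected_utility(perf, cost, n=20_000):
    perf_s = rng.normal(perf, 0.05, n)            # uncertainty in attributes
    cost_s = rng.normal(cost, 0.02, n)
    return np.mean(0.7 * perf_s - 0.3 * cost_s)   # assumed additive utility

best = max(pareto, key=lambda d: expected_utility(*d))
print(f"design maximizing expected utility: {best}")
```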
Journal Article

Optimal Preventive Maintenance Schedule Based on Lifecycle Cost and Time-Dependent Reliability

2012-04-16
2012-01-0070
Reliability is an important engineering requirement for consistently delivering acceptable product performance through time. It also affects the scheduling of preventive maintenance. Reliability usually degrades with time, therefore increasing the lifecycle cost due to more frequent failures, which result in increased warranty costs, costly repairs, and loss of market share. In a lifecycle-cost-based design, we must account for product quality and preventive maintenance using time-dependent reliability. Quality is a measure of our confidence that the product conforms to specifications as it leaves the factory. For a repairable system, preventive maintenance is scheduled to avoid failures, unnecessary production loss and safety violations. This article proposes a methodology to obtain the optimal schedule for preventive maintenance using time-dependent reliability principles.
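A classic age-replacement calculation illustrates the tradeoff described above (the Weibull parameters and cost ratio are assumptions, and the paper's full lifecycle model is richer):

```python
# Sketch: choose the preventive-maintenance interval T minimizing expected
# cost per unit time under a Weibull time-to-failure model (age replacement).
import numpy as np
from scipy import integrate, stats

beta, eta = 2.5, 1000.0                 # assumed Weibull shape and scale
c_pm, c_fail = 1.0, 10.0                # assumed preventive vs. failure cost
life = stats.weibull_min(beta, scale=eta)

def cost_rate(T):
    R = life.sf(T)                      # probability of surviving to T
    cycle_cost = c_pm * R + c_fail * (1 - R)
    cycle_len, _ = integrate.quad(life.sf, 0, T)  # E[min(life, T)]
    return cycle_cost / cycle_len

grid = np.linspace(100, 2000, 200)
T_opt = grid[np.argmin([cost_rate(T) for T in grid])]
print(f"optimal preventive-maintenance interval: about {T_opt:.0f} hours")
```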