Search Results

Technical Paper

A Methodology of Design for Fatigue Using an Accelerated Life Testing Approach with Saddlepoint Approximation

2019-04-02
2019-01-0159
We present an Accelerated Life Testing (ALT) methodology along with a design-for-fatigue approach, using Gaussian or non-Gaussian excitations. The accuracy of fatigue life prediction at nominal loading conditions is affected by model and material uncertainty. This uncertainty is reduced by performing tests at a higher loading level, resulting in a shorter test duration. Based on the data obtained from experiments, we formulate an optimization problem to calculate the Maximum Likelihood Estimator (MLE) values of the uncertain model parameters. In our proposed ALT method, we lift all assumptions on the type of life distribution or the stress-life relationship and use the Saddlepoint Approximation (SPA) method to calculate the fatigue life Probability Density Functions (PDFs).
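
As a rough illustration of the MLE step only (not the SPA-based life-PDF calculation), the sketch below fits hypothetical stress-life parameters to accelerated-test data by minimizing a negative log-likelihood, assuming lognormal scatter about a Basquin-type curve; all data and parameter names are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical accelerated-test data: stress amplitudes (MPa) and cycles to failure.
stress = np.array([300.0, 300.0, 350.0, 350.0, 400.0, 400.0])
cycles = np.array([2.1e5, 1.6e5, 6.0e4, 7.5e4, 2.2e4, 1.8e4])

def neg_log_likelihood(theta):
    a, b, sigma = theta
    mu = a - b * np.log(stress)          # assumed mean of log(N) at each stress level
    return -np.sum(norm.logpdf(np.log(cycles), loc=mu, scale=sigma))

# MLE of the uncertain parameters by minimizing the negative log-likelihood.
res = minimize(neg_log_likelihood, x0=[30.0, 3.0, 0.3],
               bounds=[(None, None), (0.0, None), (1e-6, None)])
a_hat, b_hat, sigma_hat = res.x
print("MLE estimates:", a_hat, b_hat, sigma_hat)
```
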
Journal Article

Efficient Global Surrogate Modeling Based on Multi-Layer Sampling

2018-04-03
2018-01-0616
Global surrogate modeling aims to build a surrogate model with high accuracy over the whole design domain. A major challenge in achieving this objective is reducing the number of function evaluations of the original computer simulation model. To date, the most widely used approach for global surrogate modeling is adaptive surrogate modeling. It starts with an initial surrogate model, which is then refined adaptively using mean square error (MSE) or maximin-distance criteria. It is observed that current methods may not be able to effectively construct a global surrogate model when the underlying black-box function is highly nonlinear in only certain regions. A new surrogate modeling method that can allocate more training points in regions with high nonlinearity is needed to overcome this challenge. This article proposes an efficient global surrogate modeling method based on a multi-layer sampling scheme.
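
The sketch below illustrates generic adaptive surrogate refinement with an MSE-style criterion (not the paper's multi-layer sampling scheme): a Gaussian-process surrogate is refit in a loop and the candidate with the largest predictive uncertainty is added; the test function and settings are arbitrary.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):                      # stand-in for the black-box simulation
    return np.sin(8.0 * x) + 0.3 * x**2

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=(5, 1))       # small initial design
y = expensive_model(X).ravel()
candidates = np.linspace(0.0, 2.0, 400).reshape(-1, 1)

for _ in range(15):                          # adaptive refinement loop
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
    gp.fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]       # largest predictive std ~ largest estimated MSE
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_model(x_new)[0])
```
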
Journal Article

Reliability and Cost Trade-Off Analysis of a Microgrid

2018-04-03
2018-01-0619
Optimizing the trade-off between reliability and cost of operating a microgrid, including vehicles as both loads and sources, can be a challenge. Optimal energy management is crucial to develop strategies to improve the efficiency and reliability of microgrids, as well as new communication networks to support optimal and reliable operation. Prior approaches modeled the grid using MATLAB, but did not include the detailed physics of loads and sources, and therefore missed the transient effects that are present in real-time operation of a microgrid. This article discusses the implementation of a physics-based detailed microgrid model including a diesel generator, wind turbine, photovoltaic array, and utility. All elements are modeled as sources in Simulink. Various loads are also implemented including an asynchronous motor. We show how a central control algorithm optimizes the microgrid by trying to maximize reliability while reducing operational cost.
Journal Article

Value of Information for Comparing Dependent Repairable Assemblies and Systems

2018-04-03
2018-01-1103
This article presents an approach for comparing alternative repairable systems and for calculating the value of the information obtained by field testing a specified number of such systems in order to estimate the reliability metric associated with each system. Here, the reliability of a repairable system is measured by its failure rate. In support of the decision-making effort, the failure rate is translated into an expected utility based on a utility curve that represents the risk tolerance of the decision-maker. The algorithm calculates how the expected value of the decision changes with the sample size; this change represents the value of the information obtained from testing.
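
For intuition, the sketch below runs an illustrative preposterior (value-of-information) calculation, not the paper's algorithm: a Gamma prior on the failure rate, Poisson failure counts from a hypothetical field test, and an exponential utility standing in for the decision-maker's risk tolerance; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 2.0, 100.0   # Gamma prior on the failure rate (failures per hour)
T = 500.0                  # hypothetical field-test exposure (hours)
u_reject = 0.0             # utility of keeping the status-quo alternative
risk_tol = 5.0             # risk tolerance of the exponential utility

def expected_utility_accept(a, b, n_mc=5000):
    # Risk-averse expected utility of fielding the system; payoff = 5 - 300*lambda.
    lam = rng.gamma(a, 1.0 / b, size=n_mc)
    return np.mean(1.0 - np.exp(-(5.0 - 300.0 * lam) / risk_tol))

# Decision value without testing: choose the better option under the prior alone.
v_prior = max(expected_utility_accept(alpha, beta), u_reject)

# Preposterior analysis: average the best achievable value over possible test outcomes.
lam_true = rng.gamma(alpha, 1.0 / beta, size=1000)
n_failures = rng.poisson(lam_true * T)
v_post = np.mean([max(expected_utility_accept(alpha + n, beta + T), u_reject)
                  for n in n_failures])
print("expected value of the test information:", v_post - v_prior)
```
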
Journal Article

A Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1102
Computer-aided engineering (CAE) is an important tool routinely used to simulate complex engineering systems. Virtual simulations enhance engineering insight into prospective designs and potential design issues and can limit the need for expensive engineering prototypes. For complex engineering systems, however, the effectiveness of virtual simulations is often hindered by excessive computational cost. To minimize the cost of running expensive computer simulations, approximate models of the original model (often called surrogate models or metamodels) can provide sufficient accuracy at a lower computing overhead than repeated runs of a full simulation. Metamodel accuracy improves if the metamodel is constructed using a space-filling design of experiments (DOE), which provides a collection of sample points that preferably cover the entire design space.
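
As a simple illustration of the space-filling idea (a greedy maximin selection, not the group-based algorithm this paper develops), the sketch below grows a design one point at a time by picking the candidate farthest from the points already selected.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_candidates, n_points = 3, 5000, 30
candidates = rng.uniform(size=(n_candidates, dim))   # candidate pool in the unit cube

design = [candidates[0]]                             # seed the design with one candidate
for _ in range(n_points - 1):
    dist = np.linalg.norm(candidates[:, None, :] - np.array(design)[None, :, :], axis=2)
    design.append(candidates[np.argmax(dist.min(axis=1))])  # maximize the minimum distance
design = np.array(design)
print("design shape:", design.shape)
```
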
Technical Paper

Optimal Water Jacket Flow Distribution Using a New Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1017
The availability of computational resources has enabled increased use of Design of Experiments (DoE) and metamodeling (response surface generation) for large-scale optimization problems. Despite algorithmic advances, however, the analysis of systems such as the water jacket of an automotive engine can be computationally demanding, in part because of the required metamodel accuracy. Because the metamodels may have many inputs, their accuracy depends on the number of training points and how well they cover the entire design (input) space. For this reason, the space-filling properties of the DoE are very important. This paper utilizes a new group-based DoE algorithm with space-filling groups of points to construct a metamodel. Points are added sequentially so that the space-filling properties of the entire group of points are preserved, and the addition continues until a specified metamodel accuracy is met.
Technical Paper

Random Vibration Analysis Using Quasi-Random Bootstrapping

2018-04-03
2018-01-1104
Reliability analysis of engineering structures such as bridges, airplanes, and cars requires the calculation of small failure probabilities. These probabilities can be calculated using standard Monte Carlo simulation, but this method is impractical for most real-life systems because of its high computational cost. Many studies have focused on reducing the computational cost of a reliability assessment, including bootstrapping, Separable Monte Carlo, Importance Sampling, and Combined Approximations. The computational cost can also be reduced using an efficient method for deterministic analysis such as mode superposition, mode acceleration, or the combined acceleration method. This paper presents and demonstrates a method that uses a combination of Sobol quasi-random sequences and bootstrapping to reduce the number of function calls. The study demonstrates that using quasi-random numbers in conjunction with bootstrapping dramatically reduces the computational cost.
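
The sketch below shows the general idea on a toy limit state, not the paper's random-vibration implementation: a Sobol sequence estimates a small failure probability, and bootstrap resampling of the failure indicators gives a spread on that estimate.

```python
import numpy as np
from scipy.stats import qmc, norm

n = 2**14
sobol = qmc.Sobol(d=2, scramble=True, seed=0)
u = sobol.random(n)                            # quasi-random points in [0, 1)^2
r = norm.ppf(u[:, 0], loc=6.0, scale=1.0)      # capacity
s = norm.ppf(u[:, 1], loc=2.0, scale=1.0)      # demand
fail = (r - s < 0.0).astype(float)             # failure indicator of the limit state g = R - S
pf_hat = fail.mean()

rng = np.random.default_rng(0)
boot = np.array([rng.choice(fail, size=n, replace=True).mean() for _ in range(500)])
print("Pf estimate:", pf_hat, "bootstrap std:", boot.std())
```
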
Journal Article

Warranty Forecasting of Repairable Systems for Different Production Patterns

2017-03-28
2017-01-0209
Warranty forecasting of repairable systems is very important for manufacturers of mass-produced systems. The goal is to predict the Expected Number of Failures (ENF) after a censoring time using failure data collected before that time. Moreover, systems may be produced with a defective component, resulting in extensive warranty costs even after the defective component is detected and replaced with a new design. In this paper, we present a forecasting method to predict the ENF of a repairable system using observed data to calibrate a Generalized Renewal Process (GRP) model. Manufacturing may also exhibit different production patterns with different failure statistics over time; for example, vehicles produced in different months may have different failure intensities because of supply chain differences or different skills of production workers.
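
As a simplified stand-in for the GRP model (a power-law NHPP, i.e. Crow-AMSAA, with invented failure times), the sketch below fits the intensity by closed-form MLE from data observed before the censoring time and extrapolates the ENF beyond it.

```python
import numpy as np

t_fail = np.array([55.0, 140.0, 310.0, 520.0, 800.0, 1150.0])  # invented failure times (days)
t_censor = 1200.0                                              # censoring (observation) time
t_forecast = 2400.0                                            # forecast horizon

# Closed-form MLE for the power-law intensity u(t) = (beta/eta) * (t/eta)**(beta - 1).
n = len(t_fail)
beta_hat = n / np.sum(np.log(t_censor / t_fail))
eta_hat = t_censor / n ** (1.0 / beta_hat)

enf = lambda t: (t / eta_hat) ** beta_hat     # cumulative expected number of failures E[N(t)]
print("ENF in the forecast window:", enf(t_forecast) - enf(t_censor))
```
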
Journal Article

A Methodology for Fatigue Life Estimation of Linear Vibratory Systems under Non-Gaussian Loads

2017-03-28
2017-01-0197
Fatigue life estimation, reliability and durability are important in acquisition, maintenance and operation of vehicle systems. Fatigue life is random because of the stochastic load, the inherent variability of material properties, and the uncertainty in the definition of the S-N curve. The commonly used fatigue life estimation methods calculate the mean (not the distribution) of fatigue life under Gaussian loads using the potentially restrictive narrow-band assumption. In this paper, a general methodology is presented to calculate the statistics of fatigue life for a linear vibratory system under stationary, non-Gaussian loads considering the effects of skewness and kurtosis. The input loads are first characterized using their first four moments (mean, standard deviation, skewness and kurtosis) and a correlation structure equivalent to a given Power Spectral Density (PSD).
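
The sketch below illustrates only the load-characterization step on hypothetical data, not the fatigue-life statistics themselves: the first four moments and a Welch PSD estimate of a non-Gaussian load history.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from scipy.signal import welch

fs = 1000.0                                              # sampling frequency (Hz)
rng = np.random.default_rng(0)
load = np.exp(0.5 * rng.standard_normal(60000)) - 1.0    # skewed, heavy-tailed stand-in load

moments = (load.mean(), load.std(),
           skew(load), kurtosis(load, fisher=False))     # mean, std, skewness, kurtosis
freq, psd = welch(load, fs=fs, nperseg=4096)             # one-sided PSD estimate
print("first four moments:", moments)
```
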
Journal Article

Time-Dependent Reliability-Based Design Optimization of Vibratory Systems

2017-03-28
2017-01-0194
A methodology for time-dependent reliability-based design optimization of vibratory systems with random parameters under stationary excitation is presented. The time-dependent probability of failure is computed using an integral equation which involves up-crossing and joint up-crossing rates. The total probability theorem addresses the presence of the system random parameters and a sparse grid quadrature method calculates the integral of the total probability theorem efficiently. The sensitivity derivatives of the time-dependent probability of failure with respect to the design variables are computed using finite differences. The Modified Combined Approximations (MCA) reanalysis method is used to reduce the overall computational cost from repeated evaluations of the system frequency response or equivalently impulse response function. The method is applied to the shape optimization of a vehicle frame under stochastic loading.
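
For context, the sketch below evaluates a classical simplification rather than the paper's joint up-crossing integral equation or the MCA reanalysis: Rice's up-crossing rate for a stationary Gaussian response combined with a Poisson crossing assumption; all numbers are invented.

```python
import numpy as np

sigma_x = 5.0      # std of the stationary, zero-mean Gaussian response
sigma_xdot = 40.0  # std of its time derivative
b = 20.0           # failure threshold on the response
T = 100.0          # exposure time (s)

# Rice's formula for the mean rate of up-crossings of level b.
nu_plus = (sigma_xdot / (2.0 * np.pi * sigma_x)) * np.exp(-b**2 / (2.0 * sigma_x**2))

# Poisson (independent-crossings) approximation of the time-dependent failure probability.
pf_T = 1.0 - np.exp(-nu_plus * T)
print("up-crossing rate:", nu_plus, "Pf(0, T):", pf_T)
```
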
Journal Article

Assessing the Value of Information for Multiple, Correlated Design Alternatives

2017-03-28
2017-01-0208
Design optimization occurs through a series of decisions that are a standard part of the product development process. Decisions are made anywhere from concept selection to the design of the assembly and manufacturing processes. The effectiveness of these decisions is based on the information available to the decision maker. Decision analysis provides a structured approach for quantifying the value of information that may be provided to the decision maker. This paper presents a process for determining the value of information that can be gained by evaluating linearly correlated design alternatives. A unique application of Bayesian inference provides simulated estimates of the expected utility with increasing observation sizes. The results provide insight into the optimum observation size that maximizes the expected utility when assessing correlated decision alternatives.
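
The sketch below is an illustrative conjugate Gaussian update, not the paper's procedure: two alternatives have correlated prior utilities, and a noisy observation of one alternative also sharpens the belief about the other, which is why testing correlated alternatives carries extra value; all numbers are invented.

```python
import numpy as np

mu0 = np.array([1.0, 1.2])                       # prior mean utility of alternatives A and B
P0 = np.array([[0.25, 0.20], [0.20, 0.25]])      # correlated prior covariance
H = np.array([[1.0, 0.0]])                       # only alternative A is observed
R = np.array([[0.05]])                           # observation noise variance
y = np.array([1.6])                              # hypothetical observed utility of A

K = P0 @ H.T @ np.linalg.inv(H @ P0 @ H.T + R)   # gain of the conjugate Gaussian update
mu1 = mu0 + (K @ (y - H @ mu0)).ravel()
P1 = P0 - K @ H @ P0
print("posterior means:", mu1)                   # B's mean also moves, via the correlation
print("posterior covariance:\n", P1)
```
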
Journal Article

Time-Dependent Reliability Analysis Using a Modified Composite Limit State Approach

2017-03-28
2017-01-0206
Recent developments in time-dependent reliability have introduced the concept of a composite limit state. The composite limit state method can be used to calculate the time-dependent probability of failure for dynamic systems with limit-state functions of input random variables and input random processes that are also explicit functions of time. The probability of failure can be calculated exactly using the composite limit state if the instantaneous limit states are linear, form an open or closed polytope, and are functions of only two random variables. In this work, the restriction on the number of random variables is lifted. The proposed algorithm is accurate and efficient for linear instantaneous limit-state functions of any number of random variables. An example on the design of a hydrokinetic turbine blade under time-dependent river flow load demonstrates the accuracy of the proposed general composite limit state approach.
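
As a brute-force reference (not the analytical polytope-based calculation described here), the sketch below estimates a time-dependent probability of failure by Monte Carlo, declaring failure whenever a hypothetical instantaneous limit state drops below zero anywhere on the time grid.

```python
import numpy as np

rng = np.random.default_rng(0)
n_mc, T = 20000, 10.0
t = np.linspace(0.0, T, 200)

# Random variables of a hypothetical instantaneous limit state, linear in X at each time.
x1 = rng.normal(10.0, 1.0, size=(n_mc, 1))
x2 = rng.normal(2.0, 0.5, size=(n_mc, 1))

g = x1 - x2 * (1.0 + 0.4 * np.sin(0.8 * t)) - 0.5 * t   # g(X, t) over the time grid
pf = np.mean(np.any(g <= 0.0, axis=1))                  # failure anywhere in [0, T]
print("time-dependent probability of failure:", pf)
```
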
Journal Article

Uncertainty Assessment in Restraint System Optimization for Occupants of Tactical Vehicles

2016-04-05
2016-01-0316
We have recently obtained experimental data and used them to develop computational models to quantify occupant impact responses and injury risks for military vehicles during frontal crashes. The number of experimental tests and model runs is, however, relatively small because of their high cost. While this is true across the auto industry, it is particularly critical for the Army and other government agencies operating under tight budget constraints. In this study, we investigate through statistical simulations how the injury risk would vary if a large number of experimental tests were conducted. We show that the injury risk distribution is skewed to the right, implying that, although most physical tests result in a small injury risk, there are occasional tests for which the injury risk is extremely large. We compute the probabilities of such events and use them to identify optimum design conditions that minimize those probabilities.
Journal Article

Reanalysis of Linear Dynamic Systems using Modified Combined Approximations with Frequency Shifts

2016-04-05
2016-01-1338
Weight reduction is very important in automotive design because of stringent fuel economy requirements. Structural optimization of dynamic systems using finite element (FE) analysis plays an important role in reducing weight while delivering a product that meets all functional requirements for durability, crash, and NVH. With advancing computer technology, the demand for solving large FE models has grown. Optimization is, however, costly because of repeated full-order analyses. Reanalysis methods can be used in structural vibrations to reduce the cost of repeated eigenvalue analyses for both deterministic and probabilistic problems. Several reanalysis techniques have been introduced over the years, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Epsilon algorithm, among others.
Journal Article

Computational Efficiency Improvements in Topography Optimization Using Reanalysis

2016-04-05
2016-01-1395
To improve fuel economy, there is a trend in the automotive industry to use lightweight, high-strength materials. Automotive body structures are composed of several panels which must be downsized to reduce weight. Because this affects NVH (Noise, Vibration and Harshness) performance, engineers are challenged to recover the panel stiffness lost to down-gaging in order to reduce the structure-borne noise transmitted through the lightweight panels in the 100-300 Hz range, where most of the booming and low-to-mid frequency noise occurs. The lost performance can be recovered by optimizing the panel geometry with beading or by adding damping treatment. Topography optimization is a special class of shape optimization for changing sheet metal shapes by introducing beads. A large number of design variables can be handled and the process is easy to set up in commercial codes. However, optimization methods are computationally intensive because of repeated full-order analyses.
Technical Paper

Inverse Modeling: Theory and Engineering Examples

2016-04-05
2016-01-0267
Over the last two decades, inverse problems have become increasingly popular because of their widespread applications. This popularity continually demands alternative solution methods that are both accurate and computationally efficient. This paper presents a method for solving inverse problems through Artificial Neural Network (ANN) theory, and also presents a method to apply the Grey Wolf Optimizer (GWO), a recent optimization algorithm, to inverse problems. Both methods are then compared to traditional methods such as Particle Swarm Optimization (PSO) and Markov Chain Monte Carlo (MCMC). Four typical engineering design problems are used to compare the four methods. The results show that the GWO outperforms the other methods in both efficiency and accuracy.
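
The sketch below illustrates only the ANN route on a made-up forward model (the GWO, PSO, and MCMC comparisons are not reproduced): a network is trained on forward-model samples with inputs and outputs swapped, so it maps observed responses back to the parameters that produced them.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def forward_model(x):                               # made-up forward model y = f(x)
    return np.column_stack([x[:, 0] ** 2 + x[:, 1], np.sin(x[:, 0]) * x[:, 1]])

rng = np.random.default_rng(0)
x_train = rng.uniform(-2.0, 2.0, size=(5000, 2))
y_train = forward_model(x_train)

inverse_net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
inverse_net.fit(y_train, x_train)                   # note the swap: responses -> parameters

x_true = np.array([[1.2, -0.7]])
x_recovered = inverse_net.predict(forward_model(x_true))
print("true:", x_true, "recovered:", x_recovered)
```
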
Journal Article

An Improved Reanalysis Method Using Parametric Reduced Order Modeling for Linear Dynamic Systems

2016-04-05
2016-01-1318
Finite element analysis is a standard tool for deterministic or probabilistic design optimization of dynamic systems. The optimization process requires repeated eigenvalue analyses, which can be computationally expensive. Several reanalysis techniques have been proposed to reduce the computational cost, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Modified Combined Approximations (MCA) method. Although the cost of reanalysis is substantially reduced, it can still be high for models with a large number of degrees of freedom and a large number of design variables. Reanalysis methods use a basis composed of eigenvectors from both the baseline and the modified designs, which are in general linearly dependent. To eliminate the linear dependency and improve accuracy, Gram-Schmidt orthonormalization is employed, which is itself costly.
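
The sketch below illustrates the basis-combination step on a small random test matrix, not the improved PROM formulation of the paper: baseline and modified-design eigenvectors are stacked into one nearly dependent basis, orthonormalized by QR (equivalent in effect to Gram-Schmidt), and used for a reduced eigenproblem.

```python
import numpy as np
from scipy.linalg import eigh, qr

rng = np.random.default_rng(0)
n, m = 150, 8
A = rng.standard_normal((n, n))
K0 = A @ A.T + n * np.eye(n)                 # baseline stiffness (SPD)
K1 = K0 + 0.05 * (A + A.T)                   # modified design
M = np.eye(n)                                # mass matrix

_, phi0 = eigh(K0, M)                        # baseline modes
_, phi1 = eigh(K1, M)                        # modified-design modes (solved exactly here
                                             # only to keep the sketch short)
T_basis = np.hstack([phi0[:, :m], phi1[:, :m]])  # combined, nearly dependent basis
Q, _ = qr(T_basis, mode='economic')              # orthonormalize (Gram-Schmidt equivalent)

lam_red, _ = eigh(Q.T @ K1 @ Q, Q.T @ M @ Q)     # reduced eigenproblem
print("lowest reduced-basis eigenvalues:", lam_red[:3])
```
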
Journal Article

Bootstrapping and Separable Monte Carlo Simulation Methods Tailored for Efficient Assessment of Probability of Failure of Structural Systems

2015-04-14
2015-01-0420
There is randomness in both the applied loads and the strength of systems. Therefore, to account for this uncertainty, the safety of a system must be quantified using its reliability. Monte Carlo Simulation (MCS) is widely used for probabilistic analysis because of its robustness, but its high computational cost limits the achievable accuracy. Smarslok et al. [2010] developed an improved sampling technique for reliability assessment called Separable Monte Carlo (SMC) that can significantly increase the accuracy of estimation without increasing the cost of sampling; however, this method was applied to time-invariant problems involving only two random variables. This paper extends SMC to problems with multiple random variables and develops a novel method for estimating the standard deviation of the probability of failure of a structure. The method is demonstrated and validated on reliability assessment of an offshore wind turbine under turbulent wind loads.
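
As a minimal illustration of the separable idea (not the paper's multi-variable extension or its standard-deviation estimator), the sketch below compares every load sample against every capacity sample instead of pairing them, reusing both sets far more efficiently than crude MCS; the distributions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
load = rng.gumbel(loc=40.0, scale=4.0, size=2000)                    # load samples
capacity = rng.lognormal(mean=np.log(70.0), sigma=0.08, size=2000)   # capacity samples

# Crude MCS pairs sample i with sample i; separable MCS reuses all N*M pairs.
pf_crude = np.mean(load > capacity)
pf_separable = np.mean(load[:, None] > capacity[None, :])
print("crude MCS:", pf_crude, "separable MCS:", pf_separable)
```
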
Technical Paper

Combined Approximation for Efficient Reliability Analysis of Linear Dynamic Systems

2015-04-14
2015-01-0424
The Combined Approximation (CA) method is an efficient reanalysis method that aims at reducing the cost of optimization problems. The CA uses the results of a single exact analysis, and it is suitable for different types of structures and design variables. The second author previously utilized CA to calculate the frequency response function of a system at a frequency of interest by using results at a nearby frequency, and showed that CA yields accurate results for small frequency perturbations. This work demonstrates a methodology that utilizes CA to reduce the cost of Monte Carlo simulation (MCS) of linear systems under random dynamic loads. The main idea is to divide the power spectral density (PSD) function of the input load into several frequency bins before calculating the load realizations.
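
The sketch below shows only the load-generation step under a hypothetical PSD, not the CA-based reanalysis: a stationary realization is synthesized by summing one cosine per frequency bin with random phases, which reproduces the target variance.

```python
import numpy as np

rng = np.random.default_rng(0)
freq = np.linspace(0.5, 50.0, 100)                 # frequency bins (Hz)
df = freq[1] - freq[0]
psd = 4.0 / (1.0 + (freq / 10.0) ** 4)             # hypothetical one-sided PSD

t = np.arange(0.0, 20.0, 0.005)                    # time grid (s)
amp = np.sqrt(2.0 * psd * df)                      # amplitude of each bin component
phase = rng.uniform(0.0, 2.0 * np.pi, size=freq.size)

# Spectral representation: one cosine per frequency bin, summed over bins.
load = (amp[:, None] * np.cos(2.0 * np.pi * freq[:, None] * t[None, :]
                              + phase[:, None])).sum(axis=0)
print("realization variance:", load.var(), "target variance:", (psd * df).sum())
```
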
Technical Paper

Multi-Level Decoupled Optimization of Wind Turbine Structures

2015-04-14
2015-01-0434
This paper proposes a multi-level decoupled method for optimizing the structural design of a wind turbine blade. The proposed method reduces the design space by employing a two-level optimization process. At the high level, the structural properties of each section are approximated by an exponential function of the distance of that section from the blade root, and the design variables are the coefficients of this approximating function; target values for the structural properties of the blade are determined at this level. At the low level, sections are divided into small decoupled groups, and for each section the low-level optimizer finds the minimum-mass laminate layer thicknesses whose structural properties meet the targets set by the high-level optimizer. In the proposed method, each low-level optimizer considers only a small number of design variables for a particular section, while traditional, single-level methods consider all design variables simultaneously.
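
As an illustration of the high-level parameterization only (with invented numbers), the sketch below fits an exponential function of the distance from the blade root to sectional stiffness values, so the high-level design variables reduce to the two coefficients.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
span = np.linspace(0.0, 1.0, 20)                    # normalized distance from the blade root
EI_sections = 5.0e7 * np.exp(-3.2 * span) * (1.0 + 0.05 * rng.standard_normal(span.size))

decay = lambda r, c0, c1: c0 * np.exp(-c1 * r)      # exponential approximating function
(c0, c1), _ = curve_fit(decay, span, EI_sections, p0=[1.0e7, 1.0])
targets = decay(span, c0, c1)                       # sectional targets passed to the low level
print("high-level coefficients:", c0, c1)
```
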