Search Results

Journal Article

Efficient Global Surrogate Modeling Based on Multi-Layer Sampling

2018-04-03
2018-01-0616
Global surrogate modeling aims to build a surrogate model with high accuracy over the whole design domain. A major challenge in achieving this objective is reducing the number of function evaluations of the original computer simulation model. To date, the most widely used approach for global surrogate modeling is adaptive surrogate modeling. It starts with an initial surrogate model, which is then refined adaptively using criteria such as the mean square error (MSE) or the maximin distance criterion. Current methods, however, may not be able to construct a global surrogate model effectively when the underlying black-box function is highly nonlinear in only certain regions. A new surrogate modeling method that allocates more training points in regions of high nonlinearity is needed to overcome this challenge. This article proposes an efficient global surrogate modeling method based on a multi-layer sampling scheme.
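As a hedged illustration of the adaptive-sampling idea described above (a generic maximin loop, not the paper's multi-layer scheme), the sketch below repeatedly adds the candidate point farthest from all existing training points; the black-box function `f`, domain, and budgets are placeholders.

```python
# Minimal sketch of adaptive sampling with the maximin distance criterion;
# the black-box function f, domain, and budgets are hypothetical.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.spatial.distance import cdist

def f(x):                                      # stand-in for an expensive simulation
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (10, 2))             # initial design
y = f(X)

for _ in range(20):                            # refinement iterations
    candidates = rng.uniform(0.0, 1.0, (500, 2))
    d = cdist(candidates, X).min(axis=1)       # distance to nearest training point
    x_new = candidates[[np.argmax(d)]]         # maximin candidate
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new))

surrogate = RBFInterpolator(X, y)              # final global surrogate
```

An MSE-based variant would instead rank candidates by the surrogate's predicted error, which is where methods targeting locally nonlinear regions differ from pure space-filling.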
Journal Article

Reliability and Cost Trade-Off Analysis of a Microgrid

2018-04-03
2018-01-0619
Optimizing the trade-off between the reliability and the operating cost of a microgrid that includes vehicles as both loads and sources can be a challenge. Optimal energy management is crucial for developing strategies that improve the efficiency and reliability of microgrids, as well as new communication networks to support optimal and reliable operation. Prior approaches modeled the grid in MATLAB but did not include the detailed physics of loads and sources, and therefore missed the transient effects present in real-time operation of a microgrid. This article discusses the implementation of a detailed, physics-based microgrid model including a diesel generator, wind turbine, photovoltaic array, and utility. All elements are modeled as sources in Simulink. Various loads are also implemented, including an asynchronous motor. We show how a central control algorithm optimizes the microgrid by maximizing reliability while reducing operational cost.
Journal Article

A Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1102
Computer-aided engineering (CAE) is an important tool routinely used to simulate complex engineering systems. Virtual simulations enhance engineering insight into prospective designs and potential design issues, and can limit the need for expensive engineering prototypes. For complex engineering systems, however, the effectiveness of virtual simulations is often hindered by excessive computational cost. To minimize the cost of running expensive computer simulations, approximate models of the original model (often called surrogate models or metamodels) can provide sufficient accuracy at a lower computing overhead than repeated runs of a full simulation. Metamodel accuracy improves when the metamodel is constructed using space-filling designs of experiments (DOEs), which provide a collection of sample points that preferably cover the entire design space.
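For readers unfamiliar with space-filling DOEs, the following minimal sketch generates one with SciPy's quasi-Monte Carlo module; it uses a generic optimized Latin hypercube rather than the group-based algorithm proposed in the paper, and the variable bounds are hypothetical.

```python
# Generic optimized Latin hypercube via SciPy's QMC module; bounds are
# hypothetical, and this is not the group-based algorithm of the paper.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=4, optimization="random-cd", seed=1)
unit_doe = sampler.random(n=50)                         # 50 points in [0, 1]^4
doe = qmc.scale(unit_doe,                               # map to physical bounds
                l_bounds=[0.5, 10.0, 1.0, 0.0],
                u_bounds=[2.0, 90.0, 5.0, 1.0])
```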
Journal Article

Long Life Axial Fatigue Strength Models for Ferrous Powder Metals

2018-04-03
2018-01-1395
Two models are presented for the long-life (10⁷ cycles) axial fatigue strength of four ferrous powder metal (PM) material series: sintered and heat-treated iron-carbon steel, iron-copper and copper steel, iron-nickel and nickel steel, and pre-alloyed steel. The materials are defined over ranges of carbon content and density using the broad data available in the Metal Powder Industries Federation (MPIF) Standard 35 for PM structural parts. The first model evaluates the 10⁷-cycle axial fatigue strength as a function of ultimate strength, and the second as a function of hardness. For all 118 studied materials, both models show good agreement between calculated and reference 10⁷-cycle axial fatigue strength, with a high Pearson correlation coefficient of 0.97. The article provides details on the model development and the reasoning for selecting ultimate strength and hardness as the best predictors of 10⁷-cycle axial fatigue strength.
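A toy version of the paper's model-quality check might look as follows: fit a linear model of fatigue strength versus ultimate strength and report the Pearson correlation. The model form and all numbers are invented for illustration; they are not the paper's data.

```python
# Toy model-quality check: linear fit of fatigue strength vs. ultimate
# strength and the Pearson correlation; all numbers are invented.
import numpy as np
from scipy import stats

uts = np.array([380.0, 450.0, 520.0, 610.0, 700.0])       # ultimate strength, MPa
fatigue = np.array([120.0, 141.0, 158.0, 186.0, 215.0])   # 10^7-cycle strength, MPa

fit = stats.linregress(uts, fatigue)
predicted = fit.slope * uts + fit.intercept
print(f"Pearson r: {fit.rvalue:.3f}")                     # correlation of the fit
```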
Journal Article

Reanalysis of Linear Dynamic Systems using Modified Combined Approximations with Frequency Shifts

2016-04-05
2016-01-1338
Weight reduction is very important in automotive design because of stringent fuel economy requirements. Structural optimization of dynamic systems using finite element (FE) analysis plays an important role in reducing weight while simultaneously delivering a product that meets all functional requirements for durability, crash, and NVH. With advancing computer technology, the demand for solving large FE models has grown. Optimization is costly, however, because of repeated full-order analyses. Reanalysis methods can be used in structural vibrations to reduce the cost of repeated eigenvalue analyses for both deterministic and probabilistic problems. Several reanalysis techniques have been introduced over the years, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Epsilon algorithm, among others.
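The essence of reduced-basis reanalysis can be sketched as follows: rather than re-solving the full eigenproblem for a modified design, the stiffness and mass matrices are projected onto baseline mode shapes and a much smaller eigenproblem is solved. This is a generic projection sketch, not PROM, CA, or the Epsilon algorithm specifically, and the random SPD matrices stand in for real FE models.

```python
# Generic reduced-basis eigenvalue reanalysis: project modified matrices
# onto baseline modes and solve a small eigenproblem. Random SPD matrices
# stand in for real FE stiffness and mass.
import numpy as np
from scipy.linalg import eigh

n, m = 200, 10
rng = np.random.default_rng(2)
A = rng.standard_normal((n, n))
K0 = A @ A.T + n * np.eye(n)                       # baseline stiffness (SPD stand-in)
M = np.eye(n)                                      # mass matrix

_, Phi = eigh(K0, M, subset_by_index=[0, m - 1])   # baseline mode shapes

K1 = K0 + np.diag(rng.uniform(0.0, 0.05 * n, n))   # perturbed (modified) design
Kr, Mr = Phi.T @ K1 @ Phi, Phi.T @ M @ Phi         # reduced m x m matrices
evals_approx = eigh(Kr, Mr, eigvals_only=True)     # cheap approximate reanalysis
```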
Journal Article

Computational Efficiency Improvements in Topography Optimization Using Reanalysis

2016-04-05
2016-01-1395
To improve fuel economy, there is a trend in the automotive industry toward lightweight, high-strength materials. Automotive body structures are composed of several panels which must be downsized to reduce weight. Because this affects NVH (noise, vibration, and harshness) performance, engineers are challenged to recover the panel stiffness lost from down-gaging in order to reduce the structure-borne noise transmitted through the lightweight panels in the 100-300 Hz frequency range, where most of the booming and low-to-mid frequency noise occurs. The lost performance can be recovered by optimizing the panel geometry using beading or damping treatment. Topography optimization is a special class of shape optimization for changing sheet-metal shapes by introducing beads. A large number of design variables can be handled, and the process is easy to set up in commercial codes. However, optimization methods are computationally intensive because of repeated full-order analyses.
Journal Article

New Metrics to Assess Reliability and Functionality of Repairable Systems

2013-04-08
2013-01-0606
The classical definition of reliability may not be readily applicable to repairable systems. Commonly used concepts such as Mean Time Between Failures (MTBF) and availability can be misleading because they report only limited information about system functionality. In this paper, we discuss a set of metrics that can help with the design of repairable systems. Based on a set of desirable properties for these metrics, we select a minimal set of metrics (MSOM) that provides the most information about a system with the smallest number of metrics. The metric of Minimum Failure Free Period (MFFP) with a given probability generalizes MTBF because the latter is simply the MFFP with a 0.5 probability. It also generalizes availability because, coupled with repair times, it provides a clearer picture of the length of the expected uninterrupted service. Two forms of MFFP are used: transient and steady state.
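Numerically, the MFFP at a given probability is a quantile of the time-to-failure distribution: the longest period the system survives with at least that probability. A minimal sketch, assuming a hypothetical Weibull time-to-failure model:

```python
# MFFP as a survival quantile; the Weibull time-to-failure model is assumed
# purely for illustration.
from scipy.stats import weibull_min

ttf = weibull_min(c=1.8, scale=1000.0)     # hypothetical time to failure, hours

def mffp(p):
    """Longest period survived with probability at least p."""
    return ttf.isf(p)                       # inverse survival function

print(mffp(0.9), mffp(0.5))                 # MFFP at 90% and at 50% probability
```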
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation when a suitable sampling distribution is used, it is still too expensive for many design problems. The authors have previously proposed a method to manage the computational cost of standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends that method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of choosing the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
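A generic importance-sampling estimator of the probability of failure, shown only to fix ideas (the paper's value-of-information layer is not reproduced), might look like this; the limit state and both densities are placeholders.

```python
# Importance-sampling estimate of a probability of failure; the limit state
# and both densities are placeholders for illustration.
import numpy as np
from scipy import stats

def g(x):                                   # fail when g(x) <= 0
    return 4.0 - x

nominal = stats.norm(0.0, 1.0)              # true input distribution
proposal = stats.norm(4.0, 1.0)             # sampling density near the failure region

x = proposal.rvs(size=100_000, random_state=3)
weights = nominal.pdf(x) / proposal.pdf(x)  # likelihood ratios
pf = np.mean((g(x) <= 0) * weights)         # unbiased IS estimator
print(pf, nominal.sf(4.0))                  # compare against the exact value
```

Centering the proposal density near the failure region is what drives the variance reduction over standard Monte Carlo.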
Journal Article

A Methodology for Design Decisions using Block Diagrams

2013-04-08
2013-01-0947
Our recent work has shown that the representation of systems by reliability block diagrams can be used as a decision-making tool. We call these block diagrams decision topologies. In this paper, we generalize those results and show that decision topologies can be used to make many engineering decisions and can, in fact, replace decision analysis for most decisions. We also provide a meta-proof that the proposed method is entirely consistent with decision analysis in the limit. The main advantages of the method are that (1) it provides a visual representation of a decision situation, (2) it can easily model trade-offs, (3) it can incorporate binary attributes, (4) it can model preferences with limited information, and (5) it can be used in a low-fidelity sense to quickly make a decision.
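A minimal sketch of evaluating such block-diagram topologies: series blocks multiply their probabilities and parallel blocks combine complements, so alternatives can be scored and compared. The structure and numbers below are invented for illustration and do not reproduce the paper's decision-topology semantics.

```python
# Toy scoring of block-diagram topologies: series blocks multiply, parallel
# blocks combine complements; structure and numbers are invented.
def series(*p):
    out = 1.0
    for v in p:
        out *= v
    return out

def parallel(*p):
    miss = 1.0
    for v in p:
        miss *= 1.0 - v
    return 1.0 - miss

alt_a = series(0.95, parallel(0.80, 0.70), 0.99)   # hypothetical alternative A
alt_b = series(0.90, parallel(0.95, 0.60), 0.97)   # hypothetical alternative B
print(alt_a, alt_b)                                # choose the higher score
```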
Technical Paper

Reliability and Resiliency Definitions for Smart Microgrids Based on Utility Theory

2017-03-28
2017-01-0205
Reliability and resiliency (R&R) definitions differ depending on the system under consideration; generally, each engineering sector defines R&R metrics pertinent to its own systems. While this can impede cross-disciplinary engineering projects and research, it is a necessary strategy to capture all relevant system characteristics. This paper highlights the difficulties of defining the performance of such systems, using smart microgrids as an example, and develops metrics and definitions, based on utility theory, that are useful in assessing their performance. A microgrid must not only anticipate load conditions but also tolerate partial failures and continue to operate optimally. Many of these failures happen infrequently but unexpectedly and are therefore hard to plan for. We discuss real-life failure scenarios and show how the proposed definitions and metrics are beneficial.
Technical Paper

Optimal Water Jacket Flow Distribution Using a New Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1017
The availability of computational resources has enabled increased use of Design of Experiments (DoE) and metamodeling (response surface generation) for large-scale optimization problems. Despite algorithmic advances, however, the analysis of systems such as the water jacket of an automotive engine can be computationally demanding, in part because of the required metamodel accuracy. Because the metamodels may have many inputs, their accuracy depends on the number of training points and how well those points cover the entire design (input) space. For this reason, the space-filling properties of the DoE are very important. This paper uses a new group-based DoE algorithm with space-filling groups of points to construct a metamodel. Points are added sequentially so that the space-filling properties of the entire set of points are preserved, and the addition continues until a specified metamodel accuracy is met.
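A minimal sketch of this accuracy-driven sequential loop, with random Latin hypercube groups standing in for the paper's space-filling groups and an invented test function, tolerance, and budget:

```python
# Sequential, accuracy-driven growth of a design; random LHS groups stand in
# for the paper's space-filling groups, and the response is a toy function.
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator

def simulate(X):                                  # stand-in for a CFD run
    return np.sin(5 * X[:, 0]) * np.cos(3 * X[:, 1])

sampler = qmc.LatinHypercube(d=2, seed=4)
X, X_val = sampler.random(16), sampler.random(200)
y, y_val = simulate(X), simulate(X_val)

while True:
    rmse = np.sqrt(np.mean((RBFInterpolator(X, y)(X_val) - y_val) ** 2))
    if rmse < 0.02 or len(X) > 400:               # accuracy target or budget
        break
    X_new = sampler.random(8)                     # next group of points
    X, y = np.vstack([X, X_new]), np.append(y, simulate(X_new))
```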
Technical Paper

A Cost-Driven Method for Design Optimization Using Validated Local Domains

2013-04-08
2013-01-1385
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validating computer models over the entire design space can be costly, we have previously proposed an approach in which design optimization and model validation are performed concurrently, using a sequential approach with variable-size local domains. We used test data and statistical bootstrap methods to size each local domain, where the prediction model is considered validated and where design optimization is performed. The method proceeds iteratively until the optimum design is obtained. This method, however, requires test data to be available in each local domain along the optimization path. In this paper, we refine our methodology by using polynomial regression to predict the size and shape of a local domain at some steps along the optimization process without using test data.
Technical Paper

A Methodology of Design for Fatigue Using an Accelerated Life Testing Approach with Saddlepoint Approximation

2019-04-02
2019-01-0159
We present an Accelerated Life Testing (ALT) methodology along with a design-for-fatigue approach using Gaussian or non-Gaussian excitations. The accuracy of fatigue life prediction at nominal loading conditions is affected by model and material uncertainty. This uncertainty is reduced by performing tests at a higher loading level, which also reduces test duration. Based on the experimental data, we formulate an optimization problem to calculate the Maximum Likelihood Estimator (MLE) values of the uncertain model parameters. In the proposed ALT method, we lift all assumptions on the type of life distribution or the stress-life relationship, and we use the Saddlepoint Approximation (SPA) method to calculate the fatigue life Probability Density Functions (PDFs).
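The MLE step can be sketched as minimizing a negative log-likelihood over the uncertain parameters. The lognormal life model and data below are assumptions for illustration only; the paper's formulation is distribution-free.

```python
# MLE of uncertain parameters by minimizing the negative log-likelihood of
# failure times observed at an elevated load; the lognormal life model and
# data are assumptions for illustration.
import numpy as np
from scipy import stats, optimize

lives = np.array([1.2e5, 2.1e5, 1.7e5, 3.0e5, 2.4e5])   # cycles to failure (made up)

def neg_log_like(theta):
    mu, sigma = theta
    if sigma <= 0.0:
        return np.inf
    return -np.sum(stats.norm.logpdf(np.log(lives), mu, sigma))

mle = optimize.minimize(neg_log_like, x0=[12.0, 0.5], method="Nelder-Mead")
print(mle.x)                                # MLE of (mu, sigma) of log-life
```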
Journal Article

Prediction of Fuel Maps in Variable Valve Timing Spark Ignited Gasoline Engines Using Kriging Metamodels

2020-04-14
2020-01-0744
Creating a fuel map for simulation of an engine with Variable Valve Actuation (VVA) can be computationally demanding. Design of Experiments (DOE) and metamodeling offer one way to address this issue. In this paper, we introduce a sequential process to generate an engine fuel map using Kriging metamodels, which account for engine characteristics such as load and fuel consumption at different operating conditions. The generated map predicts engine output parameters such as fuel rate and load. We first create metamodels to accurately predict the Brake Mean Effective Pressure (BMEP), fuel rate, Residual Gas Fraction (RGF), and CA50 (crank angle for 50% heat release after top dead center). The last two quantities are used to ensure acceptable combustion. The metamodels are created sequentially to ensure that acceptable accuracy is achieved with a small number of simulations.
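A small Kriging sketch using scikit-learn's Gaussian process regressor, standing in for the sequential map-building process; the inputs and toy BMEP response are hypothetical engine quantities.

```python
# Kriging (Gaussian process) metamodel of a toy BMEP response over engine
# speed and cam phasing; all quantities are hypothetical.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(5)
X = rng.uniform([500.0, -20.0], [5000.0, 40.0], size=(60, 2))   # rpm, cam phase
bmep = 8.0 + 2e-3 * X[:, 0] - 5e-3 * (X[:, 1] - 10.0) ** 2      # toy BMEP, bar

kriging = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[1e3, 10.0]),
    normalize_y=True).fit(X, bmep)
mean, std = kriging.predict([[2500.0, 5.0]], return_std=True)   # prediction + uncertainty
```

The predictive standard deviation is what makes Kriging natural for sequential map building: new simulations can be placed where the model is least certain.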
Journal Article

Prediction of Spark Timing to Achieve a Specified Torque Profile in Spark-Ignition Engines Using Time-Dependent Metamodeling

2021-04-06
2021-01-0238
The internal combustion engine is a source of unwanted vibration on the vehicle body. The unwanted vibration comes from forces on the engine mounts, which depend on the engine torque during a transient maneuver. In particular, during a tip-in or a tip-out maneuver, different torque profiles result in different magnitudes of vibration. A desired engine torque shape can thus be determined to minimize the unwanted vibration, and that shape can be achieved by controlling a set of engine calibration parameters. This paper provides a methodology to determine the spark timing profile that achieves a desired engine torque profile during a tip-out maneuver. The spark timing profiles are described by a third-order polynomial in time. A set of polynomial coefficients (design sites) is first generated using design of experiments (DOE).
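The parameterization can be illustrated directly: each design site is a set of four cubic coefficients, sampled here with a Latin hypercube over invented bounds and an assumed time window.

```python
# Cubic spark-timing profiles sampled from a DOE over their coefficients;
# the time window and coefficient bounds are invented.
import numpy as np
from scipy.stats import qmc

t = np.linspace(0.0, 1.5, 100)                       # s, assumed tip-out window
sampler = qmc.LatinHypercube(d=4, seed=6)
coeffs = qmc.scale(sampler.random(30),               # 30 design sites
                   l_bounds=[-5.0, -10.0, -10.0, -5.0],
                   u_bounds=[5.0, 10.0, 10.0, 5.0])

profiles = [np.polyval(c, t) for c in coeffs]        # spark advance vs. time
```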
Journal Article

A Methodology for Fatigue Life Estimation of Linear Vibratory Systems under Non-Gaussian Loads

2017-03-28
2017-01-0197
Fatigue life estimation, reliability, and durability are important in the acquisition, maintenance, and operation of vehicle systems. Fatigue life is random because of the stochastic load, the inherent variability of material properties, and the uncertainty in the definition of the S-N curve. Commonly used fatigue life estimation methods calculate only the mean (not the distribution) of fatigue life under Gaussian loads, using the potentially restrictive narrow-band assumption. In this paper, a general methodology is presented to calculate the statistics of fatigue life for a linear vibratory system under stationary, non-Gaussian loads, considering the effects of skewness and kurtosis. The input loads are first characterized by their first four moments (mean, standard deviation, skewness, and kurtosis) and a correlation structure equivalent to a given Power Spectral Density (PSD).
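Computing those four moments from a load record is straightforward; in this sketch the gamma-distributed signal is synthetic, standing in for a measured non-Gaussian load.

```python
# First four moments of a load record; the gamma-distributed signal is a
# synthetic stand-in for a measured non-Gaussian load.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
load = rng.gamma(shape=2.0, scale=1.0, size=100_000)   # synthetic load history

moments = (np.mean(load), np.std(load),
           stats.skew(load),
           stats.kurtosis(load, fisher=False))          # Pearson kurtosis (3 = Gaussian)
print(moments)
```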
Journal Article

Time-Dependent Reliability-Based Design Optimization of Vibratory Systems

2017-03-28
2017-01-0194
A methodology for time-dependent reliability-based design optimization of vibratory systems with random parameters under stationary excitation is presented. The time-dependent probability of failure is computed using an integral equation that involves up-crossing and joint up-crossing rates. The total probability theorem addresses the presence of the system's random parameters, and a sparse-grid quadrature method efficiently calculates the integral of the total probability theorem. The sensitivity derivatives of the time-dependent probability of failure with respect to the design variables are computed using finite differences. The Modified Combined Approximations (MCA) reanalysis method reduces the overall computational cost of repeated evaluations of the system's frequency response function or, equivalently, its impulse response function. The method is applied to the shape optimization of a vehicle frame under stochastic loading.
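For grounding, one standard ingredient of such computations is Rice's mean up-crossing rate for a zero-mean stationary Gaussian response, evaluated from spectral moments of its PSD. The PSD below is a toy stand-in; the paper's integral equation and joint rates are not reproduced.

```python
# Rice's mean up-crossing rate of level b for a zero-mean stationary
# Gaussian response, from the spectral moments of a toy one-sided PSD.
import numpy as np
from scipy.integrate import trapezoid

omega = np.linspace(0.1, 50.0, 2000)                          # rad/s
psd = 1.0 / ((omega**2 - 20.0**2) ** 2 + (2.0 * omega) ** 2)  # toy response PSD

m0 = trapezoid(psd, omega)               # response variance
m2 = trapezoid(omega**2 * psd, omega)    # velocity variance

def upcrossing_rate(b):
    """Mean rate of up-crossings of level b, per second."""
    return np.sqrt(m2 / m0) / (2.0 * np.pi) * np.exp(-b**2 / (2.0 * m0))
```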
Journal Article

Mean-Value Second-Order Saddlepoint Approximation for Reliability Analysis

2017-03-28
2017-01-0207
A new second-order Saddlepoint Approximation (SA) method for structural reliability analysis is introduced. The Mean-value Second-order Saddlepoint Approximation (MVSOSA) is presented as an extension of the Mean-value First-order Saddlepoint Approximation (MVFOSA). The proposed method is based on a second-order Taylor expansion of the limit state function around the mean value of the input random variables and requires both the first-order and second-order sensitivity derivatives of the limit state function. If sensitivity analysis must be avoided because of computational cost, a quadrature integration approach based on sparse grids is also presented and linked to the saddlepoint approximation (Sparse Grid Saddlepoint Approximation, SGSA). The SGSA method is compared with the first- and second-order SA methods in terms of accuracy and efficiency. The proposed MVSOSA and SGSA methods are used in the reliability analysis of two examples.
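The expansion underlying MVSOSA can be sketched with finite differences: the gradient and Hessian of the limit state at the input mean define a quadratic surrogate. The limit state and mean below are placeholders; the saddlepoint step itself is not reproduced.

```python
# Finite-difference second-order Taylor expansion of a limit state about
# the input mean, as used conceptually in MVSOSA; g and mu are placeholders.
import numpy as np

def g(x):                                        # hypothetical limit state
    return 3.0 - x[0] ** 2 - 0.5 * x[0] * x[1]

mu, h = np.array([0.5, 1.0]), 1e-4
n = len(mu)
grad, hess = np.zeros(n), np.zeros((n, n))
for i in range(n):
    ei = np.zeros(n); ei[i] = h
    grad[i] = (g(mu + ei) - g(mu - ei)) / (2 * h)
    for j in range(n):
        ej = np.zeros(n); ej[j] = h
        hess[i, j] = (g(mu + ei + ej) - g(mu + ei - ej)
                      - g(mu - ei + ej) + g(mu - ei - ej)) / (4 * h * h)

def g_quadratic(x):
    """Second-order Taylor surrogate of g around mu."""
    d = np.asarray(x) - mu
    return g(mu) + grad @ d + 0.5 * d @ hess @ d
```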
Journal Article

Time-Dependent Reliability Analysis Using a Modified Composite Limit State Approach

2017-03-28
2017-01-0206
Recent developments in time-dependent reliability have introduced the concept of a composite limit state. The composite limit state method can be used to calculate the time-dependent probability of failure for dynamic systems whose limit-state functions depend on input random variables and input random processes and are explicit in time. The probability of failure can be calculated exactly using the composite limit state if the instantaneous limit states are linear, form an open or closed polytope, and are functions of only two random variables. In this work, the restriction on the number of random variables is lifted. The proposed algorithm is accurate and efficient for linear instantaneous limit state functions of any number of random variables. An example on the design of a hydrokinetic turbine blade under time-dependent river flow load demonstrates the accuracy of the proposed general composite limit state approach.
Journal Article

An Improved Reanalysis Method Using Parametric Reduced Order Modeling for Linear Dynamic Systems

2016-04-05
2016-01-1318
Finite element analysis is a standard tool for deterministic or probabilistic design optimization of dynamic systems. The optimization process requires repeated eigenvalue analyses, which can be computationally expensive. Several reanalysis techniques have been proposed to reduce this cost, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Modified Combined Approximations (MCA) method. Although reanalysis substantially reduces the cost, it can still be high for models with a large number of degrees of freedom and a large number of design variables. Reanalysis methods use a basis composed of eigenvectors from both the baseline and the modified designs, which are in general linearly dependent. To eliminate the linear dependency and improve accuracy, Gram-Schmidt orthonormalization is employed, which is itself costly.
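For reference, the orthonormalization step mentioned above in its textbook modified Gram-Schmidt form, applied to the columns of a basis matrix; this is the generic procedure, not the paper's improved variant.

```python
# Textbook modified Gram-Schmidt orthonormalization of basis columns,
# dropping nearly dependent vectors.
import numpy as np

def gram_schmidt(B, tol=1e-12):
    """Return an orthonormal basis for the columns of B."""
    Q = []
    for v in B.T.astype(float).copy():
        for q in Q:
            v = v - (q @ v) * q          # remove components along earlier vectors
        norm = np.linalg.norm(v)
        if norm > tol:                   # keep only independent directions
            Q.append(v / norm)
    return np.column_stack(Q)
```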