Search Results

Journal Article

A Subdomain Approach for Uncertainty Quantification of Long Time Horizon Random Processes

2023-04-11
2023-01-0083
This paper addresses the uncertainty quantification of time-dependent problems excited by random processes represented by a Karhunen-Loève (KL) expansion. The latter expresses a random process as a series of terms involving the dominant eigenvalues and eigenfunctions of the process covariance matrix, weighted by samples of uncorrelated standard normal random variables. For many engineering applications, such as random vibrations, durability, or fatigue, a long time horizon is required for meaningful results. In this case, however, a large number of KL terms is needed, resulting in a very high computational effort for uncertainty propagation. This paper presents a new approach to generate time trajectories (sample functions) of a random process using the KL expansion when the time horizon (duration) is much larger than the process correlation length.
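To make the expansion itself concrete, the sketch below draws sample trajectories from the dominant eigenpairs of a discretized covariance matrix. The exponential covariance, correlation length, and all numerical values are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def kl_sample_paths(t, corr_len, n_terms, n_paths, sigma=1.0, seed=None):
    """Sample paths of a stationary Gaussian process via a truncated
    Karhunen-Loeve expansion (exponential covariance assumed)."""
    rng = np.random.default_rng(seed)
    # Discretized covariance matrix: C_ij = sigma^2 * exp(-|t_i - t_j| / corr_len)
    C = sigma**2 * np.exp(-np.abs(t[:, None] - t[None, :]) / corr_len)
    # Dominant eigenvalues/eigenvectors of the covariance matrix
    vals, vecs = np.linalg.eigh(C)
    order = np.argsort(vals)[::-1][:n_terms]
    vals = np.clip(vals[order], 0.0, None)   # guard against tiny negative eigenvalues
    vecs = vecs[:, order]
    # X(t) = sum_k sqrt(lambda_k) * phi_k(t) * xi_k, with xi_k ~ N(0, 1)
    xi = rng.standard_normal((n_terms, n_paths))
    return vecs @ (np.sqrt(vals)[:, None] * xi)

t = np.linspace(0.0, 10.0, 500)
paths = kl_sample_paths(t, corr_len=1.0, n_terms=50, n_paths=3, seed=42)
```

Stretching the time horizon relative to the correlation length forces `n_terms` up rapidly, which is exactly the computational burden the paper's subdomain approach is designed to avoid.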
Journal Article

Prediction of Spark Timing to Achieve a Specified Torque Profile in Spark-Ignition Engines Using Time-Dependent Metamodeling

2021-04-06
2021-01-0238
The internal combustion engine is a source of unwanted vibration on the vehicle body. The unwanted vibration comes from forces on the engine mounts, which depend on the engine torque during a transient maneuver. In particular, during a tip-in or tip-out maneuver, different torque profiles result in different magnitudes of vibration. A desired engine torque shape can thus be chosen to minimize the unwanted vibration. The desired torque shape can be achieved by controlling a set of engine calibration parameters. This paper provides a methodology to determine the spark timing profile that achieves a desired engine torque profile during a tip-out maneuver. The spark timing profiles are described by a third-order polynomial as a function of time. Sets of coefficients defining the third-order polynomial (design sites) are first generated using design of experiments (DOE).
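The DOE step can be sketched generically: sample coefficient sets for the cubic spark-timing polynomial with a space-filling design. The coefficient bounds and sample count below are hypothetical, not the paper's values.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical bounds on the coefficients of s(t) = a0 + a1*t + a2*t^2 + a3*t^3
lower = [-10.0, -5.0, -1.0, -0.1]
upper = [ 10.0,  5.0,  1.0,  0.1]

# Latin hypercube DOE over the coefficient space (design sites)
sampler = qmc.LatinHypercube(d=4, seed=0)
design_sites = qmc.scale(sampler.random(n=30), lower, upper)

def spark_timing(t, coeffs):
    """Evaluate the third-order spark-timing polynomial at times t."""
    a0, a1, a2, a3 = coeffs
    return a0 + a1 * t + a2 * t**2 + a3 * t**3
```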
Technical Paper

A Methodology of Design for Fatigue Using an Accelerated Life Testing Approach with Saddlepoint Approximation

2019-04-02
2019-01-0159
We present an Accelerated Life Testing (ALT) methodology along with a design-for-fatigue approach using Gaussian or non-Gaussian excitations. The accuracy of fatigue life prediction at nominal loading conditions is affected by model and material uncertainty. This uncertainty is reduced by performing tests at a higher loading level, resulting in a reduction in test duration. Based on the data obtained from experiments, we formulate an optimization problem to calculate the Maximum Likelihood Estimator (MLE) values of the uncertain model parameters. In our proposed ALT method, we lift all assumptions on the type of life distribution or the stress-life relationship, and we use the Saddlepoint Approximation (SPA) method to calculate the fatigue life Probability Density Functions (PDFs).
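The MLE step can be illustrated with a generic negative-log-likelihood minimization. Note that the lognormal life model and Basquin-type stress-life curve below are exactly the kinds of assumptions the paper lifts; they appear here only to make the optimization concrete, and all data are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

# Invented ALT data: stress amplitude (MPa) and observed cycles to failure
stress = np.array([300.0, 300.0, 350.0, 350.0, 400.0, 400.0])
life   = np.array([2.1e5, 1.7e5, 6.3e4, 8.0e4, 2.4e4, 3.1e4])

def neg_log_likelihood(theta):
    """-log L for an illustrative lognormal life model with a
    Basquin-type stress-life curve: median life = A * S**(-b)."""
    log_A, b, log_sigma = theta        # log_sigma keeps sigma > 0 during the search
    median = np.exp(log_A) * stress**(-b)
    return -lognorm.logpdf(life, s=np.exp(log_sigma), scale=median).sum()

res = minimize(neg_log_likelihood, x0=[40.0, 5.0, -1.0], method="Nelder-Mead")
log_A_mle, b_mle, log_sigma_mle = res.x   # MLE values of the uncertain parameters
```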
Journal Article

Reliability and Cost Trade-Off Analysis of a Microgrid

2018-04-03
2018-01-0619
Optimizing the trade-off between reliability and cost of operating a microgrid, including vehicles as both loads and sources, can be a challenge. Optimal energy management is crucial to develop strategies to improve the efficiency and reliability of microgrids, as well as new communication networks to support optimal and reliable operation. Prior approaches modeled the grid using MATLAB, but did not include the detailed physics of loads and sources, and therefore missed the transient effects that are present in real-time operation of a microgrid. This article discusses the implementation of a physics-based detailed microgrid model including a diesel generator, wind turbine, photovoltaic array, and utility. All elements are modeled as sources in Simulink. Various loads are also implemented including an asynchronous motor. We show how a central control algorithm optimizes the microgrid by trying to maximize reliability while reducing operational cost.
Journal Article

Long Life Axial Fatigue Strength Models for Ferrous Powder Metals

2018-04-03
2018-01-1395
Two models are presented for the long-life (10⁷ cycles) axial fatigue strength of four ferrous powder metal (PM) material series: sintered and heat-treated iron-carbon steel, iron-copper and copper steel, iron-nickel and nickel steel, and pre-alloyed steel. The materials are defined over ranges of carbon content and density using the broad data available in the Metal Powder Industries Federation (MPIF) Standard 35 for PM structural parts. The first model evaluates the 10⁷-cycle axial fatigue strength as a function of ultimate strength, and the second as a function of hardness. For all 118 studied materials, both models show a good correlation between calculated and measured 10⁷-cycle axial fatigue strength, with a high Pearson correlation coefficient of 0.97. The article provides details on the model development and the reasoning for selecting ultimate strength and hardness as the best predictors of 10⁷-cycle axial fatigue strength.
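A minimal sketch of the first model type, assuming a simple linear fit (the abstract does not state the functional form) and entirely invented data:

```python
import numpy as np

# Invented data: ultimate tensile strength vs. 10^7-cycle axial fatigue
# strength (both MPa) for a handful of PM materials
uts     = np.array([380.0, 450.0, 520.0, 610.0, 700.0, 820.0, 900.0])
fatigue = np.array([125.0, 150.0, 170.0, 205.0, 230.0, 270.0, 300.0])

# Model 1: fatigue strength as a function of ultimate strength (linear here)
slope, intercept = np.polyfit(uts, fatigue, 1)
predicted = slope * uts + intercept

# Pearson correlation between calculated and measured fatigue strength
r = np.corrcoef(predicted, fatigue)[0, 1]
print(f"fatigue ~ {slope:.3f}*UTS + {intercept:.1f},  r = {r:.3f}")
```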
Technical Paper

Optimal Water Jacket Flow Distribution Using a New Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1017
The availability of computational resources has enabled an increased utilization of Design of Experiments (DoE) and metamodeling (response surface generation) for large-scale optimization problems. Despite algorithmic advances, however, the analysis of systems such as the water jackets of an automotive engine can be computationally demanding, in part due to the required accuracy of the metamodels. Because the metamodels may have many inputs, their accuracy depends on the number of training points and how well they cover the entire design (input) space. For this reason, the space-filling properties of the DoE are very important. This paper utilizes a new group-based DoE algorithm with space-filling groups of points to construct a metamodel. Points are added sequentially so that the space-filling properties of the entire group of points are preserved, and points continue to be added until a specified metamodel accuracy is met.
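The paper's group-based algorithm is not reproduced here, but the underlying idea of sequentially adding points that preserve space-filling can be sketched with a greedy maximin rule; all settings below are hypothetical.

```python
import numpy as np

def add_points_maximin(existing, n_new, n_candidates=5000, seed=None):
    """Sequentially add points to a DOE so the combined set stays
    space-filling (greedy maximin in the unit hypercube)."""
    rng = np.random.default_rng(seed)
    points = np.asarray(existing, dtype=float)
    for _ in range(n_new):
        cands = rng.random((n_candidates, points.shape[1]))
        # Distance from each candidate to its nearest current point
        dmin = np.linalg.norm(cands[:, None, :] - points[None, :, :],
                              axis=2).min(axis=1)
        # Keep the candidate farthest from all current points
        points = np.vstack([points, cands[np.argmax(dmin)]])
    return points

doe = add_points_maximin(np.random.default_rng(0).random((10, 3)), n_new=5, seed=1)
```

In the paper's workflow, the loop would terminate on a metamodel-accuracy check rather than a fixed number of new points.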
Technical Paper

Reliability and Resiliency Definitions for Smart Microgrids Based on Utility Theory

2017-03-28
2017-01-0205
Reliability and resiliency (R&R) definitions differ depending on the system under consideration. Generally, each engineering sector defines relevant R&R metrics pertinent to its systems. While this can impede cross-disciplinary engineering projects as well as research, it is a necessary strategy to capture all the relevant system characteristics. This paper highlights the difficulties associated with defining performance of such systems while using smart microgrids as an example. Further, it develops metrics and definitions, based on utility theory, that are useful in assessing their performance. A microgrid must not only anticipate load conditions but also tolerate partial failures and continue operating optimally. Many of these failures happen infrequently but unexpectedly and are therefore hard to plan for. We discuss real-life failure scenarios and show how the proposed definitions and metrics are beneficial.
Journal Article

Time-Dependent Reliability-Based Design Optimization of Vibratory Systems

2017-03-28
2017-01-0194
A methodology for time-dependent reliability-based design optimization of vibratory systems with random parameters under stationary excitation is presented. The time-dependent probability of failure is computed using an integral equation which involves up-crossing and joint up-crossing rates. The total probability theorem addresses the presence of the system random parameters and a sparse grid quadrature method calculates the integral of the total probability theorem efficiently. The sensitivity derivatives of the time-dependent probability of failure with respect to the design variables are computed using finite differences. The Modified Combined Approximations (MCA) reanalysis method is used to reduce the overall computational cost from repeated evaluations of the system frequency response or equivalently impulse response function. The method is applied to the shape optimization of a vehicle frame under stochastic loading.
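For orientation, the classical baseline that integral-equation methods improve upon is the Poisson (independent up-crossings) approximation. The sketch below implements only that baseline with a made-up up-crossing rate history, not the paper's integral equation with joint up-crossing rates.

```python
import numpy as np

def pf_poisson(nu_plus, t, pf0=0.0):
    """Time-dependent probability of failure assuming independent
    up-crossings: Pf(T) = 1 - (1 - Pf(0)) * exp(-int_0^T nu+(t) dt)."""
    return 1.0 - (1.0 - pf0) * np.exp(-np.trapz(nu_plus, t))

t = np.linspace(0.0, 10.0, 200)
nu_plus = 1e-3 * (1.0 + 0.5 * np.sin(t))   # hypothetical up-crossing rate history
print(f"Pf(0,10) ~ {pf_poisson(nu_plus, t):.4e}")
```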
Journal Article

Mean-Value Second-Order Saddlepoint Approximation for Reliability Analysis

2017-03-28
2017-01-0207
A new second-order Saddlepoint Approximation (SA) method for structural reliability analysis is introduced. The Mean-value Second-order Saddlepoint Approximation (MVSOSA) is presented as an extension to the Mean-value First-order Saddlepoint Approximation (MVFOSA). The proposed method is based on a second-order Taylor expansion of the limit state function around the mean value of the input random variables. It requires not only the first-order but also the second-order sensitivity derivatives of the limit state function. If sensitivity analysis must be avoided because of computational cost, a quadrature integration approach based on sparse grids is also presented and linked to the saddlepoint approximation (Sparse Grid Saddlepoint Approximation, SGSA). The SGSA method is compared with the first- and second-order SA methods in terms of accuracy and efficiency. The proposed MVSOSA and SGSA methods are used in the reliability analysis of two examples.
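The second-order Taylor expansion at the heart of MVSOSA can be sketched with central finite differences; the saddlepoint step itself is omitted, and the step size and test function are arbitrary choices.

```python
import numpy as np

def taylor2_surrogate(g, mu, h=1e-4):
    """Second-order Taylor expansion of a limit state g about the mean mu,
    with gradient and Hessian from central finite differences."""
    mu = np.asarray(mu, dtype=float)
    n = mu.size
    g0 = g(mu)
    grad, hess = np.zeros(n), np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = h
        grad[i] = (g(mu + e) - g(mu - e)) / (2.0 * h)
        hess[i, i] = (g(mu + e) - 2.0 * g0 + g(mu - e)) / h**2
        for j in range(i):
            f = np.zeros(n); f[j] = h
            hess[i, j] = hess[j, i] = (g(mu + e + f) - g(mu + e - f)
                                       - g(mu - e + f) + g(mu - e - f)) / (4.0 * h**2)
    # Quadratic surrogate g(x) ~ g0 + grad.(x-mu) + 0.5 (x-mu)' H (x-mu)
    return lambda x: g0 + grad @ (x - mu) + 0.5 * (x - mu) @ hess @ (x - mu)

g_hat = taylor2_surrogate(lambda x: x[0]**2 + 3.0 * x[1], mu=[1.0, 2.0])
print(g_hat(np.array([1.1, 2.1])))
```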
Journal Article

Time-Dependent Reliability Analysis Using a Modified Composite Limit State Approach

2017-03-28
2017-01-0206
Recent developments in time-dependent reliability have introduced the concept of a composite limit state. The composite limit state method can be used to calculate the time-dependent probability of failure for dynamic systems with limit-state functions of input random variables and input random processes that are explicit in time. The probability of failure can be calculated exactly using the composite limit state if the instantaneous limit states are linear, forming an open or closed polytope, and are functions of only two random variables. In this work, the restriction on the number of random variables is lifted. The proposed algorithm is accurate and efficient for linear instantaneous limit state functions of any number of random variables. An example on the design of a hydrokinetic turbine blade under time-dependent river flow load demonstrates the accuracy of the proposed general composite limit state approach.
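A brute-force Monte Carlo check of a composite (union) limit state with linear instantaneous limit states, against which an analytical method like the paper's could be verified; all coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented linear instantaneous limit states: failure at time t_k when
# g_k(X) = a_k @ X - b_k >= 0, for k = 1..50 and three random variables
a = rng.normal(size=(50, 3))
b = 3.0 + 0.1 * np.arange(50)

# Composite limit state: failure anywhere in time = union over all k
X = rng.standard_normal((200_000, 3))
failed = (X @ a.T - b >= 0).any(axis=1)
print(f"time-dependent Pf ~ {failed.mean():.4f}")
```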
Journal Article

Uncertainty Assessment in Restraint System Optimization for Occupants of Tactical Vehicles

2016-04-05
2016-01-0316
We have recently obtained experimental data and used them to develop computational models to quantify occupant impact responses and injury risks for military vehicles during frontal crashes. The number of experimental tests and model runs is, however, relatively small due to their high cost. While this is true across the auto industry, it is particularly critical for the Army and other government agencies operating under tight budget constraints. In this study we investigate, through statistical simulations, how the injury risk would vary if a large number of experimental tests were conducted. We show that the injury risk distribution is skewed to the right, implying that, although most physical tests result in a small injury risk, there are occasional physical tests for which the injury risk is extremely large. We compute the probabilities of such events and use them to identify optimum design conditions that minimize such probabilities.
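The right-skew phenomenon the abstract describes can be reproduced with a toy simulation: estimate a small tail-probability risk from repeated small test series. All numbers below are invented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy setup: injury metric ~ N(30, 6); injury assumed above a threshold of 45
true_mean, true_sd, threshold = 30.0, 6.0, 45.0
n_tests, n_series = 5, 100_000

# Many virtual series of n_tests each; estimate the injury risk from each series
samples = rng.normal(true_mean, true_sd, size=(n_series, n_tests))
risk = norm.sf(threshold, loc=samples.mean(axis=1),
               scale=samples.std(axis=1, ddof=1))

# Mean well above median indicates a right-skewed risk distribution
print(f"median risk {np.median(risk):.2e}, mean risk {risk.mean():.2e}")
```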
Journal Article

Reanalysis of Linear Dynamic Systems using Modified Combined Approximations with Frequency Shifts

2016-04-05
2016-01-1338
Weight reduction is very important in automotive design because of the stringent demand on fuel economy. Structural optimization of dynamic systems using finite element (FE) analysis plays an important role in reducing weight while simultaneously delivering a product that meets all functional requirements for durability, crash, and NVH. With advancing computer technology, the demand for solving large FE models has grown. Optimization is, however, costly due to repeated full-order analyses. Reanalysis methods can be used in structural vibrations to reduce the analysis cost from repeated eigenvalue analyses for both deterministic and probabilistic problems. Several reanalysis techniques have been introduced over the years, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Epsilon algorithm, among others.
Journal Article

Computational Efficiency Improvements in Topography Optimization Using Reanalysis

2016-04-05
2016-01-1395
To improve fuel economy, there is a trend in the automotive industry to use lightweight, high-strength materials. Automotive body structures are composed of several panels which must be downsized to reduce weight. Because this affects NVH (Noise, Vibration and Harshness) performance, engineers are challenged to recover the panel stiffness lost to down-gaging in order to improve the structure-borne noise transmitted through the lightweight panels in the frequency range of 100-300 Hz, where most of the booming and low-to-medium frequency noise occurs. The loss in performance can be recovered by optimizing panel geometry using beading or damping treatment. Topography optimization is a special class of shape optimization for changing sheet-metal shapes by introducing beads. A large number of design variables can be handled, and the process is easy to set up in commercial codes. However, optimization methods are computationally intensive because of repeated full-order analyses.
Journal Article

An Improved Reanalysis Method Using Parametric Reduced Order Modeling for Linear Dynamic Systems

2016-04-05
2016-01-1318
Finite element analysis is a standard tool for deterministic or probabilistic design optimization of dynamic systems. The optimization process requires repeated eigenvalue analyses, which can be computationally expensive. Several reanalysis techniques have been proposed to reduce the computational cost, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA), and the Modified Combined Approximations (MCA) method. Although the cost of reanalysis is substantially reduced, it can still be high for models with a large number of degrees of freedom and a large number of design variables. Reanalysis methods use a basis composed of eigenvectors from both the baseline and the modified designs, which are in general linearly dependent. To eliminate the linear dependency and improve accuracy, Gram-Schmidt orthonormalization is employed, which is itself costly.
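A generic Combined Approximations sketch (not the paper's PROM-enhanced variant): build a binomial-series basis from the factorized baseline stiffness, orthonormalize it, and solve a small reduced eigenproblem. All dimensions and the QR step standing in for Gram-Schmidt are illustrative choices.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve, eigh, qr

def ca_reanalysis(K0, M0, dK, dM, n_modes=3, n_basis=6):
    """Approximate eigenpairs of the modified system (K0+dK, M0+dM)
    from a Combined Approximations basis built with the factorized
    baseline stiffness K0 (generic CA sketch)."""
    lu = lu_factor(K0)                      # factor the baseline stiffness once
    K, M = K0 + dK, M0 + dM
    # Seed the basis with baseline modes, then apply the binomial series:
    # r_1 = K0^-1 M phi0,  r_{i+1} = -K0^-1 dK r_i
    _, phi0 = eigh(K0, M0, subset_by_index=[0, n_modes - 1])
    blocks = [lu_solve(lu, M @ phi0)]
    for _ in range(n_basis - 1):
        blocks.append(-lu_solve(lu, dK @ blocks[-1]))
    R, _ = qr(np.hstack(blocks), mode="economic")   # orthonormalize the basis
    # Reduced eigenproblem projected onto the basis R
    lam, q = eigh(R.T @ K @ R, R.T @ M @ R)
    return lam[:n_modes], R @ q[:, :n_modes]
```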
Journal Article

A New Metamodeling Approach for Time-Dependent Reliability of Dynamic Systems with Random Parameters Excited by Input Random Processes

2014-04-01
2014-01-0717
We propose a new metamodeling method to characterize the output (response) random process of a dynamic system with random parameters, excited by input random processes. The metamodel can then be used to efficiently estimate the time-dependent reliability of a dynamic system using analytical or simulation-based methods. The metamodel is constructed by decomposing the input random processes using principal components or wavelets and then using a few simulations to estimate the distributions of the decomposition coefficients. A similar decomposition is also performed on the output random process. A kriging model is then established between the input and output decomposition coefficients and subsequently used to quantify the output random process corresponding to a realization of the input random parameters and random processes. What distinguishes our approach from others in metamodeling is that the system input is not deterministic but random.
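A minimal sketch of the decomposition-plus-kriging pipeline, using principal components and scikit-learn's Gaussian process as stand-ins for the paper's specific choices; the input/output trajectories below are placeholders, not a real dynamic system.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

# Placeholder trajectories (rows = simulations, columns = time steps)
U = rng.normal(size=(40, 200))          # input process realizations
Y = np.cumsum(U, axis=1) * 0.1          # stand-in for the system response

# Decompose input and output processes into principal components
pca_u, pca_y = PCA(n_components=5), PCA(n_components=5)
a = pca_u.fit_transform(U)              # input decomposition coefficients
c = pca_y.fit_transform(Y)              # output decomposition coefficients

# Kriging (Gaussian process) map from input to output coefficients
gp = GaussianProcessRegressor().fit(a, c)

# Quantify the output process for a new input realization
u_new = rng.normal(size=(1, 200))
y_new = pca_y.inverse_transform(gp.predict(pca_u.transform(u_new)))
```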
Journal Article

Enhancing Decision Topology Assessment in Engineering Design

2014-04-01
2014-01-0719
Implications of decision analysis (DA) for engineering design are important and well-documented. However, widespread adoption has not occurred. To that end, the authors recently proposed decision topologies (DT) as a visual method for representing decision situations and proved that they are entirely consistent with normative decision analysis. This paper addresses the practical issue of assessing the DTs of a designer using their responses. As in classical DA, this step is critical to encoding the decision maker's (DM's) preferences so that further analysis and mathematical optimization can be performed on the correct set of preferences. We show how multi-attribute DTs can be directly assessed from DM responses. Furthermore, we show that preferences under uncertainty can be trivially incorporated and that topologies can be constructed from single-attribute topologies, similarly to multi-linear functions in utility analysis. This incremental construction simplifies the process of topology construction.
Journal Article

Flexible Design and Operation of a Smart Charging Microgrid

2014-04-01
2014-01-0716
The reliability theory of repairable systems is vastly different from that of non-repairable systems. The authors have recently proposed a ‘decision-based’ framework to design and maintain repairable systems for optimal performance and reliability using a set of metrics such as minimum failure free period, number of failures in planning horizon (lifecycle), and cost. The optimal solution includes the initial design, the system maintenance throughout the planning horizon, and the protocol to operate the system. In this work, we extend this idea by incorporating flexibility and demonstrate our approach using a smart charging electric microgrid architecture. The flexibility is realized by allowing the architecture to change with time. Our approach “learns” the working characteristics of the microgrid. We use actual load and supply data over a short time to quantify the load and supply random processes and also establish the correlation between them.
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation if a suitable sampling distribution is used, in many design problems it is too expensive. The authors have previously proposed a method to manage the computational cost in standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends their method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of the choice of the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
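For context, a textbook importance-sampling estimate of a small failure probability, with a hypothetical limit state; the paper's value-of-information layer sits on top of estimates like this one.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical limit state: failure when x >= beta, with x ~ N(0, 1)
beta, n = 4.5, 10_000

# Sample from a density h centered on the failure region instead of f
x = rng.normal(loc=beta, scale=1.0, size=n)
weights = norm.pdf(x) / norm.pdf(x, loc=beta)      # likelihood ratio f/h
pf_is = np.mean((x >= beta) * weights)

print(f"IS estimate {pf_is:.2e} vs exact {norm.sf(beta):.2e}")
# Standard Monte Carlo with 10,000 samples would almost surely see no failures
```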
Journal Article

A Methodology for Design Decisions using Block Diagrams

2013-04-08
2013-01-0947
Our recent work has shown that representation of systems using a reliability block diagram can be used as a decision making tool. In decision making, we called these block diagrams decision topologies. In this paper, we generalize the results and show that decision topologies can be used to make many engineering decisions and can in fact replace decision analysis for most decisions. We also provide a meta-proof that the proposed method using decision topologies is entirely consistent with decision analysis in the limit. The main advantages of the method are that (1) it provides a visual representation of a decision situation, (2) it can easily model tradeoffs, (3) it can incorporate binary attributes, (4) it can model preferences with limited information, and (5) it can be used in a low-fidelity sense to quickly make a decision.
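A toy rendering of the block-diagram idea: score each alternative by combining attribute "success" values through series and parallel blocks, as in reliability block diagrams. The topology and numbers are invented for illustration and do not reproduce the paper's formalism.

```python
def series(*ps):
    """Series blocks: all attributes must succeed (product rule)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def parallel(*ps):
    """Parallel blocks: at least one attribute succeeds."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical topologies for two design alternatives, each attribute
# mapped to a [0, 1] success score
alt_A = series(0.9, parallel(0.6, 0.7))   # 0.9 * 0.88 = 0.792
alt_B = series(0.8, parallel(0.9, 0.5))   # 0.8 * 0.95 = 0.760
print("choose A" if alt_A > alt_B else "choose B")
```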
Technical Paper

A Cost-Driven Method for Design Optimization Using Validated Local Domains

2013-04-08
2013-01-1385
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models over the entire design space can be costly, we have previously proposed an approach in which design optimization and model validation are performed concurrently using a sequential approach with variable-size local domains. We used test data and statistical bootstrap methods to size each local domain, where the prediction model is considered validated and where design optimization is performed. The method proceeds iteratively until the optimum design is obtained. This method, however, requires test data to be available in each local domain along the optimization path. In this paper, we refine our methodology by using polynomial regression to predict the size and shape of a local domain at some steps along the optimization process without using test data.
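The statistical ingredient, bootstrapping test-versus-model errors to decide whether a local domain counts as validated, can be sketched as follows; the error data and accuracy threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical prediction errors between model and test data at points
# inside a candidate local domain
errors = rng.normal(0.0, 0.05, size=12)

# Bootstrap the mean absolute error to judge whether the model can be
# considered validated in this domain (sketch of the statistical idea only)
boot = np.array([np.abs(rng.choice(errors, size=errors.size, replace=True)).mean()
                 for _ in range(5000)])
upper_95 = np.quantile(boot, 0.95)
validated = upper_95 < 0.10   # hypothetical accuracy requirement
print(f"95% bootstrap bound on |error|: {upper_95:.3f}, validated: {validated}")
```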