
Search Results

Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation if a suitable sampling distribution is used, in many design problems it is too expensive. The authors have previously proposed a method to manage the computational cost in standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends their method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of the choice of the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
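The importance-sampling estimator that this line of work builds on can be sketched in a few lines; the standard-normal variable, shifted-normal proposal, limit state, and sample count below are illustrative choices, not the paper's:

```python
import math
import random

def failure_prob_is(limit_state, n_samples, shift, seed=0):
    """Importance-sampling estimate of P(limit_state(x) < 0) when x is
    standard normal, using a unit-variance normal proposal centered at
    `shift` (ideally placed near the failure region)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.gauss(shift, 1.0)  # draw from the proposal q
        if limit_state(x) < 0:
            # likelihood ratio p(x)/q(x); normalizing constants cancel
            total += math.exp(-0.5 * x * x + 0.5 * (x - shift) ** 2)
    return total / n_samples

# failure when x > 3, a rare event under the standard normal (~0.00135)
pf = failure_prob_is(lambda x: 3.0 - x, 20000, shift=3.0)
```

Centering the proposal on the failure region makes roughly half the samples "hit," which is why importance sampling needs far fewer samples than standard Monte Carlo for rare events.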
Journal Article

Random Vibration Testing Development for Engine Mounted Products Considering Customer Usage

In this paper, the development of random vibration testing schedules for durability design verification of engine-mounted products is presented, based on the equivalent fatigue damage concept and the 95th-percentile customer engine usage data for 150,000 miles. Development of the 95th-percentile customer usage profile is first discussed. Following that, the field engine excitation and engine duty cycle definitions are introduced. Using the simplified transfer function of a single degree-of-freedom (SDOF) system subjected to a base excitation, the response acceleration and stress PSDs are related to the input excitation PSD, which underlies the equivalent fatigue damage concept. The narrow-band fatigue damage spectrum (FDS) is then calculated in terms of the input excitation PSD based on the Miner linear damage rule, the Rayleigh statistical distribution for stress amplitude, a material's S-N curve, and the Miles approximate solution.
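The Miles approximate solution referred to above has a simple closed form; the natural frequency, amplification factor Q, and input PSD level used here are illustrative values, not the paper's data:

```python
import math

def miles_grms(fn_hz, q_factor, input_psd_g2_hz):
    """Miles' approximate solution: RMS acceleration response of a lightly
    damped SDOF system with natural frequency fn and amplification Q,
    driven by a base-excitation PSD that is flat near fn:
        Grms = sqrt((pi / 2) * fn * Q * W(fn))."""
    return math.sqrt(0.5 * math.pi * fn_hz * q_factor * input_psd_g2_hz)

# e.g. fn = 100 Hz, Q = 10, input PSD = 0.04 g^2/Hz
grms = miles_grms(100.0, 10.0, 0.04)
```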
Journal Article

Comparison of Tribological Performance of WS2 Nanoparticles, Microparticles and Mixtures Thereof

Tribological performance of tungsten sulfide (WS2) nanoparticles, microparticles, and mixtures of the two was investigated. Previous research showed that friction and wear reduction can be achieved with nanoparticles. Often these improvements were mutually exclusive, or achieved under special conditions (high temperature, high vacuum) or with hard-to-synthesize inorganic-fullerene WS2 nanoparticles. This study aimed at investigating the friction and wear reduction of WS2 nanoparticles and microparticles that can be synthesized in bulk and/or purchased off the shelf. Mixtures of WS2 nanoparticles and microparticles were also tested to see whether a combination of reduced friction and wear could be achieved. The effect of the mixing process on the morphology of the particles is also reported. The microparticles showed the largest reduction in coefficient of friction, while the nanoparticles showed the largest wear scar area reduction.
Technical Paper

Measurement of Aluminum Edge Stretching Limit Using 3D Digital Image Correlation

This paper introduces an industrial application of the digital image correlation technique to the measurement of the aluminum edge stretching limit. In this study, notch-shape aluminum coupons with three different pre-strain conditions are tested. The edge stretching is performed on a standard MTS machine. A dual-camera 3D Digital Image Correlation (DIC) system is used for full-field measurement of the strain distribution in the thickness direction. An air brush is used to apply a randomly distributed speckle pattern to the edge of the sheet metal. A pair of special optical lens systems is used to observe the small measurement area on the edge. The test results demonstrate that, with respect to the notched coupon thickness, pre-tension does not affect the fracture limit; with respect to the virgin sheet thickness, the average edge stretch thinning limits show a consistent increasing trend as the pre-stretch strain increases.
Technical Paper

Determination of Whole Field Residual Strain Measurement Using 3D-DSPI and Incremental Hole Drilling

An experimental setup utilizing 3D Digital Speckle Pattern Interferometry (DSPI) [1,2] and incremental hole drilling is applied for the non-contact, fast, and accurate determination of residual strain as a function of depth. From the phase maps measured with the DSPI technique, we can determine the surface deformations over a whole-field area around a drilled hole and thus relate these released strains to the residual strains existing in the material. Incremental hole drilling [3,4] has been coupled with residual stress measurement to provide a means of estimating the residual stresses as a function of depth. Unlike traditional holography with manual evaluation [5], the system can quantitatively determine the deformation data in the x, y, and z directions for various depth increments and thus finally provides the residual strains as a function of depth.
Technical Paper

Engine Simulation of a Restricted FSAE Engine, Focusing on Restrictor Modelling

One-dimensional (1D) engine simulation packages are limited in modeling flows through an adverse pressure gradient, where boundary layer separation is more likely to occur, as in the diffuser section of the restrictor. This difficulty usually manifests itself as an engine model that consumes considerable effort (both computational and from the user) in modeling the restrictor. The approach taken in this work provides a flow vs. pressure-drop dependency to the code so that it does not expend excessive effort in the analysis of the restrictor. This approach is similar to that used for valve flow, where a look-up table is typically provided to determine the flow. Experimentally determined flow measurements on a thin-plate orifice, a short restrictor, and a long restrictor are presented and discussed. The developed model gave excellent results in an acyclic steady-state simulation and is being integrated into the full engine model.
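The look-up-table idea borrowed from valve-flow modeling amounts to simple interpolation over measured points; the pressure-drop/flow pairs below are placeholders, not the paper's measurements:

```python
def make_flow_lookup(dp_points, flow_points):
    """Piecewise-linear look-up of mass flow vs. pressure drop, analogous
    to the valve-flow tables used in 1D engine codes; clamps outside the
    measured range."""
    def flow(dp):
        if dp <= dp_points[0]:
            return flow_points[0]
        if dp >= dp_points[-1]:
            return flow_points[-1]
        for i in range(1, len(dp_points)):
            if dp <= dp_points[i]:
                t = (dp - dp_points[i - 1]) / (dp_points[i] - dp_points[i - 1])
                return flow_points[i - 1] + t * (flow_points[i] - flow_points[i - 1])
    return flow

# placeholder calibration points (kPa -> kg/s), not the paper's data
flow = make_flow_lookup([0.0, 5.0, 10.0, 20.0], [0.0, 0.030, 0.050, 0.070])
```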
Technical Paper

An Efficient Possibility-Based Design Optimization Method for a Combination of Interval and Random Variables

Reliability-based design optimization accounts for variation. However, it assumes that statistical information is available in the form of fully defined probability distributions. This is not true for a variety of engineering problems where uncertainty is usually given in terms of interval ranges. In such cases, interval analysis or possibility theory can be used instead of probability theory. This paper shows how possibility theory can be used in design and presents a computationally efficient sequential optimization algorithm. The algorithm handles problems with only uncertain variables, or with a combination of random and uncertain design variables and parameters. It consists of a sequence of cycles, each composed of a deterministic design optimization followed by a set of worst-case reliability evaluation loops. A crank-slider mechanism example demonstrates the accuracy and efficiency of the proposed sequential algorithm.
Technical Paper

Design Optimization and Reliability Estimation with Incomplete Uncertainty Information

Existing methods for design optimization under uncertainty assume that a high level of information is available, typically in the form of data. In reality, however, insufficient data prevents correct inference of probability distributions, membership functions, or interval ranges. In this article we use an engine design example to show that optimal design decisions and reliability estimations depend strongly on uncertainty characterization. We contrast the reliability-based optimal designs to the ones obtained using worst-case optimization, and ask how to obtain non-conservative designs with incomplete uncertainty information. We propose an answer to this question through the use of Bayesian statistics. We estimate the engine's reliability based only on available samples, and demonstrate that the accuracy of our estimates increases as more samples become available.
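One minimal instance of sample-based Bayesian reliability estimation is a Beta-binomial model for pass/fail data; the paper's actual model may differ, and the prior parameters and test counts here are assumptions for illustration:

```python
def reliability_posterior_mean(successes, trials, a=1.0, b=1.0):
    """Posterior mean of reliability under a Beta(a, b) prior updated
    with binomial pass/fail data; Beta(1, 1) is a uniform prior."""
    return (successes + a) / (trials + a + b)

# 9 survivals in 10 tests: the 0.9 point estimate is tempered by the prior
r_hat = reliability_posterior_mean(9, 10)
```

As more samples arrive, the posterior mean converges to the observed success fraction, matching the abstract's point that estimates improve with data.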
Technical Paper

Investigation of the Effects of Autoignition on the Heat Release Histories of a Knocking SI Engine Using Wiebe Functions

In this paper, we develop a methodology to enable the isolation of the heat release contribution of knocking combustion from flame-propagation combustion. We first address the empirical modeling of individual non-autoigniting combustion history using the Wiebe function, and subsequently apply this methodology to investigate the effect of autoignition on the heat release history of knocking cycles in a spark ignition (SI) engine. We start by re-visiting the Wiebe function, which is widely used to model empirically mass burned histories in SI engines. We propose a method to tune the parameters of the Wiebe function on a cycle-by-cycle basis, i.e., generating a different Wiebe to suitably fit the heat release history of each cycle. Using non-autoigniting cycles, we show that the Wiebe function can reliably simulate the heat release history of an entire cycle, if only data from the first portion of the cycle is used in the tuning process.
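The Wiebe function itself is standard; a minimal sketch, with generic shape parameters a and m standing in for the per-cycle tuned values described above:

```python
import math

def wiebe_mfb(theta, theta_0, delta_theta, a=5.0, m=2.0):
    """Wiebe mass-fraction-burned curve
    x_b = 1 - exp(-a * ((theta - theta_0) / delta_theta) ** (m + 1));
    a and m are the shape parameters tuned cycle-by-cycle in the paper
    (the defaults here are generic, not fitted values)."""
    if theta <= theta_0:
        return 0.0
    frac = (theta - theta_0) / delta_theta
    return 1.0 - math.exp(-a * frac ** (m + 1))

# at the end of the nominal burn duration, x_b = 1 - exp(-a)
x_end = wiebe_mfb(420.0, 360.0, 60.0)
```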
Journal Article

Reliability Estimation for Multiple Failure Region Problems using Importance Sampling and Approximate Metamodels

An efficient reliability estimation method is presented for engineering systems with multiple failure regions and potentially multiple most probable points. The method can handle implicit, nonlinear limit-state functions, with correlated or non-correlated random variables, which can be described by any probabilistic distribution. It uses a combination of approximate or “accurate-on-demand,” global and local metamodels which serve as indicators to determine the failure and safe regions. Samples close to limit states define transition regions between safe and failure domains. A clustering technique identifies all transition regions which can be in general disjoint, and local metamodels of the actual limit states are generated for each transition region.
Journal Article

Probabilistic Reanalysis Using Monte Carlo Simulation

An approach for Probabilistic Reanalysis (PRA) of a system is presented. PRA calculates very efficiently the system reliability or the average value of an attribute of a design for many probability distributions of the input variables, by performing a single Monte Carlo simulation. In addition, PRA calculates the sensitivity derivatives of the reliability to the parameters of the probability distributions. The approach is useful for analysis problems where reliability bounds need to be calculated because the probability distribution of the input variables is uncertain, or for design problems where the design variables are random. The accuracy and efficiency of PRA is demonstrated on vibration analysis of a car and on system reliability-based design optimization (RBDO) of an internal combustion engine.
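The single-simulation idea behind PRA can be sketched as likelihood-ratio reweighting of one set of Monte Carlo samples; the normal distributions and limit state below are illustrative, not from the paper:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def reanalysis_pf(samples, g, old, new):
    """Re-estimate P(g(x) < 0) under a new input distribution by
    reweighting samples drawn once under the old one (likelihood-ratio
    reweighting: the single-Monte-Carlo-run idea behind PRA)."""
    (mu0, s0), (mu1, s1) = old, new
    total = 0.0
    for x in samples:
        if g(x) < 0:
            total += normal_pdf(x, mu1, s1) / normal_pdf(x, mu0, s0)
    return total / len(samples)

rng = random.Random(1)
xs = [rng.gauss(0.0, 1.0) for _ in range(50000)]
# failure when x > 2; shift the input mean to 0.5 without resampling
pf_new = reanalysis_pf(xs, lambda x: 2.0 - x, (0.0, 1.0), (0.5, 1.0))
```

The same stored samples can be reweighted for any candidate input distribution, which is what makes evaluating many distributions cheap after one simulation.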
Technical Paper

Imprecise Reliability Assessment When the Type of the Probability Distribution of the Random Variables is Unknown

In reliability design, often, there is scarce data for constructing probabilistic models. It is particularly challenging to model uncertainty in variables when the type of their probability distribution is unknown. Moreover, it is expensive to estimate the upper and lower bounds of the reliability of a system involving such variables. A method for modeling uncertainty by using Polynomial Chaos Expansion is presented. The method requires specifying bounds for statistical summaries such as the first four moments and credible intervals. A constrained optimization problem, in which decision variables are the coefficients of the Polynomial Chaos Expansion approximation, is formulated and solved in order to estimate the minimum and maximum values of a system’s reliability. This problem is solved efficiently by employing a probabilistic re-analysis approach to approximate the system reliability as a function of the moments of the random variables.
Technical Paper

FEA Simulation of Induction Hardening and Residual Stress of Auto Components

The paper studies the distributions of residual stresses in automotive components after induction hardening. Three prototype parts are analyzed. First, the temperature fields of the analyzed parts are quantitatively simulated during quenching by simulating surface heating to the austenitization temperature of the material. Second, the formation and state of the residual stresses are predicted. The simulated distribution of residual stress shows compressive stresses on the surface of the components, which improves their strength. The simulated results are compared with experimental results, and the good agreement indicates that the results obtained by the FEA analysis are reliable. Thus, it can be concluded that the finite element analysis (FEA) program is effective for simulating heating and quenching processes and residual stress distributions.
Technical Paper

The Creation, Development and Implementation of a Lean Systems Course at Oakland University, Rochester, MI

Countless articles and publications [3,4,5] have documented and proven the efficacy, benefits and value of operating within a lean system. Furthermore, there exists common agreement amongst leading organizations successfully implementing a lean system that in order to do so it must take into consideration the entire enterprise, that is, from supplier to customer and everything in between [6]. One of the core issues this paper addresses is when the optimal time is to train and educate the people who currently have, or will have, influence over the ‘enterprise’.
Technical Paper

Reliability Analysis Using Monte Carlo Simulation and Response Surface Methods

An accurate and efficient Monte Carlo simulation (MCS) method is developed in this paper for limit state-based reliability analysis, especially at the system level, by using a response surface approximation of the failure indicator function. The Moving Least Squares (MLS) method is used to construct the response surface of the indicator function, along with an Optimum Symmetric Latin Hypercube (OSLH) as the sampling technique. Similar to MCS, the proposed method can easily handle implicit, highly nonlinear limit-state functions, with variables of any statistical distributions and correlations. However, the efficiency of MCS can be greatly improved. The method appears to be particularly efficient for multiple limit-state and multiple design-point problems. A mathematical example and a practical example are used to highlight the superior accuracy and efficiency of the proposed method over traditional reliability methods.
Technical Paper

Tensile Test for Polymer Plastics with Extreme Large Elongation Using Quad-Camera Digital Image Correlation

Polymer plastics are widely used in automotive lightweight design. Tensile tests are generally used to obtain material stress-strain curves. Due to the nature of plastic materials, a specimen can be elongated to more than several hundred percent of its original length before breaking. Digital Image Correlation (DIC) analysis is a precise, full-field, optical measurement method that has been accepted by industry as a practical in-field testing method. However, with a traditional single-camera or dual-camera DIC system, it is nearly impossible to measure such extreme large strains. This paper introduces a unique experimental procedure for large-elongation measurement. By using a quad-camera DIC system and a data-stitching technique, the strain history of a plastic material under several hundred percent elongation can be measured. With the quad-camera DIC system, the correlation is conducted between two adjacent cameras.
Journal Article

A Methodology for Fatigue Life Estimation of Linear Vibratory Systems under Non-Gaussian Loads

Fatigue life estimation, reliability and durability are important in acquisition, maintenance and operation of vehicle systems. Fatigue life is random because of the stochastic load, the inherent variability of material properties, and the uncertainty in the definition of the S-N curve. The commonly used fatigue life estimation methods calculate the mean (not the distribution) of fatigue life under Gaussian loads using the potentially restrictive narrow-band assumption. In this paper, a general methodology is presented to calculate the statistics of fatigue life for a linear vibratory system under stationary, non-Gaussian loads considering the effects of skewness and kurtosis. The input loads are first characterized using their first four moments (mean, standard deviation, skewness and kurtosis) and a correlation structure equivalent to a given Power Spectral Density (PSD).
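Characterizing a load history by its first four moments is straightforward to sketch with sample-moment estimators (the equivalent-PSD correlation structure mentioned above is not shown); the sample values are illustrative:

```python
def load_moments(x):
    """Sample mean, standard deviation, skewness, and kurtosis of a load
    history -- the four moments used above to characterize non-Gaussian
    input loads."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n  # variance
    m3 = sum((v - mean) ** 3 for v in x) / n  # third central moment
    m4 = sum((v - mean) ** 4 for v in x) / n  # fourth central moment
    std = m2 ** 0.5
    return mean, std, m3 / std ** 3, m4 / m2 ** 2

mean, std, skew, kurt = load_moments([1.0, 2.0, 3.0, 4.0, 5.0])
```

A Gaussian load has skewness 0 and kurtosis 3; departures from those values are what the methodology accounts for.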
Technical Paper

Reliability and Resiliency Definitions for Smart Microgrids Based on Utility Theory

Reliability and resiliency (R&R) definitions differ depending on the system under consideration. Generally, each engineering sector defines relevant R&R metrics pertinent to its system. While this can impede cross-disciplinary engineering projects as well as research, it is a necessary strategy to capture all the relevant system characteristics. This paper highlights the difficulties associated with defining the performance of such systems, using smart microgrids as an example. Further, it develops metrics and definitions that are useful in assessing their performance, based on utility theory. A microgrid must not only anticipate load conditions but also tolerate partial failures and continue operating optimally. Many of these failures happen infrequently but unexpectedly and are therefore hard to plan for. We discuss real-life failure scenarios and show how the proposed definitions and metrics are beneficial.
Technical Paper

3-D Machine-Vision Technique for Rapid 3D Shape Measurement and Surface Quality Inspection

A novel computer vision technique for rapid measurement of surface coordinates is presented. The technique is based on the marriage of a digital fringe projection technique and a fringe-phase extraction algorithm. A digitally controlled video signal in the form of linear and parallel fringes of cosinusoidal intensity variation is projected onto an object. The fringe pattern is perturbed by the three-dimensional object surface with fringe-phase containing information on the depth of the object. A phase extraction algorithm is used to determine the fringe-phase distribution, from which the three-dimensional surface coordinates are determined. The theoretical basis of this technique and some experimental results are presented in this paper.
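A common concrete instance of fringe-phase extraction is the four-step phase-shifting algorithm; the paper's specific extraction algorithm may differ, and the synthetic intensities below are for illustration:

```python
import math

def phase_four_step(i1, i2, i3, i4):
    """Four-step phase shifting: with frames I_k = A + B*cos(phi + k*pi/2),
    the fringe phase is recovered as atan2(I4 - I2, I1 - I3), since
    I4 - I2 = 2B*sin(phi) and I1 - I3 = 2B*cos(phi)."""
    return math.atan2(i4 - i2, i1 - i3)

# synthetic fringe intensities: background A, modulation B, phase 0.7 rad
A, B, phi = 2.0, 1.0, 0.7
frames = [A + B * math.cos(phi + k * math.pi / 2.0) for k in range(4)]
phi_rec = phase_four_step(*frames)
```

Applying this per pixel yields the wrapped fringe-phase map from which, after unwrapping and calibration, the surface depth is computed.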
Technical Paper

Modelling of a Discrete Variable Compression Ratio (VCR) System for Fuel Consumption Evaluation - Part 2: Modelling Results

Variable Compression Ratio systems are an increasingly attractive solution for car manufacturers in order to reduce vehicle fuel consumption. By having the capability to operate with a range of compression ratios, engine efficiency can be significantly increased by operating with a high compression ratio at low loads, where the engine is normally not knock-limited, and with a low compression ratio at high load, where the engine is more prone to knock. In this way, engine efficiency can be maximized without sacrificing performance. This study aims to analyze how the effectiveness of a VCR system is affected by various powertrain and vehicle parameters. By using a Matlab model of a VCR system developed in Part 1 of this work, the influence of the vehicle characteristics, the drive cycle, and of the number of stages used in the VCR system was studied.
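The low-load efficiency benefit of a higher compression ratio can be illustrated to first order with the ideal Otto-cycle relation; the compression ratios and polytropic exponent below are illustrative assumptions, not values from the model in Part 1:

```python
def otto_efficiency(r, gamma=1.35):
    """Ideal Otto-cycle thermal efficiency, eta = 1 - r ** (1 - gamma);
    gamma = 1.35 is a rough value for in-cylinder gas, not a fitted one."""
    return 1.0 - r ** (1.0 - gamma)

# first-order efficiency gain from raising compression ratio from 9 to 12,
# as a VCR system would at low, non-knock-limited loads
gain = otto_efficiency(12.0) - otto_efficiency(9.0)
```

Real gains are smaller because of heat transfer and combustion effects, which is why the drive-cycle modeling described above is needed.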