Search Results

Journal Article

Value of Information for Comparing Dependent Repairable Assemblies and Systems

2018-04-03
2018-01-1103
This article presents an approach for comparing alternative repairable systems and calculating the value of information obtained by testing a specified number of such systems. More specifically, an approach is presented to determine the value of information that comes from field testing a specified number of systems in order to estimate the reliability metric associated with each of the respective repairable systems. Here the reliability of a repairable system is measured by its failure rate. In support of the decision-making effort, the failure rate is translated into an expected utility based on a utility curve that represents the risk tolerance of the decision-maker. The algorithm calculates how the expected value of the decision changes with the sample size; this change represents the value of information obtained from testing.
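
A minimal sketch of the core translation described above: posterior draws of a failure rate are mapped to an expected utility through a risk-tolerance curve. The gamma posterior, the exponential utility form, and all numbers are illustrative assumptions, not the paper's algorithm.

    # Sketch: failure rate -> expected utility via an assumed utility curve.
    import numpy as np

    rng = np.random.default_rng(0)

    def utility(failure_rate, risk_tolerance=0.02):
        # Hypothetical exponential utility: steeper penalty for high
        # failure rates as the decision-maker's risk tolerance shrinks.
        return -np.expm1(failure_rate / risk_tolerance)

    # Posterior draws of the failure rate after observing test data
    # (a gamma posterior is a common conjugate choice for Poisson failures).
    failures, total_hours = 3, 1.2e4
    rate_draws = rng.gamma(shape=failures + 1, scale=1 / total_hours, size=10_000)

    print(f"expected utility: {utility(rate_draws).mean():.4f}")
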
Technical Paper

Random Vibration Analysis Using Quasi-Random Bootstrapping

2018-04-03
2018-01-1104
Reliability analysis of engineering structures such as bridges, airplanes, and cars requires calculation of small failure probabilities. These probabilities can be calculated using standard Monte Carlo simulation, but this method is impractical for most real-life systems because of its high computational cost. Many studies have focused on reducing the computational cost of reliability assessment; these include bootstrapping, Separable Monte Carlo, Importance Sampling, and the Combined Approximations method. The computational cost can also be reduced using an efficient method for deterministic analysis, such as mode superposition, mode acceleration, and the combined acceleration method. This paper presents and demonstrates a method that uses a combination of Sobol quasi-random sequences and bootstrapping to reduce the number of function calls. The study demonstrates that using quasi-random numbers in conjunction with bootstrapping dramatically reduces the computational cost.
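
A short sketch of the combination the abstract describes: a failure probability estimated from scrambled Sobol points, with bootstrapping used to quantify the estimate's variability. The limit state g() is a stand-in, not the paper's model.

    import numpy as np
    from scipy.stats import norm, qmc

    def g(x):
        # Hypothetical limit state: failure when g < 0.
        return 4.0 - x[:, 0] - x[:, 1]

    sampler = qmc.Sobol(d=2, scramble=True, seed=0)
    u = sampler.random(2**14)            # quasi-random points in (0, 1)^2
    x = norm.ppf(u)                      # map to standard normal space
    fail = (g(x) < 0).astype(float)

    rng = np.random.default_rng(0)
    boot = [rng.choice(fail, size=fail.size, replace=True).mean()
            for _ in range(200)]
    print(f"P_f ~ {fail.mean():.4e} +/- {np.std(boot):.1e}")
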
Journal Article

Assessing the Value of Information for Multiple, Correlated Design Alternatives

2017-03-28
2017-01-0208
Design optimization occurs through a series of decisions that are a standard part of the product development process. Decisions are made anywhere from concept selection to the design of the assembly and manufacturing processes. The effectiveness of these decisions depends on the information available to the decision maker. Decision analysis provides a structured approach for quantifying the value of information that may be provided to the decision maker. This paper presents a process for determining the value of information that can be gained by evaluating linearly correlated design alternatives. A unique approach to the application of Bayesian Inference is used to provide simulated estimates of the expected utility with increasing observation sizes. The results provide insight into the optimum observation size that maximizes the expected utility when assessing correlated decision alternatives.
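
A hedged sketch of the underlying mechanism (not the paper's exact procedure): a conjugate normal-normal Bayesian update of an alternative's performance, tracking how the posterior-expected utility changes as the observation size grows. Priors, noise levels, and the utility form are assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    prior_mu, prior_var = 0.0, 1.0      # assumed prior on performance
    noise_var = 0.5                     # assumed observation noise
    true_perf = 0.4

    for n in (1, 4, 16, 64):
        obs = rng.normal(true_perf, np.sqrt(noise_var), size=n)
        post_var = 1 / (1 / prior_var + n / noise_var)
        post_mu = post_var * (prior_mu / prior_var + obs.sum() / noise_var)
        # Risk-averse utility of a normal outcome (assumed form):
        # E[-exp(-a*X)] = -exp(-a*mu + a^2*var/2) for X ~ N(mu, var).
        a = 1.0
        eu = -np.exp(-a * post_mu + 0.5 * a**2 * post_var)
        print(f"n={n:3d}  posterior mean={post_mu:+.3f}  E[utility]={eu:+.4f}")
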
Technical Paper

Inverse Modeling: Theory and Engineering Examples

2016-04-05
2016-01-0267
Over the last two decades, inverse problems have become increasingly popular due to their widespread applications. This popularity continually demands that designers find alternative methods for solving inverse problems that are both accurate and computationally efficient. This paper presents a method for solving inverse problems through Artificial Neural Network (ANN) theory. The paper also presents a method to apply the Grey Wolf Optimizer (GWO) algorithm to inverse problems. GWO is a recent optimization method that has produced superior results. Both methods are then compared to traditional methods such as Particle Swarm Optimization (PSO) and Markov Chain Monte Carlo (MCMC). Four typical engineering design problems are used to compare the four methods. The results show that GWO outperforms the other methods in both efficiency and accuracy.
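
For readers unfamiliar with GWO, here is a minimal sketch of its core update on a generic minimization problem: the three best "wolves" (alpha, beta, delta) pull the pack toward them with a pull strength that decays over iterations. The objective and bounds are placeholders; the paper's inverse-problem setup is not reproduced.

    import numpy as np

    def gwo(f, dim, n_wolves=20, iters=100, lb=-5.0, ub=5.0, seed=0):
        rng = np.random.default_rng(seed)
        X = rng.uniform(lb, ub, (n_wolves, dim))
        for t in range(iters):
            a = 2.0 * (1 - t / iters)          # decreases linearly 2 -> 0
            order = np.argsort([f(x) for x in X])
            leaders = X[order[:3]]             # alpha, beta, delta wolves
            Xnew = np.zeros_like(X)
            for leader in leaders:
                A = 2 * a * rng.random((n_wolves, dim)) - a
                C = 2 * rng.random((n_wolves, dim))
                D = np.abs(C * leader - X)     # distance to the leader
                Xnew += leader - A * D
            X = np.clip(Xnew / 3.0, lb, ub)    # average of the three pulls
        best = min(X, key=f)
        return best, f(best)

    x_star, f_star = gwo(lambda x: np.sum(x**2), dim=4)
    print(x_star, f_star)
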
Journal Article

Bootstrapping and Separable Monte Carlo Simulation Methods Tailored for Efficient Assessment of Probability of Failure of Structural Systems

2015-04-14
2015-01-0420
There is randomness in both the applied loads and the strength of systems. Therefore, to account for the uncertainty, the safety of a system must be quantified using its reliability. Monte Carlo Simulation (MCS) is widely used for probabilistic analysis because of its robustness, but its high computational cost limits the accuracy it can achieve. Smarslok et al. [2010] developed an improved sampling technique for reliability assessment, called Separable Monte Carlo (SMC), that can significantly increase the accuracy of estimation without increasing the cost of sampling. That method, however, was limited to time-invariant problems involving two random variables. This paper extends SMC to problems with multiple random variables and develops a novel method for estimating the standard deviation of the probability of failure of a structure. The method is demonstrated and validated on the reliability assessment of an offshore wind turbine under turbulent wind loads.
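
A sketch of the Separable Monte Carlo idea: when the limit state separates into load versus capacity, every load sample can be compared against every capacity sample, re-using both sets rather than pairing them one-to-one. The distributions below are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    loads = rng.gumbel(loc=50.0, scale=5.0, size=2_000)                 # N samples
    capacity = rng.lognormal(mean=np.log(80), sigma=0.08, size=2_000)   # M samples

    # Crude MC pairs sample i with sample i (N comparisons);
    # SMC compares all N*M pairs, reducing the estimator's variance.
    pf_crude = np.mean(loads > capacity)
    pf_smc = np.mean(loads[:, None] > capacity[None, :])
    print(f"crude MC: {pf_crude:.4e}   separable MC: {pf_smc:.4e}")
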
Technical Paper

Combined Approximation for Efficient Reliability Analysis of Linear Dynamic Systems

2015-04-14
2015-01-0424
The Combined Approximation (CA) method is an efficient reanalysis method that aims at reducing the cost of optimization problems. CA uses the results of a single exact analysis, and it is suitable for different types of structures and design variables. The second author utilized CA to calculate the frequency response function of a system at a frequency of interest by using results at a nearby frequency, and showed that CA yields accurate results for small frequency perturbations. This work demonstrates a methodology that utilizes CA to reduce the cost of Monte Carlo simulation (MCS) of linear systems under random dynamic loads. The main idea is to divide the power spectral density function (PSD) of the input load into several frequency bins before calculating the load realizations.
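
An illustrative sketch of the binning step named above: the input-load PSD is divided into frequency bins and the variance carried by each bin is tallied before realizations are generated per bin (the CA-based reanalysis itself is not reproduced). The PSD shape is a placeholder.

    import numpy as np

    freqs = np.linspace(0.1, 50.0, 500)              # rad/s grid
    df = freqs[1] - freqs[0]
    psd = 1.0 / (1.0 + (freqs / 10.0) ** 4)          # hypothetical one-sided PSD

    n_bins = 5
    for k, idx in enumerate(np.array_split(np.arange(freqs.size), n_bins)):
        var_k = psd[idx].sum() * df                  # variance carried by bin k
        print(f"bin {k}: {freqs[idx][0]:5.1f}-{freqs[idx][-1]:5.1f} rad/s, "
              f"variance {var_k:.4f}")
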
Technical Paper

Multi-Level Decoupled Optimization of Wind Turbine Structures

2015-04-14
2015-01-0434
This paper proposes a multi-level decoupled method for optimizing the structural design of a wind turbine blade. The proposed method reduces the design space by employing a two-level optimization process. At the high level, the structural properties of each section are approximated by an exponential function of the distance of that section from the blade root, and the high-level design variables are the coefficients of this approximating function; target values for the structural properties of the blade are determined at this level. At the low level, sections are divided into small decoupled groups. For each section, the low-level optimizer finds the minimum-mass thickness of the laminate layers whose structural properties meet the targets determined by the high-level optimizer. In the proposed method, each low-level optimizer considers only a small number of design variables for a particular section, whereas traditional, single-level methods consider all design variables simultaneously.
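
A minimal sketch of the high-level parameterization described above, with a hypothetical exponential form and coefficient values; only the coefficients would be high-level design variables.

    import numpy as np

    def section_property(r, c0, c1):
        # r: normalized distance from the blade root in [0, 1] (assumed scaling)
        return c0 * np.exp(-c1 * r)

    r = np.linspace(0.0, 1.0, 10)                        # ten blade sections
    targets = section_property(r, c0=4.0e6, c1=3.2)      # hypothetical stiffness targets
    print(targets)
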
Technical Paper

Reliability Analysis of Composite Inflatable Space Structures Considering Fracture Failure

2014-04-01
2014-01-0715
Inflatable space structures can have lower launch cost and larger habitat volume than their conventional rigid counterparts. These structures are made of composite laminates, and they are flexible when folded and partially inflated. They contain light-activated resins and can be cured by sunlight after being inflated in space. A spacecraft can burst due to cracks caused by meteor showers or debris. Therefore, it is critical to identify the important fracture failure modes and assess their probability. This information helps a designer minimize the risk of failure while keeping mass and cost low. This paper presents a probabilistic approach for finding the required thickness of an inflatable habitat shell for a prescribed reliability level, and demonstrates the superiority of probabilistic design over its deterministic counterpart.
Journal Article

Probability of Failure of Dynamic Systems by Importance Sampling

2013-04-08
2013-01-0607
Estimation of the probability of failure of mechanical systems under random loads is computationally expensive, especially for very reliable systems with low probabilities of failure. Importance Sampling can be an efficient tool for static problems if a proper sampling distribution is selected. This paper presents a methodology to apply Importance Sampling to dynamic systems in which both the load and response are stochastic processes. The method is applicable to problems for which the input loads are stationary and Gaussian and are represented by power spectral density functions. Shinozuka's method is used to generate random time histories of excitation. The method is demonstrated on a linear quarter car model. This approach is more efficient than standard Monte Carlo simulation by several orders of magnitude.
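
A minimal sketch of Shinozuka's spectral representation named above: a stationary Gaussian load realization synthesized from a one-sided PSD by superposing cosines with random phases. The PSD and all parameters are placeholders, not the paper's quarter-car inputs.

    import numpy as np

    rng = np.random.default_rng(0)
    freqs = np.linspace(0.5, 60.0, 200)          # rad/s
    dw = freqs[1] - freqs[0]
    psd = 0.2 / (1.0 + (freqs / 15.0) ** 2)      # hypothetical one-sided PSD

    t = np.linspace(0.0, 10.0, 2_000)
    phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
    amps = np.sqrt(2.0 * psd * dw)
    # x(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k)
    x = (amps[:, None]
         * np.cos(freqs[:, None] * t[None, :] + phases[:, None])).sum(axis=0)
    print(x.shape, x.std())
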
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation if a suitable sampling distribution is used, in many design problems it is too expensive. The authors have previously proposed a method to manage the computational cost in standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends their method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of the choice of the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
Journal Article

Estimation of High-Cycle Fatigue Life by using Re-analysis

2012-04-16
2012-01-0066
In the design of real-life systems, such as the suspension of a car, an offshore platform, or a wind turbine, there are significant uncertainties in the model of the inputs. For example, scarcity of data leads to inaccuracies in the power spectral density function of the waves and the probability distribution of the wind speed. Therefore, it is necessary to evaluate the performance and safety of a system for different probability distributions, which is computationally expensive or even impractical if each distribution requires a new simulation. This paper presents a methodology to efficiently assess the fatigue life of structures for different power spectra of the applied loads. We accomplish this by reweighting the incremental damage calculated in one simulation. We demonstrate the accuracy and efficiency of the proposed method on an example involving a nonlinear quarter car under a random dynamic load, in which the fatigue life of the suspension spring is calculated under loads generated by a sampling spectrum.
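
A hedged sketch of the reweighting idea: incremental damage computed under a sampling distribution of load amplitudes is reweighted by a likelihood ratio to estimate damage under a different distribution, with no new simulations. The distributions and the toy damage model are stand-ins for the paper's spectral setup.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    amps = rng.normal(10.0, 2.0, size=5_000)        # sampled load amplitudes
    damage = np.maximum(amps, 0.0) ** 3 * 1e-6      # toy incremental damage

    p_sample = norm(10.0, 2.0)                      # distribution used for sampling
    p_new = norm(11.0, 2.5)                         # distribution of interest
    w = p_new.pdf(amps) / p_sample.pdf(amps)        # likelihood-ratio weights

    print(f"damage under sampling pdf : {damage.mean():.4e}")
    print(f"reweighted for new pdf    : {(w * damage).mean():.4e}")
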
Journal Article

Efficient Random Vibration Analysis Using Markov Chain Monte Carlo Simulation

2012-04-16
2012-01-0067
Reliability assessment of dynamic systems with low failure probability can be very expensive. This paper presents and demonstrates a method that uses the Metropolis-Hastings algorithm to sample from an optimal probability density function (PDF) of the random variables. This function is the true PDF truncated over the failure region. For a system subjected to time-varying excitation, Shinozuka's method is employed to generate time histories of the excitation. Random values of the frequencies and the phase angles of the excitation are drawn from the optimal PDF. It is shown that running the subset simulation with the proposed approach, which uses Shinozuka's method, is more efficient than the original subset simulation. The main reasons are that the approach involves only 10 to 20 random variables, and it takes advantage of the symmetry of the expression of the displacement as a function of the inputs. The paper demonstrates the method on two examples.
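
A minimal Metropolis-Hastings sketch of the kind of conditional-on-failure sampling described above: drawing from a standard normal PDF truncated over a failure region {x > b}. The target, threshold, and proposal width are placeholders.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    b = 3.0                                      # hypothetical failure threshold

    def log_target(x):
        # Standard normal truncated to the failure region x > b.
        return norm.logpdf(x) if x > b else -np.inf

    x = b + 0.5                                  # start inside the failure region
    samples = []
    for _ in range(20_000):
        prop = x + rng.normal(0.0, 0.5)          # random-walk proposal
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        samples.append(x)

    print(f"mean of conditional samples: {np.mean(samples[2_000:]):.3f}")
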
Technical Paper

Managing the Computational Cost in a Monte Carlo Simulation by Considering the Value of Information

2012-04-16
2012-01-0915
Monte Carlo simulation is a popular tool for reliability assessment because of its robustness and ease of implementation. A major concern with this method is its computational cost; standard Monte Carlo simulation requires quadrupling the number of replications to halve the standard deviation of the estimated failure probability. Efforts to increase efficiency focus on intelligent sampling procedures and methods for efficient calculation of the performance function of a system. This paper proposes a new method to manage cost that views design as a decision among alternatives with uncertain reliabilities. Information from a simulation has value only if it enables the designer to make a better choice among the alternative options. Consequently, the value of information from the simulation is equal to the gain from using this information to improve the decision. Using the method, a designer can determine the number of replications that are worth performing.
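
A quick check of the cost scaling stated above: the standard deviation of a Monte Carlo failure-probability estimate is sqrt(p*(1-p)/N), so quadrupling the replications N halves it. The probability value below is illustrative.

    import numpy as np

    p = 1e-3
    for N in (10_000, 40_000, 160_000):
        print(f"N={N:>7d}  std of estimate = {np.sqrt(p * (1 - p) / N):.2e}")
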
Technical Paper

System Failure Identification using Linear Algebra: Application to Cost-Reliability Tradeoffs under Uncertain Preferences

2012-04-16
2012-01-0914
Reaching a system-level reliability target is an inverse problem: component-level reliabilities are determined for a required system-level reliability. Because this inverse problem does not have a unique solution, one approach is to trade off system reliability with cost and to allow the designer to select a design with a target system reliability using his/her preferences. In this case, the component reliabilities are readily available from the calculation of the reliability-cost tradeoff. In arriving at the set of solutions to be traded off, one encounters two problems. First, the system reliability calculation is based on repeated system simulations in which each system state, indicating which components work and which have failed, is tested to determine whether it causes system failure. Second, eliciting and encoding the decision maker's preferences is extremely difficult because of uncertainty in modeling those preferences.
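
A sketch of the state-enumeration step described above: every component work/fail state is tested against a system structure function and its probability is accumulated. The series-parallel layout and reliabilities are stand-ins for an actual system model.

    import itertools
    import numpy as np

    def system_works(state):
        # Hypothetical system: component 0 in series with the parallel pair (1, 2).
        return state[0] and (state[1] or state[2])

    rel = np.array([0.95, 0.90, 0.85])           # assumed component reliabilities
    system_rel = 0.0
    for state in itertools.product([0, 1], repeat=3):
        if system_works(state):
            system_rel += np.prod([r if s else 1 - r
                                   for r, s in zip(rel, state)])
    print(f"system reliability: {system_rel:.4f}")
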
Journal Article

Time-Dependent Reliability of Random Dynamic Systems Using Time-Series Modeling and Importance Sampling

2011-04-12
2011-01-0728
Reliability is an important engineering requirement for consistently delivering acceptable product performance through time. As time progresses, the product may fail due to time-dependent operating conditions and material properties, component degradation, etc. The degradation of reliability with time may increase the lifecycle cost due to potential warranty costs, repairs, and loss of market share. Reliability is the probability that the system will perform its intended function successfully for a specified time interval. In this work, we consider first-passage reliability, which accounts for the first-time failure of non-repairable systems. Methods available in the literature provide an upper bound on the true reliability that may overestimate it considerably, while Monte Carlo simulations are accurate but computationally expensive.
Journal Article

Multi-Objective Decision Making under Uncertainty and Incomplete Knowledge of Designer Preferences

2011-04-12
2011-01-1080
Multi-attribute decision making and multi-objective optimization complement each other. Often, while making design decisions involving multiple attributes, a Pareto front is generated using a multi-objective optimizer, and the end user then chooses the optimal design from the Pareto front based on his/her preferences. This seemingly simple methodology requires substantial modification if uncertainty is present. We explore two kinds of uncertainty in this paper: uncertainty in the decision variables, which we call inherent design problem (IDP) uncertainty, and uncertainty in knowledge of the preferences of the decision maker, which we refer to as preference assessment (PA) uncertainty. From a utility-theory perspective, a rational decision maker maximizes his or her expected multi-attribute utility.
Journal Article

Efficient Probabilistic Reanalysis and Optimization of a Discrete Event System

2011-04-12
2011-01-1081
This paper presents a methodology to evaluate and optimize discrete event systems, such as an assembly line or a call center. First, the methodology estimates the performance of a system for a single probability distribution of the inputs. Probabilistic Reanalysis (PRRA) uses this information to evaluate the effect of changes in the system configuration on its performance. PRRA is integrated with a program to optimize the system. The proposed methodology is dramatically more efficient than one requiring a new Monte Carlo simulation each time we change the system. We demonstrate the approach on a drilling center and an electronic parts factory.
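
A hedged sketch of the Probabilistic Reanalysis (PRRA) idea named above: performance samples drawn under one input distribution are reweighted by a likelihood ratio to predict performance under a modified distribution, avoiding a new Monte Carlo run per configuration. The queueing-style model is a toy.

    import numpy as np
    from scipy.stats import expon

    rng = np.random.default_rng(0)
    service = rng.exponential(scale=2.0, size=10_000)   # sampled service times
    throughput = 60.0 / (1.0 + service)                 # toy performance measure

    base = expon(scale=2.0)                             # sampled configuration
    modified = expon(scale=1.6)                         # candidate configuration
    w = modified.pdf(service) / base.pdf(service)       # likelihood-ratio weights

    print(f"baseline performance  : {throughput.mean():.3f}")
    print(f"reanalyzed (no new MC): {(w * throughput).mean():.3f}")
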
Journal Article

A Re-Analysis Methodology for System RBDO Using a Trust Region Approach with Local Metamodels

2010-04-12
2010-01-0645
A simulation-based, system reliability-based design optimization (RBDO) method is presented that can handle problems with multiple failure regions and correlated random variables, with copulas used to represent the correlation. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with a trust-region optimization approach and local metamodels covering each trust region. PRRA calculates very efficiently the system reliability of a design by performing a single Monte Carlo (MC) simulation per trust region. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. The PRRA method is based on importance sampling. It provides accurate results if the support of the sampling PDF contains the support of the joint PDF of the input random variables, and the sequential, trust-region optimization approach satisfies this requirement.
Technical Paper

Modeling Dependence and Assessing the Effect of Uncertainty in Dependence in Probabilistic Analysis and Decision Under Uncertainty

2010-04-12
2010-01-0697
A complete probabilistic model of uncertainty in probabilistic analysis and design problems is the joint probability distribution of the random variables. Often, it is impractical to estimate this joint probability distribution because the mechanism of the dependence of the variables is not completely understood. This paper proposes modeling dependence by using copulas and demonstrates their representational power. It also compares this representation with a Monte Carlo simulation using dispersive sampling.
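
A minimal Gaussian-copula sketch of the dependence modeling described above: a correlation structure is imposed on two variables with arbitrary marginals. The marginals and correlation value are assumptions for illustration.

    import numpy as np
    from scipy.stats import norm, lognorm, gumbel_r

    rng = np.random.default_rng(0)
    rho = 0.7
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=5_000)
    u = norm.cdf(z)                                  # correlated uniforms (the copula)

    x1 = lognorm(s=0.25, scale=50.0).ppf(u[:, 0])    # e.g., a strength variable
    x2 = gumbel_r(loc=30.0, scale=4.0).ppf(u[:, 1])  # e.g., a load variable
    print(f"sample correlation: {np.corrcoef(x1, x2)[0, 1]:.2f}")
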
Journal Article

An RBDO Method for Multiple Failure Region Problems using Probabilistic Reanalysis and Approximate Metamodels

2009-04-20
2009-01-0204
A Reliability-Based Design Optimization (RBDO) method for multiple failure regions is presented. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with an approximate global metamodel with local refinements; the latter serves as an indicator to determine the failure and safe regions. PRRA calculates very efficiently the system reliability of a design by performing a single Monte Carlo (MC) simulation. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. An “accurate-on-demand” metamodel is used in the PRRA that allows us to handle problems with multiple disjoint failure regions and potentially multiple most-probable points (MPPs). The multiple failure regions are identified using a clustering technique, and a maximin “space-filling” sampling technique is used to construct the metamodel. A vibration absorber example highlights the potential of the proposed method.