Search Results

Viewing 1 to 10 of 10
Journal Article

Probabilistic Life and Damage Assessment of Components under Fatigue Loading

2015-09-29
2015-01-2759
This study presents a probabilistic life (failure) and damage assessment approach for components under general fatigue loadings, including constant amplitude loading, step-stress loading, and variable amplitude loading. The approach consists of two parts: (1) an empirical probabilistic distribution obtained by fitting the fatigue failure data at various stress range levels, and (2) an inverse technique, which transforms the probabilistic life distribution into the probabilistic damage distribution at any applied cycle. With this approach, closed-form solutions of damage as a function of the applied cycle can be obtained for constant amplitude loading. Under step-stress and variable amplitude loadings, the damage distribution at any cycle can be calculated from the accumulative damage model in a cycle-by-cycle manner. For Gaussian-type random loading, an equivalent, but much simpler, closed-form solution can be derived.
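The cycle-by-cycle accumulative damage calculation described in this abstract can be sketched as follows. This is an illustrative linear (Palmgren-Miner) accumulation with a Basquin-type S-N curve; the constants `A` and `m` are placeholder values, not from the paper, and the paper's probabilistic damage distributions are not reproduced here.

```python
import math

def basquin_life(stress_range, A=1e12, m=3.0):
    """Median cycles to failure from a Basquin-type S-N curve, N = A * S**(-m).
    A and m are illustrative placeholder values."""
    return A * stress_range ** (-m)

def miner_damage(loading_blocks, A=1e12, m=3.0):
    """Accumulate damage block by block with the linear Miner rule:
    D = sum(n_i / N_i), where loading_blocks is a list of
    (stress_range, applied_cycles) pairs, as in step-stress loading."""
    damage = 0.0
    for s, n in loading_blocks:
        damage += n / basquin_life(s, A, m)
    return damage

# Step-stress example: two blocks at different stress ranges.
blocks = [(200.0, 50_000), (300.0, 10_000)]
d = miner_damage(blocks)  # damage fraction; failure is predicted at D = 1
```

Under this deterministic sketch, failure is predicted when the accumulated damage reaches 1; the paper's contribution is to treat the damage at any cycle as a distribution rather than a single number.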
Journal Article

Statistical Characterization, Pattern Identification, and Analysis of Big Data

2017-03-28
2017-01-0236
In the Big Data era, the capability in statistical and probabilistic data characterization, data pattern identification, and data modeling and analysis is critical to understanding the data, finding trends in the data, and making better use of the data. In this paper the fundamental probability concepts and several commonly used probabilistic distribution functions, such as the Weibull for spectrum events and the Pareto for extreme/rare events, are described first. An event quadrant is subsequently established based on the commonality/rarity and impact/effect of the probabilistic events. Level of measurement, which is the key to quantitative measurement of the data, is also discussed within the framework of probability. The damage density function, which is a measure of the relative damage contribution of each constituent, is proposed. The new measure demonstrates its capability in distinguishing between the extreme/rare events and the spectrum events.
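The two distribution families named in this abstract can be sketched directly; the code below shows the standard two-parameter Weibull density and Pareto survival function only, as an assumed illustration of "spectrum" versus "extreme/rare" behavior. The paper's event quadrant and damage density function are not reproduced.

```python
import math

def weibull_pdf(x, beta, eta):
    """Two-parameter Weibull density, commonly used for spectrum
    (frequent, moderate-magnitude) events; beta is shape, eta is scale."""
    return (beta / eta) * (x / eta) ** (beta - 1) * math.exp(-((x / eta) ** beta))

def pareto_sf(x, xm, alpha):
    """Pareto survival function P(X > x): a heavy power-law tail,
    appropriate for extreme/rare events; xm is the minimum, alpha the index."""
    return (xm / x) ** alpha if x >= xm else 1.0
```

The qualitative contrast is in the tails: the Weibull survival probability decays exponentially fast, while the Pareto tail decays only as a power of `x`, so rare events remain far more probable under the Pareto model.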
Technical Paper

Development of Probabilistic Fatigue Life Distribution Functions with Lower and Upper Bounds

2017-03-28
2017-01-0354
A probabilistic distribution function roughly consists of two parts: the middle part and the tails. The fatigue life distribution at a stress/load level is often described with two-parameter lognormal or Weibull distribution functions, which are more suitable for modeling the mean (middle) behaviors. The domains of the conventional probabilistic distribution functions are often unbounded, either infinitely small (0 for the two-parameter Weibull), infinitely large, or both. For most materials in low- and medium-cycle fatigue regimes, the domains of fatigue lives are usually bounded, and the inclusion of the bounds in a probabilistic model is often critical in some applications, such as product validation and life management. In this paper, four- and five-parameter Weibull distribution functions for the probabilistic distributions with bounds are developed. Finally, the applications of these new models in fatigue data analysis and damage assessment are provided and discussed.
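A minimal sketch of a bounded life distribution is the standard three-parameter Weibull, whose threshold parameter supplies the lower bound discussed above. The four- and five-parameter forms developed in the paper add an upper bound as well; only the conventional lower-bounded form is shown here, with illustrative parameter values.

```python
import math

def weibull3_cdf(t, beta, eta, gamma):
    """Three-parameter Weibull CDF with a lower-bound (threshold) life gamma:
    F(t) = 1 - exp(-(((t - gamma) / eta) ** beta)) for t > gamma, else 0.
    No failure can occur before gamma cycles, so the life domain is bounded
    below, unlike the two-parameter form whose domain starts at 0."""
    if t <= gamma:
        return 0.0
    return 1.0 - math.exp(-(((t - gamma) / eta) ** beta))

# Example: with gamma = 2000 cycles, the failure probability is exactly
# zero below 2000 cycles, whatever the shape and scale parameters are.
p_early = weibull3_cdf(1000.0, 1.5, 5000.0, 2000.0)
```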
Technical Paper

Accelerated Reliability Demonstration Methods Based on Three-Parameter Weibull Distribution

2017-03-28
2017-01-0202
The life testing (test-to-failure) method and the binomial testing method are the two most commonly used methods in product validation and reliability demonstration. The two-parameter Weibull distribution function is often used in life testing and almost exclusively used in extended time testing, which can be considered an accelerated testing method that appropriately extends the testing time while significantly reducing the number of testing samples. However, fatigue data from a wide variety of sources indicate that the three-parameter Weibull distribution function, with a threshold parameter at the left tail, is more appropriate for fatigue life data with large sample sizes. The uncertainties introduced by the assumptions about the underlying probabilistic distribution significantly affect the interpretation of the test data and the assessment of the performance of the accelerated binomial testing methods; therefore, the selection of a probabilistic model is critically important.
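The extended time testing idea mentioned above has a well-known closed form under the conventional two-parameter Weibull assumption: testing each of `n` samples to `L` times the target life with zero failures demonstrates reliability `R` at confidence `C` when `n = ln(1 - C) / (L**beta * ln(R))`. The sketch below uses that conventional form; the paper's point is that a three-parameter Weibull with a threshold would change this result.

```python
import math

def extended_test_samples(R, C, L, beta):
    """Zero-failure sample size for extended time testing under a
    two-parameter Weibull life model with shape beta, where each sample
    is tested to L times the target life:
    n = ln(1 - C) / (L**beta * ln(R)), rounded up."""
    return math.ceil(math.log(1.0 - C) / (L ** beta * math.log(R)))

# At L = 1 this reduces to the classic success-run result
# (e.g. 22 samples for R90C90); extending the test time to L = 2
# with beta = 2 cuts the required sample count by a factor of L**beta = 4.
n_base = extended_test_samples(0.90, 0.90, 1.0, 2.0)
n_extended = extended_test_samples(0.90, 0.90, 2.0, 2.0)
```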
Technical Paper

Equilibrium Mechanism Based Linear Curve Fitting Method and Its Application

2011-04-12
2011-01-0785
The equilibrium mechanism, which can be considered the basis of the least squares method for linear curve fitting, is investigated in this paper. Both conventional methods, such as the vertical offsets method, and total least squares methods, such as the perpendicular offsets method, are examined. It is found that both classes of methods have an equilibrium basis. However, the conventional methods may give inaccurate predictions when the vertical offsets method is used to fit data with variation in the horizontal direction, or when the horizontal offsets method is used to fit data with variation in the vertical direction, whereas the perpendicular offsets method gives the best-fit solution for data with variation in both directions. The application of these methods to fatigue S-N curve data analysis and to the two-parameter Weibull distribution in exhaust component fatigue life prediction is also presented.
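The contrast between vertical-offsets and perpendicular-offsets fitting can be sketched with the standard closed-form solutions for a straight line. The orthogonal-regression slope formula below is the textbook total least squares result for two variables, not code from the paper.

```python
import math

def fit_vertical(xs, ys):
    """Ordinary least squares: minimizes vertical offsets only,
    i.e. all the scatter is assumed to be in y. Returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return b, my - b * mx

def fit_perpendicular(xs, ys):
    """Orthogonal (total least squares) fit: minimizes perpendicular
    offsets, allowing scatter in both x and y. Uses the closed-form
    slope b = (Syy - Sxx + sqrt((Syy - Sxx)**2 + 4*Sxy**2)) / (2*Sxy);
    assumes Sxy != 0 (the non-degenerate case)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return b, my - b * mx
```

On noise-free data the two fits agree; they diverge when the scatter is not purely vertical, which is the situation the abstract describes for fatigue S-N data.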
Technical Paper

A Fatigue S-N Curve Transformation Technique and Its Applications in Fatigue Data Analysis

2018-04-03
2018-01-0791
The approaches for obtaining both the fatigue strength distribution and the fatigue life distribution from a given set of fatigue S-N data are reviewed in this paper. A new fatigue S-N curve transformation technique, based on fundamental statistical definitions and some reasonable assumptions, is developed specifically to transform a fatigue life distribution into a fatigue strength distribution. The procedures for applying the technique to multiple-stress-level, two-stress-level, and one-stress-level fatigue S-N data are presented.
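One common way to relate the two distributions, sketched below as an assumption (the paper's own transformation technique is not reproduced), is to project log-life scatter onto log-strength through a Basquin-type S-N slope: if log10(N) = logA - m*log10(S), a quantile shift of z*sigma in log-life corresponds to a shift of z*sigma/m in log-strength. All parameter values here are illustrative placeholders.

```python
import math
from statistics import NormalDist

def strength_quantile(logA, m, sigma_logN, N_target, p):
    """Fatigue strength at life N_target and probability p, obtained by
    projecting a lognormal life scatter (std sigma_logN in log10 cycles)
    through a linear log-log S-N curve log10(N) = logA - m*log10(S).
    logA, m, and sigma_logN are hypothetical illustration values."""
    z = NormalDist().inv_cdf(p)
    log_s = (logA - math.log10(N_target)) / m + z * sigma_logN / m
    return 10.0 ** log_s
```

The division by the slope `m` is the key step: a steep S-N curve (large `m`) compresses wide life scatter into narrow strength scatter, which is why strength-based and life-based characterizations of the same data can look very different.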
Technical Paper

The Uncertainty of Estimated Lognormal and Weibull Parameters for Test Data with Small Sample Size

2013-04-08
2013-01-0945
In this paper, the uncertainty of the estimated parameters of lognormal and Weibull distributions for test data with small sample sizes is investigated. The confidence intervals of the estimated parameters are determined by solving available analytical equations, and the scatter of the estimated parameters about the true values is quantified using Monte Carlo simulation. Important quantities such as the mean, the standard deviation, and the design curve are considered. The emphasis is on the interpretation and implications of the estimated shape parameter β of the Weibull distribution function and of the design curve obtained from a lognormal distribution function. Finally, the possible impact of this study on current engineering practice is discussed.
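The Monte Carlo approach mentioned above can be sketched for the lognormal case: draw many small samples from a known population, estimate the parameters from each sample, and examine the scatter of the estimates. The population values and sample sizes below are illustrative only.

```python
import random
import statistics

def mc_lognormal_param_scatter(mu, sigma, n, trials, seed=0):
    """Monte Carlo scatter of estimated lognormal parameters.
    Draws `trials` samples of size n of log-life values from
    Normal(mu, sigma) and returns the lists of estimated means and
    standard deviations, whose spread shows the small-sample uncertainty."""
    rng = random.Random(seed)
    mu_hats, sig_hats = [], []
    for _ in range(trials):
        logs = [rng.gauss(mu, sigma) for _ in range(n)]
        mu_hats.append(statistics.fmean(logs))
        sig_hats.append(statistics.stdev(logs))
    return mu_hats, sig_hats

# Small samples (n = 5) give much wider parameter scatter than n = 50.
mu5, _ = mc_lognormal_param_scatter(5.0, 0.5, 5, 500)
mu50, _ = mc_lognormal_param_scatter(5.0, 0.5, 50, 500)
```

The scatter of the estimated mean shrinks roughly as 1/sqrt(n), so the ten-fold sample-size increase narrows it by about a factor of three; the same experiment for Weibull β would show even stronger small-sample bias.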
Technical Paper

Uncertainty Characterization and Quantification in Product Validation and Reliability Demonstration

2016-04-05
2016-01-0270
Product validation and reliability demonstration require testing of limited samples and probabilistic analyses of the test data. The uncertainties introduced by tests with limited sample sizes and by the assumptions made about the underlying probabilistic distribution significantly impact the results and their interpretation. Therefore, understanding the nature of these uncertainties is critical to test method development, uncertainty reduction, data interpretation, and the effectiveness of validation and reliability demonstration procedures. In this paper, these uncertainties are investigated with a focus on the following two aspects: (1) the fundamentals of the RxxCyy criterion used in both the life testing and the binomial testing methods, and (2) the issues and benefits of using the two-parameter Weibull probabilistic distribution function.
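The RxxCyy criterion named in this abstract has a standard binomial (success-run) form for the zero-failure case: demonstrating reliability R with confidence C when all n samples pass requires n = ln(1 - C) / ln(R). The sketch below shows only that textbook result, not the paper's analysis of its uncertainties.

```python
import math

def rc_sample_size(R, C):
    """Minimum zero-failure (success-run) sample size for the RxxCyy
    criterion: from the binomial model, R**n <= 1 - C, so
    n = ln(1 - C) / ln(R), rounded up to a whole sample."""
    return math.ceil(math.log(1.0 - C) / math.log(R))

# Classic values: R90C90 needs 22 passing samples, R95C90 needs 45.
n_r90c90 = rc_sample_size(0.90, 0.90)
n_r95c90 = rc_sample_size(0.95, 0.90)
```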
Technical Paper

Quality Control and Improvement Based on Design of Experiments and Statistical Data Analysis

2014-04-01
2014-01-0774
A modern definition of quality control and improvement is the reduction of variability in processes and products. The reduced variability can be directly translated into lower costs, better function, and fewer repairs. However, the final quality of processes and products is sometimes derived from other measured variables through implicit or explicit functional relationships, and a tiny uncertainty in one variable can produce a huge uncertainty in a derived quantity. Therefore, the propagation of uncertainty needs to be understood and the individual variables need to be well controlled. More importantly, the critical factors that affect quality the most should be identified and thoroughly investigated. Design of experiments and statistical control play central roles in finding the root causes of failure, reducing variability, and improving quality.
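The uncertainty-propagation point above can be sketched with the standard first-order (delta-method) formula, sigma_q^2 = sum_i (df/dx_i)^2 * sigma_i^2, evaluated with numerical derivatives. The example function and input uncertainties are illustrative, not from the paper.

```python
import math

def propagate(f, values, sigmas, h=1e-6):
    """First-order uncertainty propagation for q = f(values):
    sigma_q = sqrt(sum((df/dx_i * sigma_i)**2)), with central-difference
    numerical partial derivatives (step h) for each input."""
    grads = []
    for i in range(len(values)):
        up = list(values); up[i] += h
        dn = list(values); dn[i] -= h
        grads.append((f(up) - f(dn)) / (2.0 * h))
    return math.sqrt(sum((g * s) ** 2 for g, s in zip(grads, sigmas)))

# Example: q = x / y is very sensitive to y when y is small; here
# dq/dy = -x/y**2 = -100, so y's small uncertainty dominates sigma_q.
q_sigma = propagate(lambda v: v[0] / v[1], [1.0, 0.1], [0.01, 0.01])
```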
Technical Paper

Comparison of Verity and Volvo Methods for Fatigue Life Assessment of Welded Structures

2013-09-24
2013-01-2357
Great efforts have been made to develop the ability to accurately and quickly predict the durability and reliability of vehicles in the early development stage, especially for welded joints, which are usually the weakest locations in a vehicle system. A reliable and validated life assessment method is needed to accurately predict how and where a welded part fails, since iterative testing is expensive and time consuming. Recently, structural stress methods based on nodal forces/moments have become widely accepted in the fatigue life assessment of welded structures. Several variants of the structural stress approach are available, and two of the most popular methods used in the automotive industry are the Volvo method and the Verity method. Both methods are available in commercial software, and some concepts and procedures related to the nodal force/moment have already been included in several engineering codes.
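The common core of the nodal-force-based structural stress methods mentioned above is a membrane-plus-bending decomposition through the plate thickness; a generic sketch of that step is shown below. This is an illustrative form only: the Verity and Volvo methods differ in how the line force and line moment are extracted from the finite element solution, and those procedures are not reproduced here.

```python
def structural_stress(line_force, line_moment, t):
    """Generic structural stress at a weld toe from a line force f (N/mm)
    and line moment m (N*mm/mm) acting on a plate of thickness t (mm):
    sigma = f/t + 6*m/t**2 (membrane term plus linear bending term).
    Illustrative decomposition only; method-specific extraction of f and m
    from nodal forces/moments is what distinguishes Verity from Volvo."""
    membrane = line_force / t
    bending = 6.0 * line_moment / t ** 2
    return membrane + bending

# Example: f = 100 N/mm and m = 50 N*mm/mm on a 5 mm plate gives
# 20 MPa membrane plus 12 MPa bending.
sigma = structural_stress(100.0, 50.0, 5.0)
```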