Search Results

Journal Article

A Re-Analysis Methodology for System RBDO Using a Trust Region Approach with Local Metamodels

2010-04-12
2010-01-0645
A simulation-based, system reliability-based design optimization (RBDO) method is presented that can handle problems with multiple failure regions and correlated random variables. Copulas are used to represent the correlation. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with a trust-region optimization approach and local metamodels covering each trust region. PRRA calculates the system reliability of a design very efficiently by performing a single Monte Carlo (MC) simulation per trust region. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. The PRRA method is based on importance sampling. It provides accurate results if the support of the sampling PDF contains the support of the joint PDF of the input random variables. The sequential, trust-region optimization approach satisfies this requirement.
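
The abstract names importance sampling as the basis of PRRA but does not reproduce the estimator. The sketch below is a minimal, generic importance-sampling estimate of a system failure probability with correlated inputs; the limit-state functions, distributions, and sampling density are hypothetical and are not taken from the paper.

```python
# Minimal importance-sampling sketch of a system failure-probability estimate.
# The limit states, distributions, and numbers are illustrative only.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Joint PDF f of the input random variables (correlated bivariate normal).
f = multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]])

# Sampling PDF h: shifted and widened so its support covers the failure region.
h = multivariate_normal([1.5, 1.5], 2.0 * np.eye(2))

def system_failure(x):
    """System fails if ANY limit state is violated (series system, illustrative)."""
    g1 = 3.0 - x[:, 0] - x[:, 1]          # hypothetical limit state 1
    g2 = 4.0 - x[:, 0] * x[:, 1]          # hypothetical limit state 2
    return (g1 < 0.0) | (g2 < 0.0)

n = 100_000
x = h.rvs(size=n, random_state=rng)
weights = f.pdf(x) / h.pdf(x)             # importance-sampling weights f(x)/h(x)
pf = np.mean(system_failure(x) * weights) # unbiased estimate of P(system failure)
print(f"estimated system failure probability: {pf:.3e}")
```
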
Journal Article

A Comparative Benchmark Study of using Different Multi-Objective Optimization Algorithms for Restraint System Design

2014-04-01
2014-01-0564
Vehicle restraint system design is a difficult optimization problem to solve because (1) the nature of the problem is highly nonlinear, non-convex, noisy, and discontinuous; (2) there are large numbers of discrete and continuous design variables; and (3) a design has to meet safety performance requirements for multiple crash modes simultaneously, hence there are a large number of design constraints. Based on the above knowledge of the problem, it is understandable why design of experiments (DOE) does not produce a high percentage of feasible solutions, and why it is difficult for response surface methods (RSM) to capture the true landscape of the problem. Furthermore, in order to keep the restraint system more robust, the complexity of restraint system content needs to be minimized in addition to minimizing the relative risk score to achieve a New Car Assessment Program (NCAP) 5-star rating.
Journal Article

Uncertainty Assessment in Restraint System Optimization for Occupants of Tactical Vehicles

2016-04-05
2016-01-0316
We have recently obtained experimental data and used them to develop computational models to quantify occupant impact responses and injury risks for military vehicles during frontal crashes. The number of experimental tests and model runs is, however, relatively small due to their high cost. While this is true across the auto industry, it is particularly critical for the Army and other government agencies operating under tight budget constraints. In this study we investigate through statistical simulations how the injury risk would vary if a large number of experimental tests were conducted. We show that the injury risk distribution is skewed to the right, implying that, although most physical tests result in a small injury risk, there are occasional physical tests for which the injury risk is extremely large. We compute the probabilities of such events and use them to identify optimum design conditions that minimize such probabilities.
Journal Article

A Variable-Size Local Domain Approach to Computer Model Validation in Design Optimization

2011-04-12
2011-01-0243
A common approach to the validation of simulation models focuses on validation throughout the entire design space. A more recent methodology validates designs as they are generated during a simulation-based optimization process. The latter method relies on validating the simulation model in a sequence of local domains. To improve its computational efficiency, this paper proposes an iterative process, where the size and shape of local domains at the current step are determined from a parametric bootstrap methodology involving maximum likelihood estimators of unknown model parameters from the previous step. Validation is carried out in the local domain at each step. The iterative process continues until the local domain does not change from iteration to iteration during the optimization process, ensuring that a converged design optimum has been obtained.
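
As context for the parametric bootstrap step, the sketch below fits a normal model by maximum likelihood to hypothetical test data, resamples from the fitted model, and uses the spread of the re-estimated mean to set a local-domain half-width. The data, the normal assumption, and the sizing rule are illustrative placeholders, not the paper's procedure.

```python
# Parametric-bootstrap sketch: fit by MLE, resample from the fitted model, and
# size a local domain from the spread of the re-estimated parameter.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
test_data = rng.normal(loc=10.0, scale=2.0, size=20)   # hypothetical test data

# MLE of the unknown model parameters under a normal assumption.
mu_hat, sigma_hat = norm.fit(test_data)

# Parametric bootstrap: resample from the fitted model and re-estimate the mean.
boot_mu = []
for _ in range(2000):
    resample = norm.rvs(mu_hat, sigma_hat, size=test_data.size, random_state=rng)
    boot_mu.append(norm.fit(resample)[0])
boot_mu = np.asarray(boot_mu)

# Hypothetical sizing rule: half of the 95% bootstrap percentile interval.
lo, hi = np.percentile(boot_mu, [2.5, 97.5])
half_width = 0.5 * (hi - lo)
print(f"MLE mean = {mu_hat:.2f}, local-domain half-width ~ {half_width:.2f}")
```
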
Journal Article

Time-Dependent Reliability of Random Dynamic Systems Using Time-Series Modeling and Importance Sampling

2011-04-12
2011-01-0728
Reliability is an important engineering requirement for consistently delivering acceptable product performance through time. As time progresses, the product may fail due to time-dependent operating conditions and material properties, component degradation, etc. The reliability degradation with time may increase the lifecycle cost due to potential warranty costs, repairs and loss of market share. Reliability is the probability that the system will perform its intended function successfully for a specified time interval. In this work, we consider the first-passage reliability, which accounts for the first-time failure of non-repairable systems. Methods available in the literature provide an upper bound on the true reliability, which may overestimate the true value considerably. Monte Carlo simulations are accurate but computationally expensive.
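
For context, the sketch below estimates a first-passage probability by direct Monte Carlo on a simple autoregressive stand-in for the dynamic response; it does not reproduce the time-series modeling or importance-sampling scheme of the paper, and all parameters are hypothetical.

```python
# Direct Monte Carlo sketch of first-passage (time-dependent) reliability:
# simulate response trajectories and record whether each exceeds a threshold
# at any time in the interval.  The AR(1) model and numbers are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n_traj, n_steps, dt = 5000, 500, 0.01
threshold = 3.0
phi, sigma = 0.98, 0.3          # AR(1) coefficient and noise level (illustrative)

failed = np.zeros(n_traj, dtype=bool)
y = np.zeros(n_traj)
for _ in range(n_steps):
    y = phi * y + sigma * rng.standard_normal(n_traj)
    failed |= (y > threshold)   # first passage above the threshold at any step

pf_t = failed.mean()            # cumulative probability of first passage in [0, T]
print(f"P(first passage within T = {n_steps * dt:.1f} s) ~ {pf_t:.4f}")
```
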
Journal Article

A Simulation and Optimization Methodology for Reliability of Vehicle Fleets

2011-04-12
2011-01-0725
Understanding reliability is critical in design, maintenance and durability analysis of engineering systems. A reliability simulation methodology is presented in this paper for vehicle fleets using limited data. The method can be used to estimate the reliability of non-repairable as well as repairable systems. It can optimally allocate, based on a target system reliability, individual component reliabilities using a multi-objective optimization algorithm. The algorithm establishes a Pareto front that can be used for optimal tradeoff between reliability and the associated cost. The method uses Monte Carlo simulation to estimate the system failure rate and reliability as a function of time. The probability density functions (PDF) of the time between failures for all components of the system are estimated using either limited data or a user-supplied MTBF (mean time between failures) and its coefficient of variation.
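
A minimal version of the Monte Carlo step is sketched below for a non-repairable series system with exponential times between failures derived from user-supplied MTBF values; the component MTBFs and mission time are hypothetical, and the repairable-system and Pareto-tradeoff parts of the method are not shown.

```python
# Monte Carlo sketch of fleet reliability from component MTBFs, assuming
# exponential times between failures and a series system (illustrative only).
import numpy as np

rng = np.random.default_rng(3)
mtbf = np.array([1200.0, 800.0, 1500.0])   # hypothetical component MTBFs (hours)
t_mission = 500.0                          # mission time of interest (hours)
n_sims = 100_000

# First failure time of each component, then of the (series) system.
component_ttf = rng.exponential(mtbf, size=(n_sims, mtbf.size))
system_ttf = component_ttf.min(axis=1)

reliability = np.mean(system_ttf > t_mission)
print(f"R({t_mission:.0f} h) ~ {reliability:.3f}")
```
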
Journal Article

System Topology Identification with Limited Test Data

2012-04-16
2012-01-0064
In this article we present an approach that uses simulation to identify the system topology for reliability calculations. The system topology describes how all components in a system are functionally connected. Most reliability engineering literature assumes either that the system topology is known, so that all failure modes can be deduced, or that, when the topology is not known, we are only interested in identifying the dominant failure modes. The authors contend that we should try to extract as much information about the system topology from failure or success information of a system as possible. This will not only identify the dominant failure modes but will also provide an understanding of how the components are functionally connected, allowing for more complicated analyses, if needed. We use an evolutionary approach where system topologies are generated at random and then tested against failure or success data. The topologies evolve based on how consistent they are with the test data.
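
The sketch below illustrates the general idea of evolving candidate topologies against test data: a topology is encoded as a set of path sets, scored by how many observed (component states, system outcome) records it reproduces, and mutated. The encoding, mutation rule, and synthetic records are illustrative assumptions, not the paper's algorithm.

```python
# Evolutionary sketch of topology identification from success/failure records.
import numpy as np

rng = np.random.default_rng(4)
n_comp = 4

def system_works(path_sets, state):
    """System works if all components of at least one path set work."""
    return any(all(state[i] for i in ps) for ps in path_sets)

# Hypothetical "true" topology, used only to generate synthetic test records.
true_topology = [{0, 1}, {2, 3}]
records = []
for _ in range(60):
    state = rng.integers(0, 2, n_comp).astype(bool)
    records.append((state, system_works(true_topology, state)))

def score(path_sets):
    """Number of records the candidate topology reproduces."""
    return sum(system_works(path_sets, s) == out for s, out in records)

def mutate(path_sets):
    """Toggle one component in one path set; drop any path set left empty."""
    new = [set(ps) for ps in path_sets]
    new[rng.integers(len(new))].symmetric_difference_update({int(rng.integers(n_comp))})
    return [ps for ps in new if ps]

# Keep the candidate most consistent with the test data.
best = [set(rng.choice(n_comp, 2, replace=False))]
best_score = score(best)
for _ in range(2000):
    cand = mutate(best) or [set(rng.choice(n_comp, 2, replace=False))]
    if score(cand) >= best_score:
        best, best_score = cand, score(cand)

print(f"best candidate path sets: {[sorted(int(i) for i in ps) for ps in best]}, "
      f"consistent with {best_score}/{len(records)} records")
```
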
Journal Article

A Nonparametric Bootstrap Approach to Variable-size Local-domain Design Optimization and Computer Model Validation

2012-04-16
2012-01-0226
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models in the entire design space can be costly, a recent approach performs design optimization and model validation concurrently, using a sequential process with both fixed and variable-size local domains. The variable-size approach used parametric distributions such as Gaussian to quantify the variability in test data and model predictions, and a maximum likelihood estimation to calibrate the prediction model. Also, a parametric bootstrap method was used to size each local domain. In this article, we generalize the variable-size approach by not assuming any distribution such as Gaussian. A nonparametric bootstrap methodology is instead used to size the local domains. We expect its generality to be useful in applications where distributional assumptions are difficult to verify, or not met at all.
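
A minimal nonparametric-bootstrap sketch is shown below: test data are resampled with replacement, with no distributional assumption, and the percentile spread of the statistic is used to size the local domain. The data and the sizing rule are hypothetical.

```python
# Nonparametric-bootstrap sketch: resample with replacement, no distribution
# assumed, and size the local domain from the percentile spread of the mean.
import numpy as np

rng = np.random.default_rng(5)
test_data = rng.gamma(shape=2.0, scale=3.0, size=25)   # hypothetical, skewed data

boot_means = np.array([
    rng.choice(test_data, size=test_data.size, replace=True).mean()
    for _ in range(5000)
])

lo, hi = np.percentile(boot_means, [2.5, 97.5])
half_width = 0.5 * (hi - lo)
print(f"nonparametric 95% interval of the mean: [{lo:.2f}, {hi:.2f}], "
      f"local-domain half-width ~ {half_width:.2f}")
```
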
Technical Paper

Modeling Dependence and Assessing the Effect of Uncertainty in Dependence in Probabilistic Analysis and Decision Under Uncertainty

2010-04-12
2010-01-0697
A complete probabilistic model of uncertainty in probabilistic analysis and design problems is the joint probability distribution of the random variables. Often, it is impractical to estimate this joint probability distribution because the mechanism of the dependence of the variables is not completely understood. This paper proposes modeling dependence by using copulas and demonstrates their representational power. It also compares this representation with a Monte-Carlo simulation using dispersive sampling.
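
As an illustration of the representational power of copulas, the sketch below couples two non-normal marginals through a Gaussian copula; the marginals, correlation value, and rank-correlation check are illustrative and do not reproduce the paper's dispersive-sampling comparison.

```python
# Gaussian-copula sketch: impose a dependence structure on arbitrary marginals.
import numpy as np
from scipy.stats import norm, lognorm, weibull_min, spearmanr

rng = np.random.default_rng(6)
rho, n = 0.7, 10_000

# Step 1: correlated standard normal scores.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Step 2: map to uniforms (the copula), then through the desired marginal inverses.
u = norm.cdf(z)
x1 = lognorm.ppf(u[:, 0], s=0.4, scale=10.0)       # hypothetical marginal 1
x2 = weibull_min.ppf(u[:, 1], c=2.0, scale=5.0)    # hypothetical marginal 2

rank_corr, _ = spearmanr(x1, x2)
print(f"Spearman rank correlation of the generated sample: {rank_corr:.2f}")
```
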
Technical Paper

Model-Based Embedded Controls Test and Verification

2010-04-12
2010-01-0487
Embedded systems continue to become more complex. As a result, more companies are utilizing model-based design (MBD) development methods and tools. The use of MBD methods and tools is helpful in reducing time to market and providing instant feedback on the system design. One area that continues to mature is the testing and verification of MBD systems. This paper introduces a hybrid approach to functional tests. The test system is composed of simulation software and real-time hardware. It is not always necessary to test a system in a real-time environment, but it is recommended if the goal is to deploy the system to a situation that requires real-time response. Vehicle drive cycles and powertrain control are utilized in this research as the example test case for this paper. In order to test the algorithms on a real-time system, it is necessary to understand the target controller's computing limitations and adjust the algorithms to meet these limitations.
Technical Paper

Modeling the Stiffness and Damping Properties of Styrene-Butadiene Rubber

2011-05-17
2011-01-1628
Styrene-Butadiene Rubber (SBR), a copolymer of butadiene and styrene, is widely used in the automotive industry due to its high durability and resistance to abrasion, oils and oxidation. Some of the common applications include tires, vibration isolators, and gaskets, among others. This paper characterizes the dynamic behavior of SBR and discusses the suitability of a visco-elastic model of elastomers, known as the Kelvin model, from a mathematical and physical point of view. An optimization algorithm is used to estimate the parameters of the Kelvin model. The resulting model was shown to produce reasonable approximations of measured dynamic stiffness. The model was also used to calculate the self-heating of the elastomer due to energy dissipation by the viscous damping components in the model. Developing such a predictive capability is essential in understanding the dynamic behavior of elastomers considering that their dynamic stiffness can in general depend on temperature.
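
For reference, the Kelvin (Kelvin-Voigt) model places a spring k in parallel with a damper c, giving a complex dynamic stiffness K*(w) = k + iwc and an energy dissipation per cycle of pi*c*w*X^2 at displacement amplitude X, which drives the self-heating. The parameter values below are hypothetical, not the fitted SBR values from the paper.

```python
# Kelvin (Kelvin-Voigt) model sketch: dynamic stiffness, loss angle, and
# dissipated energy per cycle.  Parameter values are illustrative only.
import numpy as np

k = 2.0e5        # spring stiffness, N/m (hypothetical)
c = 1.5e2        # damping coefficient, N*s/m (hypothetical)
X = 1.0e-3       # displacement amplitude, m
freq_hz = np.array([1.0, 10.0, 50.0, 100.0])

w = 2.0 * np.pi * freq_hz
k_dyn = np.sqrt(k**2 + (w * c) ** 2)       # dynamic stiffness magnitude |k + iwc|
loss_angle = np.degrees(np.arctan2(w * c, k))
energy_per_cycle = np.pi * c * w * X**2    # dissipated energy -> self-heating source

for f, kd, phi, e in zip(freq_hz, k_dyn, loss_angle, energy_per_cycle):
    print(f"{f:6.1f} Hz: |K*| = {kd:.3e} N/m, loss angle = {phi:5.2f} deg, "
          f"dissipation = {e:.3e} J/cycle")
```
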
Technical Paper

Managing the Computational Cost in a Monte Carlo Simulation by Considering the Value of Information

2012-04-16
2012-01-0915
Monte Carlo simulation is a popular tool for reliability assessment because of its robustness and ease of implementation. A major concern with this method is its computational cost; standard Monte Carlo simulation requires quadrupling the number of replications to halve the standard deviation of the estimated failure probability. Efforts to increase efficiency focus on intelligent sampling procedures and methods for efficient calculation of the performance function of a system. This paper proposes a new method to manage cost that views design as a decision among alternatives with uncertain reliabilities. Information from a simulation has value only if it enables the designer to make a better choice among the alternative options. Consequently, the value of information from the simulation is equal to the gain from using this information to improve the decision. Using the proposed method, a designer can determine the number of replications that are worth performing.
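
The cost statement in the abstract follows from the standard error of a Monte Carlo probability estimate, sqrt(p(1-p)/N): halving the standard deviation requires roughly four times as many replications. The sketch below only illustrates that relationship for a hypothetical failure probability and target coefficient of variation; it does not implement the value-of-information method.

```python
# Replication-count sketch: standard error of a Monte Carlo failure-probability
# estimate is sqrt(p*(1-p)/N), so halving it quadruples the required N.
p = 1.0e-3            # anticipated failure probability (hypothetical)
target_cov = 0.10     # desired coefficient of variation of the estimate

# N such that sqrt(p*(1-p)/N) / p = target_cov
n_required = (1.0 - p) / (p * target_cov**2)
print(f"replications for CoV = {target_cov:.0%}: N ~ {n_required:,.0f}")

# Halving the coefficient of variation quadruples the required replications.
n_half = (1.0 - p) / (p * (target_cov / 2) ** 2)
print(f"replications for CoV = {target_cov / 2:.0%}: N ~ {n_half:,.0f} "
      f"({n_half / n_required:.0f}x)")
```
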
Technical Paper

System Failure Identification using Linear Algebra: Application to Cost-Reliability Tradeoffs under Uncertain Preferences

2012-04-16
2012-01-0914
Reaching a system-level reliability target is an inverse problem. Component-level reliabilities are determined for a required system-level reliability. Because this inverse problem does not have a unique solution, one approach is to trade off system reliability with cost and to allow the designer to select a design with a target system reliability using his/her preferences. In this case, the component reliabilities are readily available from the calculation of the reliability-cost tradeoff. To arrive at the set of solutions to be traded off, one encounters two problems. First, the system reliability calculation is based on repeated system simulations in which each system state, indicating which components work and which have failed, is tested to determine whether it causes system failure. Second, the task of eliciting and encoding the decision maker's preferences is extremely difficult because of the uncertainty involved in modeling them.
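
The repeated system-state evaluation mentioned in the first problem can be illustrated with a small structure-function enumeration, shown below for a hypothetical three-component series-parallel system; the linear-algebra formulation and the preference-encoding part of the paper are not reproduced.

```python
# Structure-function sketch: enumerate every component work/fail state, test it
# for system failure, and combine into a system reliability (illustrative).
import itertools
import numpy as np

def system_works(state):
    """Hypothetical structure: component 0 in series with (1 parallel 2)."""
    return state[0] and (state[1] or state[2])

r = np.array([0.95, 0.90, 0.85])           # component reliabilities (hypothetical)

system_reliability = 0.0
for state in itertools.product([True, False], repeat=r.size):
    prob = np.prod(np.where(state, r, 1.0 - r))   # probability of this state
    if system_works(state):
        system_reliability += prob

print(f"system reliability = {system_reliability:.4f}")
```
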
Technical Paper

Buckling of Structures Subject to Multiple Forces

2013-04-08
2013-01-1370
Frames are important structures found in many transportation applications such as automotive bodies and train cars. They are also widely employed in buildings, bridges, and other load-bearing designs. When a frame is carrying multiple loads, it can potentially risk a catastrophic buckling failure. The loads on the frame may be non-proportional in that one force stays constant while the other is increased until buckling occurs. In this study the buckling problem is formulated as a constrained eigenvalue problem (CEVP). As opposed to other CEVPs, in which the eigenvectors are forced to comply with a number of constraints, the eigenvalues in the current CEVP are subject to equality constraints. A numerical algorithm for solving the constrained eigenvalue problem is presented. The algorithm is a simple trapping scheme in which the computation starts with an initial guess and a window containing the potential target for the eigenvalue is identified.
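
For context only, the sketch below solves the standard, unconstrained linear buckling eigenvalue problem (K + lambda*Kg)v = 0 for a toy two-degree-of-freedom system; it does not implement the constrained eigenvalue formulation or the trapping scheme described in the paper, and the matrices are hypothetical.

```python
# Standard linear buckling as a generalized eigenvalue problem (illustrative).
import numpy as np
from scipy.linalg import eig

K = np.array([[12.0, -6.0],
              [-6.0,  8.0]])      # elastic stiffness (hypothetical)
Kg = np.array([[-1.0,  0.2],
               [ 0.2, -0.8]])     # geometric stiffness under the applied loads

# Buckling load factors: (K + lambda * Kg) v = 0  ->  K v = -lambda * Kg v
vals, vecs = eig(K, -Kg)
load_factors = np.sort(np.real(vals[np.isreal(vals)]))
print(f"critical (lowest positive) load factor ~ {load_factors[load_factors > 0][0]:.3f}")
```
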
Technical Paper

Comparative Benchmark Studies of Response Surface Model-Based Optimization and Direct Multidisciplinary Design Optimization

2014-04-01
2014-01-0400
Response Surface Model (RSM)-based optimization is widely used in engineering design. The major strength of RSM-based optimization is its short computational time. The expensive real simulation models are replaced with fast surrogate models. However, this method may have difficulty reaching its full potential due to errors between the RSM and the real simulations. The RSM's accuracy is limited by the insufficient number of Design of Experiments (DOE) points and the inherent randomness of DOE. With recent developments in advanced optimization algorithms and High Performance Computing (HPC) capability, Direct Multidisciplinary Design Optimization (DMDO) receives more attention as a promising future optimization strategy. Advanced optimization algorithms reduce the number of function evaluations, and HPC cuts down the computational turnaround time of function evaluations by fully utilizing parallel computation.
Technical Paper

Experimental and Computer Simulation Analysis of Transients on an Automobile Communication Bus

1995-02-01
950038
Voltage and current surges are a major concern when it comes to ensuring the functional integrity of electrical and electronic components and modules in an automobile system. This paper presents a computer simulation study for analyzing the effect of high voltage spikes and current load dump on a new Integrated Driver/Receiver (IDR) IC, currently being developed for a J1850 Data Communication Bus in an automobile. It describes the modeling and simulation of the protection structure proposed for the device. The simulation study yields a prediction of current and voltage capability of the protection circuit based on thermal breakdown and transient responses of the circuit. Two levels of modeling, namely, the behavioral level model and the component level model, are used to generate the simulation results. Experimental data will be acquired and used to validate the simulation model when the actual device becomes available.
Journal Article

Time-Dependent Reliability Estimation for Dynamic Systems Using a Random Process Approach

2010-04-12
2010-01-0644
Reliability is an important engineering requirement for consistently delivering acceptable product performance through time. As time progresses, the product may fail due to time-dependent operating conditions and material properties, component degradation, etc. The reliability degradation with time may increase the lifecycle cost due to potential warranty costs, repairs and loss of market share. Reliability is the probability that the system will perform its intended function successfully for a specified time interval. In this work, we consider the first-passage reliability, which accounts for the first-time failure of non-repairable systems. Methods available in the literature provide an upper bound on the true reliability, which may overestimate the true value considerably. This paper proposes a methodology to calculate the cumulative probability of failure (probability of first passage or upcrossing) of a dynamic system driven by an ergodic input random process.
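
One common way to approximate the cumulative probability of first passage for an ergodic response is to estimate the mean upcrossing rate from a long simulated record and apply the Poisson (rare-crossing) approximation P_f(T) ~ 1 - exp(-nu_plus * T). The sketch below uses that approximation with a hypothetical autoregressive process; it is not necessarily the paper's formulation.

```python
# Upcrossing-rate sketch: estimate nu+ from a long stationary record, then use
# the Poisson approximation for the cumulative probability of failure.
import numpy as np

rng = np.random.default_rng(7)
dt, t_record, t_mission = 0.01, 500.0, 20.0
n_steps = int(t_record / dt)
threshold = 2.5
phi, sigma = 0.99, 0.2                    # AR(1) stand-in for the response process

y = np.zeros(n_steps)
for i in range(1, n_steps):
    y[i] = phi * y[i - 1] + sigma * rng.standard_normal()

upcrossings = np.count_nonzero((y[:-1] < threshold) & (y[1:] >= threshold))
nu_plus = upcrossings / t_record          # mean upcrossing rate, 1/s
pf = 1.0 - np.exp(-nu_plus * t_mission)   # cumulative probability of failure
print(f"nu+ ~ {nu_plus:.4f} 1/s, P_f({t_mission:.0f} s) ~ {pf:.3f}")
```
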
Technical Paper

Decision-Based Universal Design - Using Copulas to Model Disability

2015-04-14
2015-01-0418
This paper develops a design paradigm for universal products. Universal design is a term used for designing products and systems that are equally accessible to and usable by people with and without disabilities. Two common challenges for research in this area are that (1) there is a continuum of disabilities, making it hard to optimize product features, and (2) there is no effective benchmark for evaluating such products. Exacerbating these issues, data regarding customer disabilities and their preferences are hard to come by. We propose a copula-based approach for modeling market coverage of a portfolio of universal products. The multiattribute preference of customers to purchase a product is modeled using Frank's Archimedean copula. The inputs from various disparate sources can be collected and incorporated into a decision system.
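
The sketch below samples dependent uniform scores from Frank's Archimedean copula by conditional inversion, the kind of dependence structure the abstract proposes for multiattribute purchase preferences; the dependence parameter and the Kendall's-tau check are illustrative.

```python
# Frank-copula sampling sketch via conditional inversion (illustrative values).
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(8)
theta, n = 5.0, 10_000

u = rng.uniform(size=n)
w = rng.uniform(size=n)
# Invert the conditional copula C(v | u) = w for v (Frank copula).
v = -np.log1p(w * (np.exp(-theta) - 1.0) /
              (np.exp(-theta * u) * (1.0 - w) + w)) / theta

tau, _ = kendalltau(u, v)
print(f"sample Kendall's tau for theta = {theta}: {tau:.2f}")
```
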
Technical Paper

Energy Efficient Routing for Electric Vehicles using Particle Swarm Optimization

2014-04-01
2014-01-1815
Growing concerns about the environment, energy dependency, and unstable fuel prices have increased the market share of electric vehicles. This has led to an increased demand for energy efficient routing algorithms that are optimized for electric vehicles. Traditional routing algorithms are focused on finding the shortest distance or the least time route between two points. These approaches have been working well for fossil fueled vehicles. Electric vehicles, on the other hand, require different route optimization techniques. Negative edge costs, battery power and capacity limits, as well as vehicle parameters that are only available at query time, make the task of electric vehicle routing a challenging problem. In this paper, we present a simulated solution to the energy efficient routing for electric vehicles using Particle Swarm Optimization. Simulation results show improvements in the energy consumption of the electric vehicle when applied to a start-to-destination routing problem.
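
A minimal particle swarm optimization loop is sketched below. Real electric-vehicle routing operates on a road graph with battery constraints; here the PSO velocity and position updates are shown on a continuous, hypothetical energy-cost function of two route parameters, purely to illustrate the algorithm.

```python
# Minimal particle swarm optimization (PSO) sketch on a hypothetical
# two-parameter energy-cost surrogate (not an actual routing graph).
import numpy as np

rng = np.random.default_rng(9)

def energy_cost(x):
    """Hypothetical energy consumption surrogate (kWh); minimum near (0.3, 0.6)."""
    return 5.0 + 12.0 * (x[:, 0] - 0.3) ** 2 + 8.0 * (x[:, 1] - 0.6) ** 2

n_particles, n_iter, dim = 30, 200, 2
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights

pos = rng.uniform(0.0, 1.0, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), energy_cost(pos)
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=(2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)         # keep parameters in bounds
    val = energy_cost(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(f"best route parameters ~ {gbest}, estimated energy ~ {pbest_val.min():.2f} kWh")
```
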
Technical Paper

Assessment of Different Joining Techniques for Dissimilar Materials

2014-04-01
2014-01-0790
In this paper, experimental study and FEA simulation are performed to investigate the effect of three different methods for joining dissimilar metal coupons in terms of their strength and load-transferring capacity. The joining techniques considered include adhesive bonding, bolting, and hybrid bolting-and-bonding. An elastic-plastic material model with damage consideration is used for each of the joint components. A traction-separation rule and failure criterion are defined for the adhesive. Load transfer capacity and the failure mode are assessed for each type of joining. Joint strength is examined in terms of the effects of adhesive property, bolt preload level, and friction coefficient. Results show that the load transferred and the failure mechanism vary significantly between samples with different joint methods; the preload evolution in the bolt changes with the friction coefficient; and the hybrid joint generally has an advantage over the other two methods, namely, bolting-only and bonding-only.