Search Results

Journal Article

A Re-Analysis Methodology for System RBDO Using a Trust Region Approach with Local Metamodels

2010-04-12
2010-01-0645
A simulation-based, system reliability-based design optimization (RBDO) method is presented that can handle problems with multiple failure regions and correlated random variables. Copulas are used to represent the correlation. The method uses a Probabilistic Re-Analysis (PRRA) approach in conjunction with a trust-region optimization approach and local metamodels covering each trust region. PRRA calculates the system reliability of a design very efficiently by performing a single Monte Carlo (MC) simulation per trust region. Although PRRA is based on MC simulation, it calculates “smooth” sensitivity derivatives, therefore allowing the use of a gradient-based optimizer. The PRRA method is based on importance sampling. It provides accurate results if the support of the sampling PDF contains the support of the joint PDF of the input random variables. The sequential, trust-region optimization approach satisfies this requirement.
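The support condition stated in this abstract is the standard admissibility requirement for importance sampling. A minimal, generic sketch of the reweighting idea, with an illustrative scalar limit state and normal distributions (not the paper's RBDO formulation):

```python
import math
import random

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_sampling_pf(limit_state, n, mu_q, sigma_q, seed=0):
    """Estimate P(limit_state(X) <= 0) for X ~ N(0, 1) by sampling from a
    shifted proposal N(mu_q, sigma_q) and reweighting each failed sample
    by the likelihood ratio f(x)/q(x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu_q, sigma_q)          # draw from the proposal q
        if limit_state(x) <= 0:               # failure indicator
            total += norm_pdf(x, 0.0, 1.0) / norm_pdf(x, mu_q, sigma_q)
    return total / n

# Failure when x > 4; the exact value is 1 - Phi(4), about 3.2e-5.
g = lambda x: 4.0 - x
pf = importance_sampling_pf(g, 20000, mu_q=4.0, sigma_q=1.0)
```

Centering the proposal on the failure region is what makes this far cheaper than standard MC, which would need millions of samples to see any failures at this probability level.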
Journal Article

Uncertainty Assessment in Restraint System Optimization for Occupants of Tactical Vehicles

2016-04-05
2016-01-0316
We have recently obtained experimental data and used them to develop computational models to quantify occupant impact responses and injury risks for military vehicles during frontal crashes. The number of experimental tests and model runs is, however, relatively small due to their high cost. While this is true across the auto industry, it is particularly critical for the Army and other government agencies operating under tight budget constraints. In this study we investigate through statistical simulations how the injury risk would vary if a large number of experimental tests were conducted. We show that the injury risk distribution is skewed to the right, implying that, although most physical tests result in a small injury risk, there are occasional physical tests for which the injury risk is extremely large. We compute the probabilities of such events and use them to identify optimum design conditions that minimize such probabilities.
Journal Article

Efficient Global Surrogate Modeling Based on Multi-Layer Sampling

2018-04-03
2018-01-0616
Global surrogate modeling aims to build a surrogate model with high accuracy over the whole design domain. A major challenge in achieving this objective is reducing the number of function evaluations of the original computer simulation model. To date, the most widely used approach for global surrogate modeling is the adaptive surrogate modeling method. It starts with an initial surrogate model, which is then refined adaptively using criteria such as the mean square error (MSE) or maximizing the minimum distance between samples. It is observed that current methods may not be able to construct a global surrogate model effectively when the underlying black-box function is highly nonlinear in only certain regions. A new surrogate modeling method that can allocate more training points in regions with high nonlinearity is needed to overcome this challenge. This article proposes an efficient global surrogate modeling method based on a multi-layer sampling scheme.
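One refinement criterion this abstract mentions is maximizing the minimum distance to existing samples. A minimal sketch of that single step on a hypothetical 2-D unit square, using a random candidate pool (a generic maximin step, not the multi-layer scheme the article proposes):

```python
import random

def add_point_maximin(existing, n_candidates=200, seed=0):
    """Sequential space-filling refinement on [0, 1]^2: among random
    candidate points, return the one that maximizes its minimum
    distance to the points already sampled (the maximin criterion)."""
    rng = random.Random(seed)

    def min_dist(p):
        return min(((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
                   for q in existing)

    cands = [(rng.random(), rng.random()) for _ in range(n_candidates)]
    return max(cands, key=min_dist)

pts = [(0.1, 0.1), (0.9, 0.9)]      # existing training points
new = add_point_maximin(pts)        # lands far from both existing points
```

A distance-only criterion like this spreads points uniformly; the article's contribution is biasing the allocation toward regions where the function is highly nonlinear, which this sketch does not do.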
Journal Article

Reliability and Cost Trade-Off Analysis of a Microgrid

2018-04-03
2018-01-0619
Optimizing the trade-off between reliability and cost of operating a microgrid, including vehicles as both loads and sources, can be a challenge. Optimal energy management is crucial to develop strategies to improve the efficiency and reliability of microgrids, as well as new communication networks to support optimal and reliable operation. Prior approaches modeled the grid using MATLAB, but did not include the detailed physics of loads and sources, and therefore missed the transient effects that are present in real-time operation of a microgrid. This article discusses the implementation of a physics-based detailed microgrid model including a diesel generator, wind turbine, photovoltaic array, and utility. All elements are modeled as sources in Simulink. Various loads are also implemented including an asynchronous motor. We show how a central control algorithm optimizes the microgrid by trying to maximize reliability while reducing operational cost.
Journal Article

A Group-Based Space-Filling Design of Experiments Algorithm

2018-04-03
2018-01-1102
Computer-aided engineering (CAE) is an important tool routinely used to simulate complex engineering systems. Virtual simulations enhance engineering insight into prospective designs and potential design issues and can limit the need for expensive engineering prototypes. For complex engineering systems, however, the effectiveness of virtual simulations is often hindered by excessive computational cost. To minimize the cost of running expensive computer simulations, approximate models of the original model (often called surrogate models or metamodels) can provide sufficient accuracy at a lower computing overhead compared to repeated runs of a full simulation. Metamodel accuracy improves if the metamodels are constructed using space-filling designs of experiments (DOEs), which provide a collection of sample points that preferably cover the entire design space.
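Space-filling DOEs of the kind described here are often generated with Latin hypercube sampling. A minimal sketch on the unit hypercube (a generic LHS construction, not this paper's group-based algorithm):

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube design on [0, 1]^d: each dimension is split into
    n_samples equal strata, and every stratum is hit exactly once per
    dimension, giving good one-dimensional coverage."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        # one random point inside each stratum, in shuffled order
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

pts = latin_hypercube(10, 2)   # 10 points in 2-D, one per row/column stratum
```

Plain LHS only guarantees stratification per dimension; space-filling variants additionally optimize a criterion such as maximin distance over the whole design.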
Journal Article

Reanalysis of Linear Dynamic Systems using Modified Combined Approximations with Frequency Shifts

2016-04-05
2016-01-1338
Weight reduction is very important in automotive design because of stringent demands on fuel economy. Structural optimization of dynamic systems using finite element (FE) analysis plays an important role in reducing weight while simultaneously delivering a product that meets all functional requirements for durability, crash and NVH. With advancing computer technology, the demand for solving large FE models has grown. Optimization is, however, costly due to repeated full-order analyses. Reanalysis methods can be used in structural vibrations to reduce the cost of repeated eigenvalue analyses for both deterministic and probabilistic problems. Several reanalysis techniques have been introduced over the years, including Parametric Reduced Order Modeling (PROM), Combined Approximations (CA) and the Epsilon algorithm, among others.
Journal Article

A Variable-Size Local Domain Approach to Computer Model Validation in Design Optimization

2011-04-12
2011-01-0243
A common approach to the validation of simulation models focuses on validation throughout the entire design space. A more recent methodology validates designs as they are generated during a simulation-based optimization process. The latter method relies on validating the simulation model in a sequence of local domains. To improve its computational efficiency, this paper proposes an iterative process where the size and shape of local domains at the current step are determined from a parametric bootstrap methodology involving maximum likelihood estimators of unknown model parameters from the previous step. Validation is carried out in the local domain at each step. The iterative process continues until the local domain does not change from iteration to iteration during the optimization process, ensuring that a converged design optimum has been obtained.
Journal Article

Time-Dependent Reliability of Random Dynamic Systems Using Time-Series Modeling and Importance Sampling

2011-04-12
2011-01-0728
Reliability is an important engineering requirement for consistently delivering acceptable product performance through time. As time progresses, the product may fail due to time-dependent operating conditions and material properties, component degradation, etc. The reliability degradation with time may increase the lifecycle cost due to potential warranty costs, repairs and loss of market share. Reliability is the probability that the system will perform its intended function successfully for a specified time interval. In this work, we consider the first-passage reliability, which accounts for the first-time failure of non-repairable systems. Methods available in the literature provide an upper bound to the true reliability, which may overestimate the true value considerably. Monte Carlo simulations are accurate but computationally expensive.
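In the first-passage formulation, a realization counts as failed the first time its response crosses a threshold anywhere in the interval, not just at the endpoint. A minimal Monte Carlo sketch with an illustrative random-walk response (a hypothetical process, not the paper's dynamic system or its importance-sampling estimator):

```python
import random

def first_passage_reliability(n_paths, n_steps, threshold, sigma, seed=0):
    """Monte Carlo estimate of time-dependent reliability R(T): the
    fraction of random-walk realizations whose response stays below
    `threshold` at every step (first-passage formulation)."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_paths):
        x = 0.0
        failed = False
        for _ in range(n_steps):
            x += rng.gauss(0.0, sigma)   # illustrative response increment
            if x >= threshold:           # first up-crossing => failure
                failed = True
                break
        if not failed:
            survived += 1
    return survived / n_paths

R = first_passage_reliability(2000, 50, threshold=5.0, sigma=1.0)
```

Checking the whole path, rather than only the final state, is exactly why brute-force MC is expensive here and why the paper pursues time-series modeling with importance sampling.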
Journal Article

A Simulation and Optimization Methodology for Reliability of Vehicle Fleets

2011-04-12
2011-01-0725
Understanding reliability is critical in the design, maintenance and durability analysis of engineering systems. A reliability simulation methodology is presented in this paper for vehicle fleets using limited data. The method can be used to estimate the reliability of non-repairable as well as repairable systems. It can optimally allocate individual component reliabilities, based on a target system reliability, using a multi-objective optimization algorithm. The algorithm establishes a Pareto front that can be used for an optimal trade-off between reliability and the associated cost. The method uses Monte Carlo simulation to estimate the system failure rate and reliability as a function of time. The probability density functions (PDFs) of the time between failures for all components of the system are estimated using either limited data or a user-supplied MTBF (mean time between failures) and its coefficient of variation.
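With a user-supplied MTBF and an assumed exponential time between failures (an illustrative choice; the paper estimates the PDFs from fleet data), a minimal Monte Carlo reliability-versus-time sketch looks like:

```python
import random

def fleet_reliability(n_vehicles, mtbf, horizon, seed=0):
    """Monte Carlo estimate of R(horizon): the fraction of simulated
    vehicles whose first failure time, drawn from an exponential
    distribution with the given MTBF, exceeds the horizon."""
    rng = random.Random(seed)
    survived = sum(1 for _ in range(n_vehicles)
                   if rng.expovariate(1.0 / mtbf) > horizon)
    return survived / n_vehicles

# Under the exponential model, R(100) = exp(-100/500), about 0.82.
R = fleet_reliability(20000, mtbf=500.0, horizon=100.0)
```

Repairable systems would additionally draw successive times between failures per vehicle; the one-draw version above covers only the non-repairable case.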
Journal Article

Reliability Prediction for the HMMWV Suspension System

2011-04-12
2011-01-0726
This research paper addresses the ground vehicle reliability prediction process based on a new integrated reliability prediction framework. The integrated stochastic framework combines computational physics-based predictions with experimental testing information for assessing vehicle reliability. The integrated reliability prediction approach incorporates the following computational steps: i) simulation of the stochastic operational environment, ii) vehicle multi-body dynamics analysis, iii) stress prediction in subsystems and components, iv) stochastic progressive damage analysis, v) component life prediction, including the effects of maintenance, and finally vi) reliability prediction at the component and system level. To efficiently and accurately address the challenges coming from large-size computational mechanics models and high-dimensional stochastic spaces, an HPC simulation-based approach to the reliability problem was implemented.
Journal Article

System Topology Identification with Limited Test Data

2012-04-16
2012-01-0064
In this article we present an approach to identify the system topology using simulation for reliability calculations. The system topology describes how all components in a system are functionally connected. Most reliability engineering literature assumes either that the system topology is known, so that all failure modes can be deduced, or that, if the system topology is not known, we are only interested in identifying the dominant failure modes. The authors contend that we should try to extract as much information about the system topology from failure or success information of a system as possible. This will not only identify the dominant failure modes but will also provide an understanding of how the components are functionally connected, allowing for more complicated analyses, if needed. We use an evolutionary approach where system topologies are generated at random and then tested against failure or success data. The topologies evolve based on how consistent they are with the test data.
Journal Article

A Nonparametric Bootstrap Approach to Variable-size Local-domain Design Optimization and Computer Model Validation

2012-04-16
2012-01-0226
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validation of computer models over the entire design space can be costly, a recent approach was proposed in which design optimization and model validation are performed concurrently using a sequential approach with both fixed- and variable-size local domains. The variable-size approach used parametric distributions such as the Gaussian to quantify the variability in test data and model predictions, and a maximum likelihood estimation to calibrate the prediction model. Also, a parametric bootstrap method was used to size each local domain. In this article, we generalize the variable-size approach by not assuming any particular distribution such as the Gaussian. A nonparametric bootstrap methodology is instead used to size the local domains. We expect its generality to be useful in applications where distributional assumptions are difficult to verify, or are not met at all.
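The nonparametric bootstrap resamples the observed data with replacement rather than fitting a distribution to it. A minimal sketch of a bootstrap percentile confidence interval for a mean, with illustrative data (the generic technique, not the paper's local-domain sizing procedure):

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric bootstrap percentile confidence interval: resample
    `data` with replacement, recompute `stat` on each resample, and take
    the empirical alpha/2 and 1 - alpha/2 quantiles of the replicates."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([rng.choice(data) for _ in range(n)])
                  for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

sample = [2.1, 2.4, 1.9, 2.6, 2.2, 2.8, 2.0, 2.5, 2.3, 2.7]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(sample, mean)   # interval around the sample mean 2.35
```

Because only the data themselves are resampled, no Gaussian (or other distributional) assumption is needed, which is the generalization this abstract describes.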
Journal Article

Multi-Objective Decision Making under Uncertainty and Incomplete Knowledge of Designer Preferences

2011-04-12
2011-01-1080
Multi-attribute decision making and multi-objective optimization complement each other. Often, while making design decisions involving multiple attributes, a Pareto front is generated using a multi-objective optimizer, and the end user then chooses the optimal design from the Pareto front based on his or her preferences. This seemingly simple methodology requires significant modification if uncertainty is present. We explore two kinds of uncertainty in this paper: uncertainty in the decision variables, which we call inherent design problem (IDP) uncertainty, and uncertainty in our knowledge of the preferences of the decision maker, which we refer to as preference assessment (PA) uncertainty. From a purely utility-theoretic perspective, a rational decision maker maximizes his or her expected multi-attribute utility.
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation if a suitable sampling distribution is used, in many design problems it is too expensive. The authors have previously proposed a method to manage the computational cost in standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities. Information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends their method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of the choice of the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
Journal Article

Efficient Probabilistic Reanalysis and Optimization of a Discrete Event System

2011-04-12
2011-01-1081
This paper presents a methodology to evaluate and optimize discrete event systems, such as an assembly line or a call center. First, the methodology estimates the performance of a system for a single probability distribution of the inputs. Probabilistic Reanalysis (PRRA) uses this information to evaluate the effect of changes in the system configuration on its performance. PRRA is integrated with a program to optimize the system. The proposed methodology is dramatically more efficient than one requiring a new Monte Carlo simulation each time we change the system. We demonstrate the approach on a drilling center and an electronic parts factory.
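The efficiency claimed here comes from reusing one set of Monte Carlo samples under a changed input distribution via likelihood-ratio reweighting. A minimal sketch with a single normal input and an illustrative performance function (the generic reweighting idea, not the paper's discrete-event model):

```python
import math
import random

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def reanalyze(samples, perf, mu_old, mu_new, sigma=1.0):
    """Probabilistic reanalysis: reuse samples drawn under N(mu_old, sigma)
    to estimate E[perf(X)] under N(mu_new, sigma), weighting each sample by
    the ratio of the new to the old input PDF (no new simulation runs)."""
    weights = [norm_pdf(x, mu_new, sigma) / norm_pdf(x, mu_old, sigma)
               for x in samples]
    return sum(w * perf(x) for w, x in zip(weights, samples)) / len(samples)

rng = random.Random(0)
base = [rng.gauss(0.0, 1.0) for _ in range(50000)]   # one MC run, reused below
perf = lambda x: x * x                               # illustrative performance metric
# E[X^2] under N(0.5, 1) is 0.5**2 + 1 = 1.25, estimated without resampling:
est = reanalyze(base, perf, mu_old=0.0, mu_new=0.5)
```

Every candidate system configuration is evaluated from the same stored sample set, which is why the method avoids a fresh Monte Carlo simulation per design change.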
Technical Paper

Validation of an EFEA Formulation for Computing the Vibrational Response of Complex Structures

2007-05-15
2007-01-2324
This paper presents a validation case study for an Energy Finite Element Analysis (EFEA) formulation through comparison to test data. The EFEA is a simulation tool for computing the structural response of a complex structure and the amount of radiated power. The EFEA formulation presented in this paper can account for periodic stiffeners, for partial fluid-loading effects on the outer part of the structure, and for internal compartments filled with heavy fluid. In order to validate these modeling capabilities of the EFEA, two 1/8th-scale structures, representing an advanced double-hull design and a conventional hull design of a surface ship, are analyzed. Results for the structural vibration induced on the outer bottom part of the structure are compared to available test data. The excitation is applied at two different locations of the deck structure. Good correlation is observed between the numerical results and the test data.
Technical Paper

A Substructuring Formulation for the Energy Finite Element Analysis

2007-05-15
2007-01-2325
In applications of the Energy Finite Element Analysis (EFEA), there is an increasing need for developing comprehensive models with a large number of elements which include both structural and interior fluid elements, while certain parts of the structure are exposed to external fluid loading. In order to accommodate efficient computations when using simulation models with a large number of elements, joints, and domains, a substructuring computational capability has been developed. The new algorithm is based on dividing the EFEA model into substructures with internal and interface degrees of freedom. The system of equations for each substructure is assembled and solved separately, and the information is condensed to the interface degrees of freedom. The condensed systems of equations from each substructure are assembled into a reduced global system of equations. Once the global system of equations has been solved, the solution for each substructure is recovered.
Technical Paper

Validation of a Hybrid Finite Element Formulation for Mid-Frequency Analysis of Vehicle Structures

2007-05-15
2007-01-2303
The hybrid Finite Element Analysis (hybrid FEA) has been developed for performing structure-borne computations in automotive vehicle structures [1, 2 and 3]. The hybrid FEA method combines conventional FEA with Energy FEA (EFEA). Conventional FEA models are employed for modeling the behavior of the stiff members in a system. Appropriate damping and spring or mass elements are introduced in the connections between stiff and flexible members in order to capture the presence of the flexible members during the analyses of the stiff ones. The component mode synthesis method is combined with analytical solutions for determining the driving point conductance at joints between stiff and flexible members and for defining the properties of the concentrated elements which represent the flexible members when analyzing the stiff components.
Technical Paper

Combining an Energy Boundary Element with an Energy Finite Element Analysis for Airborne Noise Simulations

2007-05-15
2007-01-2178
The Energy Boundary Element Analysis (EBEA) has been utilized in the past for computing the exterior acoustic field at high frequencies (above ∼400 Hz) around vehicle structures and numerical results have been compared successfully to test data [1, 2 and 3]. The Energy Finite Element Analysis (EFEA) has been developed for computing the structural vibration of complex structures at high frequencies and validations have been presented in previous publications [4, 5]. In this paper the EBEA is utilized for computing the acoustic field around a vehicle structure due to external acoustic noise sources. The computed exterior acoustic field comprises the excitation for the EFEA analysis. Appropriate loading functions have been developed for representing the exterior acoustic loading in the EFEA simulations, and a formulation has been developed for considering the acoustic treatment applied on the interior side of structural panels.
Technical Paper

Predicting Military Ground Vehicle Reliability using High Performance Computing

2007-04-16
2007-01-1421
To impact decision making for military ground vehicles, we are using High Performance Computing (HPC) to reduce the time needed to analyze the reliability of a design in modeling and simulation. We use parallelization to obtain accurate results in days rather than months. Accurate reliability prediction can be obtained with modeling and simulation, accounting for uncertainties and multiple physics of failure; by utilizing parallel computing, we obtain results in much less time than with conventional analysis techniques.