Search Results

Technical Paper

High Dimensional Preference Learning: Topological Data Analysis Informed Sampling for Engineering Decision Making

2024-04-09
2024-01-2422
Engineering design decisions often involve many attributes that differ in their importance to the decision maker (DM) while also exhibiting complex statistical relationships. Learning a decision-making policy that accurately represents the DM’s actions has long been the goal of decision analysts. To circumvent elicitation and modeling issues, this process is often oversimplified, both in how many factors are considered and in how complex the relationships between them are allowed to be. Without these simplifications, classical lottery-based preference elicitation is overly expensive, and the quality of responses degrades rapidly as the number of attributes increases. In this paper, we investigate the ability of deep preference machine learning to model high-dimensional decision-making policies using rankings elicited from decision makers.
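To make the modeling step concrete, the following is a minimal sketch of learning a scalar utility from pairwise comparisons extracted from rankings; the small MLP, the Bradley–Terry logistic loss, and the synthetic attribute vectors are illustrative assumptions, not the paper’s actual model.

```python
# Hedged sketch: learn a utility function from (preferred, dispreferred)
# pairs decomposed from elicited rankings. Architecture and loss are assumptions.
import torch
import torch.nn as nn

class UtilityNet(nn.Module):
    """Maps a high-dimensional attribute vector to a scalar utility score."""
    def __init__(self, n_attrs: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_attrs, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

n_attrs = 20
model = UtilityNet(n_attrs)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-ins for ranked alternatives; each elicited ranking yields ordered pairs.
preferred = torch.randn(256, n_attrs)
dispreferred = torch.randn(256, n_attrs)

for _ in range(200):
    opt.zero_grad()
    # Bradley-Terry: P(a > b) = sigmoid(u(a) - u(b)); minimize -log-likelihood.
    margin = model(preferred) - model(dispreferred)
    loss = nn.functional.softplus(-margin).mean()
    loss.backward()
    opt.step()
```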
Technical Paper

Topological Data Analysis for Navigation in Unstructured Environments

2023-04-11
2023-01-0088
Autonomous vehicle navigation, both global and local, makes use of large amounts of multifactorial data from onboard sensors, prior information, and simulations to safely navigate a chosen terrain. Additionally, because each mission has a unique set of requirements, operational environment, and vehicle capabilities, any fixed formulation of the cost associated with these attributes is suboptimal across different missions. Much work has been done in the literature on finding the optimal cost definition and the subsequent mission pathing, given sufficient measurements of the preference over the mission factors. However, obtaining these measurements can be an arduous and computationally expensive task. Furthermore, the algorithms that utilize this large amount of multifactorial data are themselves time-consuming and expensive.
Journal Article

Quantum Explanations for Interference Effects in Engineering Decision Making

2022-03-29
2022-01-0215
Engineering practice routinely involves decision making under uncertainty. Much of this decision making entails reconciling multiple pieces of information to form a suitable model of uncertainty. As more information is collected, one expects to make better and better decisions. However, the conditional probability assessments made by human decision makers as new information arrives do not always follow expected trends and instead exhibit inconsistencies. Understanding these inconsistencies is necessary for better modeling of the cognitive processes taking place in the mind of the decision maker, whether designer or end user, and doing so can result in better products and product features. Quantum probability has been used in the literature to explain many commonly observed deviations from classical probability, such as the question order effect, the response replicability effect, the Machina and Ellsberg paradoxes, and the effects of positive and negative interference between events.
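For intuition, a short numeric illustration (not from the paper) of the interference effect: a quantum model adds complex amplitudes before squaring, which introduces a phase-dependent term absent from the classical law of total probability. All numbers and the phase below are arbitrary.

```python
# Illustrative only: quantum interference vs. the classical law of total
# probability. p_A, p_BA, p_BnA, and theta are arbitrary assumptions.
import numpy as np

p_A = 0.6                 # P(A)
p_BA = 0.7                # P(B | A)
p_BnA = 0.2               # P(B | not A)
theta = np.pi / 3         # relative phase between the two paths

# Classical law of total probability
p_B_classical = p_A * p_BA + (1 - p_A) * p_BnA

# Quantum rule: add complex amplitudes, then square the magnitude
amp = np.sqrt(p_A * p_BA) + np.sqrt((1 - p_A) * p_BnA) * np.exp(1j * theta)
p_B_quantum = abs(amp) ** 2

interference = p_B_quantum - p_B_classical   # = 2*sqrt(...)*cos(theta)
print(f"classical {p_B_classical:.3f}, quantum {p_B_quantum:.3f}, "
      f"interference {interference:+.3f}")
```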
Technical Paper

Fault Diagnosis and Prediction in Automotive Systems with Real-Time Data Using Machine Learning

2022-03-29
2022-01-0217
In the automotive industry, a Malfunction Indicator Light (MIL) is commonly employed to signify a failure or error in a vehicle system. To identify the root cause that triggered a particular fault, a technician or engineer will typically run diagnostic tests and analyses. Such analysis can consume a significant amount of time and resources, at the cost of customer satisfaction and perceived quality. Predicting an impending error allows for preventative measures or actions that might mitigate its effects. Modern vehicles generate data in the form of sensor readings accessible through the vehicle’s Controller Area Network (CAN). Such data is generally too extensive to aid in analysis and decision making unless machine learning-based methods are used. This paper proposes a method utilizing a recurrent neural network (RNN) to predict an impending fault before it occurs from CAN data.
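A minimal sketch of the kind of predictor the abstract describes, assuming an LSTM over fixed-length windows of CAN signals and a binary “fault within horizon” label; the architecture, dimensions, and synthetic data are illustrative, not the paper’s.

```python
# Hedged sketch: LSTM over CAN windows -> probability of an impending fault.
import torch
import torch.nn as nn

class FaultRNN(nn.Module):
    """LSTM over a window of CAN sensor readings -> logit of impending fault."""
    def __init__(self, n_signals: int, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_signals, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                      # x: (batch, time, n_signals)
        _, (h, _) = self.lstm(x)
        return self.head(h[-1]).squeeze(-1)

model = FaultRNN(n_signals=32)
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(8, 200, 32)              # 8 synthetic windows of 200 samples
y = torch.randint(0, 2, (8,)).float()    # 1 = fault occurred within horizon
loss = loss_fn(model(x), y)
loss.backward()
opt.step()
```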
Journal Article

Decision-Making for Autonomous Mobility Using Remotely Sensed Terrain Parameters in Off-Road Environments

2021-04-06
2021-01-0233
Off-road vehicle operation requires constant decision-making under great uncertainty. Such decisions are multi-faceted and range from acquisition decisions to operational ones. A major input to these decisions is terrain information in the form of soil properties. This information needs to be propagated to path planning algorithms, which augment it with other inputs such as visual terrain assessment and other sensor data. In this sequence of steps, many resources are needed, and it is often unclear how best to utilize them. We present an integrated approach in which a mission’s overall performance is measured using a multiattribute utility function. This framework allows us to evaluate the value of acquiring terrain information and of its subsequent use in path planning. The computational effort of optimizing the vehicle path is also accounted for and minimized. We demonstrate our approach using data acquired from the Keweenaw Research Center terrains and present results.
Journal Article

Balancing Lifecycle Sustainment Cost with Value of Information during Design Phase

2020-04-14
2020-01-0176
The complete lifecycle of complex systems, such as ground vehicles, consists of multiple phases including design, manufacturing, operation and sustainment (O&S), and finally disposal. For many systems, the majority of lifecycle costs are incurred during the O&S phase, specifically in the form of uncertain maintenance costs. Testing and analysis during the design phase, including reliability and supportability analysis, can have a major influence on costs during the O&S phase. However, the cost of the analysis itself must be reconciled with the expected benefits of the reduction in uncertainty. In this paper, we quantify the value of performing tests and analyses in the design phase by treating them as imperfect information obtained to better estimate uncertain maintenance costs.
Journal Article

Value of Information for Comparing Dependent Repairable Assemblies and Systems

2018-04-03
2018-01-1103
This article presents an approach for comparing alternative repairable systems and for calculating the value of the information obtained by field testing a specified number of such systems in order to estimate the reliability metric associated with each system. Here, the reliability of a repairable system is measured by its failure rate. In support of the decision-making effort, the failure rate is translated into an expected utility based on a utility curve that represents the risk tolerance of the decision maker. The algorithm calculates how the expected value of the decision changes with the sample size; this change represents the value of the information obtained from testing.
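The preposterior logic can be sketched as follows, under assumed conjugate gamma–Poisson failure-rate models and an exponential disutility standing in for the decision maker’s risk tolerance; none of the numbers are from the article.

```python
# Hedged sketch: value of information from field testing two repairable
# systems. Gamma priors, Poisson failures, and the utility are assumptions.
import numpy as np

rng = np.random.default_rng(0)
rho = 2.0                                  # risk-tolerance parameter (assumed)
u = lambda lam: -np.expm1(rho * lam)       # disutility grows with failure rate

priors = {"A": (4.0, 8.0), "B": (2.0, 3.0)}   # gamma (shape, rate) per system

def expected_u(shape, rate, n=20000):
    return u(rng.gamma(shape, 1.0 / rate, n)).mean()

# Value of deciding now, on the priors alone
prior_value = max(expected_u(*p) for p in priors.values())

def preposterior_value(test_hours, n_outer=2000):
    """Average value of the decision after simulating a field test."""
    vals = []
    for _ in range(n_outer):
        post = {}
        for k, (a, b) in priors.items():
            lam = rng.gamma(a, 1.0 / b)              # rate drawn from prior
            n_fail = rng.poisson(lam * test_hours)   # simulated test outcome
            post[k] = (a + n_fail, b + test_hours)   # conjugate update
        vals.append(max(expected_u(*p, n=2000) for p in post.values()))
    return float(np.mean(vals))

for T in (1.0, 5.0, 20.0):
    print(f"test length {T:5.1f}: VoI ~ {preposterior_value(T) - prior_value:.4f}")
```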
Technical Paper

Random Vibration Analysis Using Quasi-Random Bootstrapping

2018-04-03
2018-01-1104
Reliability analysis of engineering structures such as bridges, airplanes, and cars requires the calculation of small failure probabilities. These probabilities can be calculated using standard Monte Carlo simulation, but this method is impractical for most real-life systems because of its high computational cost. Many studies have focused on reducing the computational cost of a reliability assessment; these include bootstrapping, Separable Monte Carlo, Importance Sampling, and the Combined Approximations. The computational cost can also be reduced using an efficient method for deterministic analysis, such as mode superposition, mode acceleration, and the combined acceleration method. This paper presents and demonstrates a method that uses a combination of Sobol quasi-random sequences and bootstrapping to reduce the number of function calls. The study demonstrates that the use of quasi-random numbers in conjunction with bootstrapping dramatically reduces the computational cost.
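The core idea can be sketched in a few lines: evaluate the limit state on scrambled Sobol points, then bootstrap the stored indicator values so the confidence interval costs no additional function calls. The limit state and sample sizes below are illustrative assumptions.

```python
# Hedged sketch: quasi-random (Sobol) failure-probability estimate plus a
# bootstrap confidence interval that reuses the existing indicator sample.
import numpy as np
from scipy.stats import norm, qmc

def limit_state(x):                    # toy 2-D example; failure when g < 0
    return 4.0 - x[:, 0] - x[:, 1]

n = 2 ** 13
sobol = qmc.Sobol(d=2, scramble=True, seed=0)
u = sobol.random(n)                    # quasi-random points in (0, 1)^2
x = norm.ppf(u)                        # map to standard normal inputs
fails = (limit_state(x) < 0).astype(float)
pf_hat = fails.mean()

# Bootstrap the indicators themselves -- no extra limit-state evaluations
rng = np.random.default_rng(1)
boot = np.array([fails[rng.integers(0, n, n)].mean() for _ in range(1000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"pf ~ {pf_hat:.2e}, 95% bootstrap CI [{lo:.2e}, {hi:.2e}]")
```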
Journal Article

Assessing the Value of Information for Multiple, Correlated Design Alternatives

2017-03-28
2017-01-0208
Design optimization occurs through a series of decisions that are a standard part of the product development process. Decisions are made anywhere from concept selection to the design of the assembly and manufacturing processes. The effectiveness of these decisions depends on the information available to the decision maker. Decision analysis provides a structured approach for quantifying the value of information that may be provided to the decision maker. This paper presents a process for determining the value of information that can be gained by evaluating linearly correlated design alternatives. A unique application of Bayesian inference is used to provide simulated estimates of the expected utility with increasing observation sizes. The results provide insight into the optimum observation size that maximizes the expected utility when assessing correlated decision alternatives.
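The role of correlation can be sketched with a two-alternative Gaussian model: observing one alternative updates beliefs about the other through the prior covariance, and the expected utility of the final choice grows with the number of observations. The prior, noise level, and utility below are illustrative assumptions, not the paper’s setup.

```python
# Hedged sketch: expected utility of choosing between two linearly
# correlated alternatives as the observation size grows.
import numpy as np

rng = np.random.default_rng(0)
mu0 = np.array([10.0, 10.5])                  # prior mean performance
P0 = np.array([[4.0, 3.0], [3.0, 4.0]])       # correlated prior covariance
sig2 = 9.0                                    # observation noise variance

def update(mu, P, y, alt):
    """Bayesian (Kalman-style) update after one noisy look at `alt`."""
    H = np.zeros(2); H[alt] = 1.0
    S = H @ P @ H + sig2
    K = P @ H / S
    return mu + K * (y - H @ mu), P - np.outer(K, H @ P)

def expected_utility(n_obs, n_mc=4000):
    vals = []
    for _ in range(n_mc):
        truth = rng.multivariate_normal(mu0, P0)
        mu, P = mu0.copy(), P0.copy()
        for _ in range(n_obs):                # repeatedly observe alternative 0
            y = truth[0] + rng.normal(0.0, np.sqrt(sig2))
            mu, P = update(mu, P, y, 0)
        vals.append(truth[np.argmax(mu)])     # utility = true value of the pick
    return float(np.mean(vals))

for n in (0, 1, 5, 20):
    print(f"{n:2d} observations -> expected utility {expected_utility(n):.3f}")
```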
Technical Paper

Inverse Modeling: Theory and Engineering Examples

2016-04-05
2016-01-0267
Over the last two decades, inverse problems have become increasingly popular due to their widespread applications, and this popularity continually demands that designers find alternative solution methods that are both accurate and computationally efficient. This paper presents a method for solving inverse problems through Artificial Neural Network (ANN) theory, along with a method for applying the Grey Wolf Optimizer (GWO), a recent optimization algorithm, to inverse problems. Both methods are then compared to traditional methods such as Particle Swarm Optimization (PSO) and Markov Chain Monte Carlo (MCMC). Four typical engineering design problems are used to compare the four methods. The results show that the GWO outperforms the other methods in terms of both efficiency and accuracy.
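For reference, a compact implementation of the canonical GWO update equations (pack positions pulled toward the three best wolves with a linearly decaying encircling coefficient), run here on a toy sphere function; the benchmark and hyperparameters are illustrative, not the paper’s test problems.

```python
# Hedged sketch of the Grey Wolf Optimizer on a toy minimization problem.
import numpy as np

def gwo(f, lb, ub, n_wolves=20, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, len(lb)))     # initialize the pack
    for t in range(n_iter):
        fit = np.array([f(x) for x in X])
        alpha, beta, delta = X[np.argsort(fit)[:3]]  # three best wolves
        a = 2.0 * (1.0 - t / n_iter)                 # decays from 2 to 0
        X_new = np.zeros_like(X)
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(X.shape), rng.random(X.shape)
            A, C = 2.0 * a * r1 - a, 2.0 * r2
            X_new += leader - A * np.abs(C * leader - X)   # pull toward leader
        X = np.clip(X_new / 3.0, lb, ub)             # average of the three pulls
    fit = np.array([f(x) for x in X])
    return X[np.argmin(fit)], float(fit.min())

best_x, best_f = gwo(lambda x: float(np.sum(x * x)),
                     lb=np.full(4, -5.0), ub=np.full(4, 5.0))
print(best_x, best_f)
```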
Journal Article

Bootstrapping and Separable Monte Carlo Simulation Methods Tailored for Efficient Assessment of Probability of Failure of Structural Systems

2015-04-14
2015-01-0420
There is randomness in both the applied loads and the strength of systems; therefore, to account for this uncertainty, the safety of a system must be quantified using its reliability. Monte Carlo Simulation (MCS) is widely used for probabilistic analysis because of its robustness, but its high computational cost limits the achievable accuracy. Smarslok et al. [2010] developed an improved sampling technique for reliability assessment called Separable Monte Carlo (SMC) that can significantly increase the accuracy of estimation without increasing the cost of sampling. However, this method has been applied only to time-invariant problems involving two random variables. This paper extends SMC to problems with multiple random variables and develops a novel method for estimating the standard deviation of the probability of failure of a structure. The method is demonstrated and validated on the reliability assessment of an offshore wind turbine under turbulent wind loads.
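The separable idea itself fits in a few lines: when load and strength samples are statistically independent, every load sample can be compared against every strength sample, so the estimator reuses each expensive evaluation many times. The normal load/capacity model and sample sizes below are toy assumptions.

```python
# Hedged sketch: separable vs. crude Monte Carlo for P(load > capacity).
import numpy as np

rng = np.random.default_rng(0)
M = N = 500
load = rng.normal(100.0, 15.0, M)       # stand-in for sampled responses
capacity = rng.normal(160.0, 10.0, N)   # stand-in for sampled strengths

# Separable estimator: all M*N independent (load, capacity) pairs
pf_smc = (load[:, None] > capacity[None, :]).mean()

# Crude MC with the same number of evaluations pairs samples one-to-one
pf_mc = (load > capacity).mean()
print(f"separable pf ~ {pf_smc:.2e}, crude pf ~ {pf_mc:.2e}")
```

With a true failure probability of roughly 4e-4 in this toy setup, 500 one-to-one pairs usually contain no failures at all, while the 250,000 separable pairs give a usable estimate from the same data.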
Technical Paper

Decision-Based Universal Design - Using Copulas to Model Disability

2015-04-14
2015-01-0418
This paper develops a design paradigm for universal products. Universal design is a term used for designing products and systems that are equally accessible to and usable by people with and without disabilities. Two common challenges for research in this area are that (1) there is a continuum of disabilities, making it hard to optimize product features, and (2) there is no effective benchmark for evaluating such products. Exacerbating these issues, data regarding customer disabilities and their preferences are hard to come by. We propose a copula-based approach for modeling the market coverage of a portfolio of universal products. The multiattribute preference of customers to purchase a product is modeled using Frank’s Archimedean copula. Inputs from various disparate sources can then be collected and incorporated into the decision system.
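The coupling step can be illustrated with the bivariate Frank copula; the dependence parameter and the two marginal acceptability probabilities below are illustrative assumptions, not calibrated values from the paper.

```python
# Hedged sketch: joint acceptability of two purchase criteria under
# Frank's Archimedean copula. theta, u, and v are assumed values.
import numpy as np

def frank_copula(u, v, theta):
    """C(u,v) = -(1/theta)*log(1 + (e^{-theta*u}-1)(e^{-theta*v}-1)/(e^{-theta}-1))."""
    num = np.expm1(-theta * u) * np.expm1(-theta * v)
    return -np.log1p(num / np.expm1(-theta)) / theta

# Marginal probabilities that a customer finds each product attribute usable
u, v, theta = 0.8, 0.7, 4.0     # theta > 0 encodes positive dependence
joint = frank_copula(u, v, theta)
print(f"P(both usable) = {joint:.3f} vs. {u * v:.3f} under independence")
```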
Technical Paper

Combined Approximation for Efficient Reliability Analysis of Linear Dynamic Systems

2015-04-14
2015-01-0424
The Combined Approximation (CA) method is an efficient reanalysis method that aims at reducing the cost of optimization problems. CA uses the results of a single exact analysis and is suitable for different types of structures and design variables. The second author previously utilized CA to calculate the frequency response function of a system at a frequency of interest from results at a nearby frequency, showing that CA yields accurate results for small frequency perturbations. This work demonstrates a methodology that utilizes CA to reduce the cost of Monte Carlo simulation (MCS) of linear systems under random dynamic loads. The main idea is to divide the power spectral density (PSD) function of the input load into several frequency bins before calculating the load realizations.
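The reanalysis step at the heart of CA can be sketched as follows: factor the baseline matrix once, build a few binomial-series basis vectors for the modified system, and solve a small reduced problem. The toy matrices below are illustrative; the paper’s frequency-response application is not reproduced here.

```python
# Hedged sketch of Combined Approximation reanalysis on toy SPD matrices.
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
K0 = A @ A.T + n * np.eye(n)            # baseline "stiffness" matrix
dK = np.diag(40.0 * rng.random(n))      # moderate design modification
f = rng.standard_normal(n)

lu = lu_factor(K0)                      # factor once, reuse for every vector
r = [lu_solve(lu, f)]                   # r1 = K0^{-1} f
for _ in range(2):                      # binomial-series basis vectors
    r.append(-lu_solve(lu, dK @ r[-1]))
R = np.column_stack(r)

K = K0 + dK
y = np.linalg.solve(R.T @ K @ R, R.T @ f)   # small (3x3) reduced system
u_ca = R @ y
u_exact = np.linalg.solve(K, f)
err = np.linalg.norm(u_ca - u_exact) / np.linalg.norm(u_exact)
print(f"CA relative error with 3 basis vectors: {err:.2e}")
```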
Technical Paper

Multi-Level Decoupled Optimization of Wind Turbine Structures

2015-04-14
2015-01-0434
This paper proposes a multi-level decoupled method for optimizing the structural design of a wind turbine blade. The proposed method reduces the design space by employing a two-level optimization process. At the high level, the structural properties of each section are approximated by an exponential function of the distance of that section from the blade root; the high-level design variables are the coefficients of this approximating function, and target values for the structural properties of the blade are determined at this level. At the low level, sections are divided into small decoupled groups. For each section, the low-level optimizer finds the minimum-mass thickness of the laminate layers whose structural properties meet the targets determined by the high-level optimizer. In the proposed method, each low-level optimizer considers only a small number of design variables for a particular section, whereas traditional single-level methods consider all design variables simultaneously.
Journal Article

Enhancing Decision Topology Assessment in Engineering Design

2014-04-01
2014-01-0719
Implications of decision analysis (DA) for engineering design are important and well-documented; however, widespread adoption has not occurred. To that end, the authors recently proposed decision topologies (DT) as a visual method for representing decision situations and proved that they are entirely consistent with normative decision analysis. This paper addresses the practical issue of assessing a designer’s DTs from their responses. As in classical DA, this step is critical to encoding the decision maker’s (DM’s) preferences so that further analysis and mathematical optimization can be performed on the correct set of preferences. We show how multi-attribute DTs can be assessed directly from DM responses. Furthermore, we show that preferences under uncertainty can be trivially incorporated and that topologies can be constructed from single-attribute topologies, similarly to multi-linear functions in utility analysis. This incremental construction simplifies the process of building topologies.
Journal Article

Flexible Design and Operation of a Smart Charging Microgrid

2014-04-01
2014-01-0716
The reliability theory of repairable systems is vastly different from that of non-repairable systems. The authors have recently proposed a ‘decision-based’ framework for designing and maintaining repairable systems for optimal performance and reliability, using a set of metrics such as the minimum failure-free period, the number of failures over the planning horizon (lifecycle), and cost. The optimal solution includes the initial design, the system maintenance throughout the planning horizon, and the protocol for operating the system. In this work, we extend this idea by incorporating flexibility, and we demonstrate our approach on a smart-charging electric microgrid architecture. The flexibility is realized by allowing the architecture to change with time. Our approach “learns” the working characteristics of the microgrid: we use actual load and supply data over a short period to quantify the load and supply random processes and to establish the correlation between them.
Technical Paper

Reliability Analysis of Composite Inflatable Space Structures Considering Fracture Failure

2014-04-01
2014-01-0715
Inflatable space structures can have lower launch costs and larger habitat volumes than their conventional rigid counterparts. These structures are made of composite laminates and are flexible when folded and partially inflated. They contain light-activated resins and can be cured with sunlight after being inflated in space. A spacecraft can burst due to cracks caused by meteor impacts or debris, so it is critical to identify the important fracture failure modes and assess their probability. This information helps a designer minimize the risk of failure while keeping mass and cost low. This paper presents a probabilistic approach for finding the required thickness of an inflatable habitat shell for a prescribed reliability level, and it demonstrates the superiority of probabilistic design over its deterministic counterpart.
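The sizing logic can be sketched with a deliberately simple closed-form model: thin-wall hoop stress with lognormal pressure and strength, inverted for the thickness that meets a prescribed failure probability. All distributions and numbers below are illustrative assumptions, not the paper’s data.

```python
# Hedged sketch: shell thickness for a target reliability (toy model).
import numpy as np
from scipy.stats import norm

r = 2.0                               # shell radius [m] (assumed)
mu_p, sig_p = np.log(100e3), 0.15     # ln-space pressure parameters [Pa]
mu_S, sig_S = np.log(300e6), 0.20     # ln-space strength parameters [Pa]
sig = np.hypot(sig_p, sig_S)

def pf(t):
    """P(hoop stress p*r/t > strength S); both lognormal -> closed form."""
    return norm.cdf((mu_p + np.log(r / t) - mu_S) / sig)

# Invert the closed form for a prescribed reliability level
pf_target = 1e-6
t_req = r * np.exp(mu_p - mu_S - sig * norm.ppf(pf_target))
print(f"required thickness ~ {t_req * 1e3:.2f} mm, check pf = {pf(t_req):.1e}")
```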
Journal Article

Probability of Failure of Dynamic Systems by Importance Sampling

2013-04-08
2013-01-0607
Estimation of the probability of failure of mechanical systems under random loads is computationally expensive, especially for very reliable systems with low probabilities of failure. Importance Sampling can be an efficient tool for static problems if a proper sampling distribution is selected. This paper presents a methodology to apply Importance Sampling to dynamic systems in which both the load and response are stochastic processes. The method is applicable to problems for which the input loads are stationary and Gaussian and are represented by power spectral density functions. Shinozuka's method is used to generate random time histories of excitation. The method is demonstrated on a linear quarter car model. This approach is more efficient than standard Monte Carlo simulation by several orders of magnitude.
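Shinozuka’s spectral representation itself is short: superpose cosines whose amplitudes follow the PSD and whose phases are independent and uniform. The one-sided PSD and discretization below are illustrative assumptions.

```python
# Hedged sketch: generate a stationary Gaussian load history from a PSD.
import numpy as np

def shinozuka(psd, w_max, n_freq, t, rng):
    """x(t) = sum_k sqrt(2*S(w_k)*dw) * cos(w_k*t + phi_k), phi_k ~ U(0, 2pi)."""
    dw = w_max / n_freq
    w = (np.arange(n_freq) + 0.5) * dw             # midpoint frequency grid
    phi = rng.uniform(0.0, 2.0 * np.pi, n_freq)    # independent random phases
    amp = np.sqrt(2.0 * psd(w) * dw)
    return (amp[:, None] * np.cos(np.outer(w, t) + phi[:, None])).sum(axis=0)

rng = np.random.default_rng(0)
psd = lambda w: 1.0 / (1.0 + w ** 2)     # toy one-sided load PSD
t = np.linspace(0.0, 20.0, 4000)
x = shinozuka(psd, w_max=50.0, n_freq=512, t=t, rng=rng)

# The process variance should match the integral of the one-sided PSD
w = (np.arange(512) + 0.5) * (50.0 / 512)
print(f"sample var {x.var():.3f} vs target {(psd(w) * 50.0 / 512).sum():.3f}")
```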
Journal Article

Managing the Computational Cost of Monte Carlo Simulation with Importance Sampling by Considering the Value of Information

2013-04-08
2013-01-0943
Importance Sampling is a popular method for reliability assessment. Although it is significantly more efficient than standard Monte Carlo simulation when a suitable sampling distribution is used, it is still too expensive for many design problems. The authors have previously proposed a method for managing the computational cost of standard Monte Carlo simulation that views design as a choice among alternatives with uncertain reliabilities; information from simulation has value only if it helps the designer make a better choice among the alternatives. This paper extends that method to Importance Sampling. First, the designer estimates the prior probability density functions of the reliabilities of the alternative designs and calculates the expected utility of choosing the best design. Subsequently, the designer estimates the likelihood function of the probability of failure by performing an initial simulation with Importance Sampling.
Technical Paper

A Cost-Driven Method for Design Optimization Using Validated Local Domains

2013-04-08
2013-01-1385
Design optimization often relies on computational models, which are subjected to a validation process to ensure their accuracy. Because validating computer models over the entire design space can be costly, we have previously proposed an approach in which design optimization and model validation are performed concurrently, using a sequential approach with variable-size local domains. We used test data and statistical bootstrap methods to size each local domain, within which the prediction model is considered validated and design optimization is performed; the method proceeds iteratively until the optimum design is obtained. This method, however, requires test data to be available in each local domain along the optimization path. In this paper, we refine our methodology by using polynomial regression to predict the size and shape of a local domain at some steps along the optimization path, without using test data.