Search Results

Journal Article

Monte Carlo Techniques for Correlated Variables in Crash Reconstruction

2009-04-20
2009-01-0104
The results of a traffic crash reconstruction often include vehicle speeds to address causation and changes in velocity to indicate crash severity. Since these results are related, they should be modeled in a probabilistic context as a joint distribution. Current techniques in the traffic crash reconstruction literature assume that the input parameters and results of an analysis are independent, which may or may not be appropriate. Therefore, a discussion of uncertainty propagation techniques with correlation and Monte Carlo simulation of correlated variables is presented in this paper. The idea that measuring a parameter with a common instrument induces correlation is explored by examining the process of determining vehicle weights. Also, an example of determining the energy from crush is presented, since the A and B stiffness coefficients are correlated. Results show that the difference between accounting for correlation and assuming independence can be significant.
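Monte Carlo simulation of correlated variables, as described in this abstract, is commonly done by transforming independent samples through a Cholesky factor of the covariance matrix. A minimal sketch, using hypothetical means, standard deviations, and a correlation coefficient for two stiffness-like inputs (none of these values come from the paper):

```python
import numpy as np

# Sketch: jointly sampling two correlated inputs (hypothetical A and B
# stiffness-like coefficients) for a Monte Carlo reconstruction analysis.
rng = np.random.default_rng(0)

mean = np.array([100.0, 50.0])   # hypothetical means of A and B
std = np.array([10.0, 5.0])      # hypothetical standard deviations
rho = 0.8                        # assumed correlation coefficient

# Covariance matrix and its Cholesky factor
cov = np.array([[std[0] ** 2,            rho * std[0] * std[1]],
                [rho * std[0] * std[1],  std[1] ** 2]])
L = np.linalg.cholesky(cov)

# Transform independent standard normals into correlated samples
z = rng.standard_normal((100_000, 2))
samples = mean + z @ L.T

# The sample correlation should be close to the target rho
print(np.corrcoef(samples[:, 0], samples[:, 1])[0, 1])
```

Feeding these joint samples through a deterministic crush-energy model, rather than sampling each input independently, is what preserves the correlation in the output distribution.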
Journal Article

Sensitivity of Monte Carlo Modeling in Crash Reconstruction

2010-04-12
2010-01-0071
The Monte Carlo method is a well-known technique for propagating uncertainty in complex systems and has been applied to traffic crash reconstruction analysis. The Monte Carlo method is a probabilistic technique that randomly samples input distributions and then combines these samples according to a deterministic model. However, describing every input variable as a distribution requires knowledge of the distribution, which may or may not be available, and the time and expense of determining the distribution parameters may be prohibitive. Therefore, the most influential parameters from the input data, such as mean values, standard deviations, shape parameters, and correlation coefficients, can be determined using an analytical sensitivity calculation based on the score function.
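The score-function idea mentioned here lets the sensitivity of a Monte Carlo estimate to a distribution parameter be computed from the same set of samples, since d/dmu E[g(X)] = E[g(X) * score(X; mu)]. A toy sketch under assumed values (the skid-to-speed model and all numbers are illustrative, not from the paper):

```python
import numpy as np

# Sketch: score-function sensitivity of a Monte Carlo mean with respect to
# the mean of a normal input. Toy model: speed from a hypothetical skid
# distance x, g(x) = sqrt(2 * f * 9.81 * x) with assumed friction f = 0.7.
rng = np.random.default_rng(1)

mu, sigma = 30.0, 3.0            # hypothetical skid-distance mean/std (m)
n = 200_000
x = rng.normal(mu, sigma, n)
g = np.sqrt(2 * 0.7 * 9.81 * x)  # deterministic model: speed in m/s

# Score of the normal density with respect to mu: (x - mu) / sigma^2.
# Sensitivity d/dmu E[g(X)] estimated from the existing sample:
score_mu = (x - mu) / sigma ** 2
sensitivity = np.mean(g * score_mu)

# Finite-difference check, which requires two extra simulation runs
h = 0.5
g_plus = np.sqrt(2 * 0.7 * 9.81 * rng.normal(mu + h, sigma, n))
g_minus = np.sqrt(2 * 0.7 * 9.81 * rng.normal(mu - h, sigma, n))
fd = (g_plus.mean() - g_minus.mean()) / (2 * h)
print(sensitivity, fd)
```

The advantage the abstract points to is that the score-function estimate reuses one simulation run, whereas finite differences require re-running the model for each parameter perturbation.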
Journal Article

On the Digital Forensics of Heavy Truck Electronic Control Modules

2014-04-01
2014-01-0495
Concepts of forensic soundness as they are currently understood in the field of digital forensics are related to the digital data on heavy vehicle electronic control modules (ECMs). An assessment for forensic soundness addresses: 1) the integrity of the data, 2) the meaning of the data, 3) the processes for detecting or predicting errors, 4) transparency of the operation, and 5) the expertise of the practitioners. The integrity of the data can be verified using cryptographic hash functions. Interpreting and understanding the meaning of the data is based on standards or manufacturer software. Comparison of interpreted ECM data to external reference measurements is reviewed from the current literature. Meaning is also extracted from interpreting hexadecimal data based on the J1939 and J1587 standards. Error detection and mitigation strategies are discussed in the form of sensor simulators to eliminate artificial fault codes.
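Verifying data integrity with a cryptographic hash, as this abstract describes, amounts to recording a digest when the ECM data is first imaged and recomputing it at every later comparison. A minimal sketch with made-up bytes standing in for an ECM download:

```python
import hashlib

# Sketch: integrity verification of an ECM data image via SHA-256.
# The bytes below are placeholders for a real ECM download.
ecm_image = b"\x01\x02\x03\x04 example ECM download bytes"

# Digest recorded at the time the data was first extracted
original_digest = hashlib.sha256(ecm_image).hexdigest()

def verify(data: bytes, expected_hex: str) -> bool:
    """Return True if data hashes to the expected SHA-256 digest."""
    return hashlib.sha256(data).hexdigest() == expected_hex

# Later verification: unchanged data matches, any alteration does not
print(verify(ecm_image, original_digest))            # True
print(verify(ecm_image + b"\x00", original_digest))  # False
```

Because any single-bit change produces a completely different digest, a matching hash demonstrates that the data examined later is bit-for-bit identical to what was originally extracted.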