Search Results

Training / Education

Exploration of Machine Learning and Neural Networks for ADAS and L4 Vehicle Perception

2024-07-18
Convolutional neural networks are the de facto method for processing camera, radar, and lidar data for perception in ADAS and L4 vehicles, yet their operation is a black box to many engineers. Unlike traditional rules-based approaches to coding intelligent systems, networks are trained, and the internal structure created during training is too complex for humans to interpret; in operation, however, networks can classify objects of interest at error rates lower than those achieved by humans viewing the same input data.
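The abstract names the convolution-plus-nonlinearity operation at the heart of these networks without showing it. As a minimal, framework-free sketch (real perception stacks use learned kernels in libraries such as PyTorch or TensorFlow; the edge-detection kernel and toy image below are illustrative assumptions):

```python
# Minimal 2D convolution + ReLU, the basic building block of a CNN.
# Pure-Python illustration with a hand-picked edge kernel; in a trained
# network the kernel weights are learned from data, not chosen by hand.

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation of a 2D image with a 2D kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + u][j + v] * kernel[u][v]
                for u in range(kh) for v in range(kw)
            )
    return out

def relu(feature_map):
    """Element-wise ReLU non-linearity."""
    return [[max(0.0, x) for x in row] for row in feature_map]

# A vertical-edge kernel applied to a tiny synthetic "image":
# the feature map responds only where the dark/bright edge sits.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
edge_kernel = [[-1, 1], [-1, 1]]
feature = relu(conv2d(image, edge_kernel))
```

Stacking many such layers, with weights set by training rather than by hand, is what makes the resulting internal structure opaque to inspection.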
Technical Paper

On-Center Steering Model for Realistic Steering Feel based on Real Measurement Data

2024-07-02
2024-01-2994
Driving simulators allow driving functions, vehicle models, and acceptance assessments to be tested at an early stage. For a realistic driving experience, all immersive cues must be reproduced as faithfully as possible. When driving manually, the perceived haptic steering-wheel torque plays a key role in conveying a realistic steering feel. To reproduce it, complex multi-body systems with numerous parameters that are difficult to identify are typically used. This study therefore presents a method for generating a realistic steering feel with a nonlinear open-loop model that contains only the significant parameters, particularly the friction of the steering gear, and that is suited to the on-center region where most driving takes place. Measurements from test benches and real test drives with an Electric Power Steering (EPS) system were used for the identification and validation of the model.
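The abstract does not give the model's equations. One common minimal form for an open-loop on-center torque model combines a linear torsion-bar spring term with smoothed Coulomb friction; the structure and all parameter values below are assumptions for illustration, not the paper's identified model:

```python
import math

def steering_wheel_torque(angle_rad, rate_rad_s,
                          k_tbar=2.0,   # torsion-bar stiffness [Nm/rad] (hypothetical)
                          t_fric=0.8,   # steering-gear Coulomb friction [Nm] (hypothetical)
                          v_eps=0.05):  # smoothing speed for the friction switch [rad/s]
    """Nonlinear open-loop torque sketch: linear spring term plus
    tanh-smoothed Coulomb friction, the effect that dominates the
    perceived steering feel in the on-center region."""
    spring = k_tbar * angle_rad
    friction = t_fric * math.tanh(rate_rad_s / v_eps)
    return spring + friction
```

The tanh smoothing avoids the discontinuity of ideal Coulomb friction at zero steering rate, which would otherwise cause chattering in a fixed-step simulator loop.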
Technical Paper

Enhancing Urban AEB Systems: Simulation-Based Analysis of Error Tolerance in Distance Estimation and Road-Tire Friction Coefficients

2024-07-02
2024-01-2992
Autonomous Emergency Braking (AEB) systems are critical in preventing collisions, yet their effectiveness hinges on accurately estimating the distance between the vehicle and other road users, as well as understanding road conditions. Errors in distance estimation can result in premature or delayed braking, and varying road conditions alter road-tire friction coefficients, affecting braking distances. Advancements in sensor technology and deep learning have improved vehicle perception and real-world understanding. The integration of advanced sensors like LiDARs has significantly enhanced distance estimation. Cameras and deep neural networks are also employed to estimate road conditions. However, AEB systems face notable challenges in urban environments, influenced by complex scenarios and adverse weather conditions such as rain and fog. Therefore, investigating the error tolerance of these estimations is essential for the performance of AEB systems.
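The coupling between friction coefficient and braking distance that the abstract refers to can be made concrete with the idealized point-mass stopping-distance formula d = v²/(2μg). This is a textbook simplification, not the paper's simulation model:

```python
G = 9.81  # gravitational acceleration [m/s^2]

def braking_distance(speed_m_s, mu):
    """Idealized stopping distance d = v^2 / (2*mu*g) under full braking
    on a surface with road-tire friction coefficient mu (point-mass
    model; ignores brake actuation delay and load transfer)."""
    return speed_m_s ** 2 / (2 * mu * G)

# At 50 km/h (~13.9 m/s), halving mu (dry asphalt -> wet surface)
# doubles the stopping distance, so a friction-estimation error feeds
# directly into when the AEB system must trigger.
v = 50 / 3.6
d_dry = braking_distance(v, mu=0.8)
d_wet = braking_distance(v, mu=0.4)
```

Even in this simplified model, the distance scales with 1/μ, which is why the error tolerance of the friction estimate matters as much as that of the distance estimate.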
Technical Paper

Set-up of an in-car system for investigating driving style on the basis of the 3D-method

2024-07-02
2024-01-3001
Investigating human driver behavior enhances the acceptance of autonomous driving and increases road safety in heterogeneous environments with both human-operated and autonomous vehicles. The previously established driver-fingerprint model focuses on classifying driving style from CAN bus signals. However, driving styles are inherently complex and influenced by multiple factors, including changing driving environments and driver states. To create a comprehensive driver profile, an in-car measurement system based on the Driver-Driven vehicle-Driving environment (3D) framework is developed. The measurement system records emotional and physiological signals from the driver, including the ECG signal and heart rate. A Raspberry Pi camera on the dashboard captures the driver's facial expressions, and a trained convolutional neural network (CNN) recognizes emotions. For unobtrusive ECG measurement, an ECG sensor is integrated into the steering wheel.
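One of the physiological signals mentioned, heart rate, is conventionally derived from the intervals between successive R-peaks in the ECG. A minimal sketch of that last step (R-peak detection itself, which the system would perform upstream, is omitted):

```python
def heart_rate_bpm(rr_intervals_s):
    """Mean heart rate in beats per minute from R-peak-to-R-peak
    (RR) intervals, in seconds, extracted from an ECG signal."""
    mean_rr = sum(rr_intervals_s) / len(rr_intervals_s)
    return 60.0 / mean_rr
```

For example, steady RR intervals of 0.8 s correspond to 75 bpm; in a driver-state system, short-term variability of these intervals is typically more informative than the mean alone.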