Search Results

Training / Education

ADAS Application Automatic Emergency Braking

2024-09-19
Active safety and advanced driver assistance systems (ADAS) are now being introduced to the marketplace, serving as key enablers for anticipated autonomous driving systems. ...Automatic emergency braking (AEB) is one ADAS application that is either in the marketplace already or under development, as nearly all automakers have pledged to offer this technology by the year 2022.
Training / Education

Exploration of Machine Learning and Neural Networks for ADAS and L4 Vehicle Perception

2024-07-18
Convolutional neural networks are the de facto method of processing camera, radar, and lidar data for perception in ADAS and L4 vehicles, yet their operation is a black box to many engineers. Unlike traditional rules-based approaches to coding intelligent systems, these networks are trained, and the internal structure created during the training process is too complex for humans to interpret. In operation, however, the networks classify objects of interest at error rates better than those achieved by humans viewing the same input data.
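As a toy illustration only (not any production perception stack or the course's material), the basic building block of such networks is a convolution followed by a nonlinearity. The sketch below applies a single hand-crafted edge-detection kernel and a ReLU to a synthetic single-channel "camera" patch; in a trained network, the kernel weights would instead be learned from data.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation of a single-channel image with a kernel."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            # elementwise multiply the kernel with the image patch and sum
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear unit: the standard CNN nonlinearity."""
    return np.maximum(x, 0.0)

# Synthetic 5x5 patch: dark left half, bright right half
image = np.zeros((5, 5))
image[:, 2:] = 1.0
kernel = np.array([[-1.0, 1.0]])        # simple horizontal-gradient filter
feature_map = relu(conv2d(image, kernel))
# The feature map responds only at the vertical edge between the halves
```

Real perception networks stack many such layers with learned kernels, pooling, and a classification head, but the per-layer arithmetic is exactly this pattern.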
Technical Paper

Radar-based Approach for Side-Slip Gradient Estimation

2024-07-02
2024-01-2976
In vehicle ego-motion estimation, vehicle control, and advanced driver assistance systems, the vehicle dynamics are described by a few key parameters. The side-slip gradient, one of them, is used to model the lateral behavior of the vehicle. This parameter is rarely known precisely, since it depends on the vehicle's mass distribution, its tires, and even the chassis setup. Thus, an online estimation of the side-slip gradient is beneficial, especially in series-production applications. Estimating the side-slip gradient with conventional vehicle sensors such as wheel-speed, steering, and inertial sensors poses a significant challenge, since considerable dynamic excitation of the vehicle is required, which is uncommon in normal driving. Here, radar sensors open new opportunities in the estimation of such vehicle dynamics parameters, since they allow for an instantaneous measurement of the lateral velocity.
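To make the idea concrete (this is a simplified sketch under assumed relations, not the paper's method): radar gives lateral velocity, from which the side-slip angle follows as beta = arctan(v_y / v_x); a side-slip gradient g can then be fitted from paired samples of lateral acceleration and side-slip angle, here via the assumed linear model beta = beta0 + g * a_y.

```python
import numpy as np

def side_slip_angle(v_y, v_x):
    """Side-slip angle (rad) from lateral and longitudinal velocity."""
    return np.arctan2(v_y, v_x)

def estimate_gradient(a_y, beta):
    """Least-squares fit of beta = beta0 + g * a_y; returns (g, beta0)."""
    g, beta0 = np.polyfit(a_y, beta, 1)
    return g, beta0

# Synthetic data with a known gradient of 0.004 rad per m/s^2 (illustrative)
rng = np.random.default_rng(0)
a_y = rng.uniform(-4.0, 4.0, 200)            # lateral acceleration samples
v_x = 20.0                                   # constant longitudinal speed, m/s
v_y = v_x * np.tan(0.004 * a_y)              # lateral velocity per the model
beta = side_slip_angle(v_y + rng.normal(0.0, 1e-4, 200), v_x)  # radar noise
g, _ = estimate_gradient(a_y, beta)
```

The point of the radar-based approach is that v_y is measured directly per sample, so no large dynamic excitation is needed to observe the fit quantities.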
Technical Paper

Set-up of an in-car system for investigating driving style on the basis of the 3D-method

2024-07-02
2024-01-3001
Investigating human driver behavior enhances the acceptance of autonomous driving and increases road safety in heterogeneous environments with human-operated and autonomous vehicles. The previously established driver fingerprint model focuses on classifying driving style based on CAN bus signals. However, driving styles are inherently complex and influenced by multiple factors, including changing driving environments and driver states. To create a comprehensive driver profile, an in-car measurement system based on the Driver-Driven vehicle-Driving environment (3D) framework is developed. The measurement system records emotional and physiological signals from the driver, including an ECG signal and heart rate. A Raspberry Pi camera on the dashboard captures the driver's facial expressions, and a trained convolutional neural network (CNN) recognizes emotions. To conduct unobtrusive ECG measurements, an ECG sensor is integrated into the steering wheel.
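As a minimal sketch of one physiological signal mentioned above (an assumption for illustration, not the paper's actual pipeline): heart rate can be derived from an ECG trace by detecting R-peaks and averaging the R-R intervals.

```python
import numpy as np

def heart_rate_bpm(ecg, fs, threshold=0.5):
    """Estimate heart rate (bpm) from R-peaks of a normalized ECG signal.

    Peaks are taken as rising edges through the threshold; the heart rate
    is 60 divided by the mean R-R interval in seconds.
    """
    above = ecg > threshold
    # rising edges mark R-peak onsets
    peaks = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    rr = np.diff(peaks) / fs            # R-R intervals in seconds
    return 60.0 / rr.mean()

# Synthetic pulse train: one R-peak every 0.8 s at 250 Hz -> 75 bpm
fs = 250
ecg = np.zeros(10 * fs)
ecg[::200] = 1.0
bpm = heart_rate_bpm(ecg, fs)
```

A real steering-wheel ECG is far noisier (contact artifacts, motion), so a production system would use a robust detector such as Pan-Tompkins rather than a fixed threshold.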
Training / Education

Introduction to Highly Automated Vehicles

2024-06-10
This course highlights the technologies enabling ADAS and how they integrate with existing passive occupant crash protection systems. It examines how ADAS functions perceive the world, make decisions, and either warn drivers or actively intervene in controlling the vehicle to avoid or mitigate crashes. ...Examples of current and future ADAS functions, the various sensors used in ADAS (including their operation and limitations), and sample algorithms will be discussed and demonstrated.
Training / Education

Sensors and Perception for Autonomous Vehicle Development

This 4-week, virtual-only experience, conducted by leading experts from the autonomous vehicle industry and academia, provides an in-depth look at the most common sensor types used in autonomous vehicle applications. By reviewing the theory, working through examples, viewing sensor data, and programming the movement of a TurtleBot, you will develop a solid, hands-on understanding of the common sensors and the data each provides. The course consists of asynchronous videos you work through at your own pace each week, followed by a live-online synchronous session each Friday. The videos are led by Dr.