This basic course introduces the intent of the DO-254 standard for commercial avionics hardware development. The content will cover many aspects of avionics hardware, including aircraft safety, systems, hardware planning, requirements, design, implementation, and testing. Participants will learn industry best practices for real-world hardware development, common DO-254 mistakes and how to prevent them, and how to minimize risk and cost while maximizing hardware quality.
On-board diagnosis of engine and transmission systems has been mandated by government regulation for light and medium vehicles since the 1996 model year. The regulations specify many of the detailed features that on-board diagnostics must exhibit, and the penalties for failing to meet the requirements, or for providing in-field remedies, can be very expensive. This course is designed to provide a fundamental understanding of how and why OBD systems function, along with the technical features a diagnostic system should have to ensure a compliant and successful implementation.
Photographs and video recordings of vehicle crashes and accident sites are more prevalent than ever, with dash-mounted cameras, surveillance footage, and personal cell phones now ubiquitous. The information contained in these pictures and videos is critical for understanding how crashes occurred and for analyzing physical evidence. This course teaches the theory and techniques for getting the most out of digital media, including correctly processing raw video and photographs, correcting for lens distortion, and using photogrammetric techniques to convert the information in digital media into usable, scaled, three-dimensional data.
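Lens distortion correction of the kind mentioned above is commonly modeled with a radial (Brown-Conrady) polynomial, inverted numerically per pixel. The following is a minimal sketch, assuming the simple two-coefficient radial model in normalized image coordinates; the function name and coefficient values are illustrative, not from the course material:

```python
def undistort_point(xd, yd, k1, k2, iterations=10):
    """Invert the radial distortion model x_d = x * (1 + k1*r^2 + k2*r^4)
    by fixed-point iteration, starting from the distorted coordinates.
    Coordinates are normalized (principal point at the origin)."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        # Re-estimate the undistorted point using the current radius guess.
        x = xd / factor
        y = yd / factor
    return x, y
```

In practice one would use calibrated camera intrinsics and a library routine (e.g., OpenCV's undistortion functions) rather than hand-rolled iteration; the sketch only illustrates the shape of the computation.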
Convolutional neural networks are the de facto method of processing camera, radar, and lidar data for perception in ADAS and L4 vehicles, yet their operation is a black box to many engineers. Unlike traditional rules-based approaches to coding intelligent systems, networks are trained, and the internal structure created during training is too complex for humans to understand. In operation, however, networks can classify objects of interest at error rates lower than those achieved by humans viewing the same input data.
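The basic building block of such networks is the discrete convolution (implemented as cross-correlation) of an image region with a kernel. A minimal pure-Python sketch of that single operation follows; in a real network the kernel weights are learned during training rather than chosen by hand:

```python
def conv2d(image, kernel):
    """'Valid' 2D cross-correlation of a single-channel image with a kernel --
    the core operation a convolutional layer applies at every spatial
    position. `image` and `kernel` are lists of lists of floats."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            # Sum of elementwise products over the kernel window.
            acc = 0.0
            for a in range(kh):
                for b in range(kw):
                    acc += image[i + a][j + b] * kernel[a][b]
            row.append(acc)
        out.append(row)
    return out
```

Deep learning frameworks perform exactly this operation, vectorized across many channels and kernels per layer, which is where the "too complex to inspect" internal structure accumulates.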
In the evolving landscape of automated driving systems, the critical role of vehicle localization within the autonomous driving stack is increasingly evident. Traditional reliance on Global Navigation Satellite Systems (GNSS) proves inadequate, especially in urban areas where signal obstruction and multipath effects degrade accuracy. Addressing this challenge, this paper details the enhancement of a localization system for autonomous public transport vehicles, focusing on mitigating GNSS errors through the integration of a LiDAR sensor. The approach involves creating a 3D map using the factor graph-based LIO-SAM algorithm from GNSS, vehicle odometry, IMU, and LiDAR data. The algorithm is adapted to the use case by adding a velocity factor and altitude data from a Digital Terrain Model. Based on this map, a state estimator is proposed that combines high-frequency LiDAR odometry based on FAST-LIO with low-frequency absolute multiscale ICP-based LiDAR position estimation.
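The general pattern of fusing high-frequency odometry with low-frequency absolute position fixes can be illustrated, in one dimension and greatly simplified relative to the factor-graph formulation described in the abstract, as a Kalman-style predict/correct loop. All parameter values and names below are illustrative assumptions, not from the paper:

```python
def fuse(odometry_deltas, absolute_fixes, x0=0.0, p0=1.0, q=0.01, r=0.25):
    """1D illustration of odometry/absolute-fix fusion.
    odometry_deltas: high-rate position increments (may drift).
    absolute_fixes: {step_index: measured absolute position} at low rate.
    q, r: assumed process and measurement noise variances."""
    x, p = x0, p0
    trajectory = []
    for k, dx in enumerate(odometry_deltas):
        # Prediction: apply the odometry increment; uncertainty grows.
        x += dx
        p += q
        # Correction: blend in an absolute fix when one is available,
        # weighted by the accumulated uncertainty.
        z = absolute_fixes.get(k)
        if z is not None:
            gain = p / (p + r)
            x += gain * (z - x)
            p *= (1.0 - gain)
        trajectory.append(x)
    return trajectory
```

The actual system estimates a full 6-DoF pose and uses ICP registration against the prebuilt map as its absolute measurement; the sketch only shows why the low-rate absolute channel bounds the drift of the high-rate odometry channel.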
Autonomous driving is being deployed in various settings, including indoor areas such as industrial halls. LIDAR sensors are currently popular due to their superior spatial resolution and accuracy compared to RADAR, as well as their robustness to varying lighting conditions compared to cameras; they enable precise, real-time perception of the surrounding environment. Several datasets for on-road scenarios, such as KITTI or Waymo, are publicly available. However, there is a notable lack of open-source datasets specifically designed for industrial hall scenarios, particularly for 3D LIDAR data. Furthermore, in industrial areas where vehicle platforms with omnidirectional drive are often used, 360° FOV LIDAR sensors are necessary to monitor all critical objects. Although high-resolution sensors would be optimal, the price of mechanical 360° FOV LIDAR sensors rises significantly with increasing resolution.
This paper proposes a novel approach to the design of a Hardware Abstraction Layer (HAL) specifically tailored to embedded systems, placing significant emphasis on time-controlled hardware access. The concept and use of a HAL are widespread in industrial projects, serving as a well-established method in embedded systems development. HALs enhance application software portability, simplify the use of the underlying hardware by abstracting its inherent complexity, and reduce overall development costs through software reuse. Beyond these established advantages, this paper introduces a conceptual framework that addresses critical challenges related to debugging and mitigates input-related problems often encountered in embedded systems. This is particularly pertinent in the automotive context, where the intricate operational environment of embedded systems demands robust solutions. The HAL design presented in this paper mitigates these issues.
In electrified vehicles, auxiliary units can be a dominant noise source; one of these is the refrigerant scroll compressor. Compared to vehicles with combustion engines, e-vehicles require larger refrigerant compressors, since in addition to the interior, the battery and the electric motors must also be cooled. Scroll compressors, which are currently widespread in the automotive industry, generate one pressure pulse per revolution due to their discontinuous compression principle. This results in speed-dependent pressure fluctuations as well as higher-harmonic pulsations arising from reflections. These fluctuations propagate through the refrigeration cycle and excite vibrations in refrigerant lines and heat exchangers. The sound transmission path through the air-conditioning heat exchanger integrated in the dashboard is particularly critical. Various silencer configurations can be used to dampen these pulsations.
In pursuit of safety validation of automated driving functions, efforts are being made to accompany real-world test drives with test drives in virtual environments. To transfer highly automated driving functions into a simulation, models of the vehicle's perception sensors such as lidar, radar, and camera are required. In addition to classic pulsed time-of-flight (ToF) lidars, the growing availability of commercial frequency-modulated continuous-wave (FMCW) lidars sparks interest in the field of environment perception. This is due to advanced capabilities such as directly measuring a target's relative radial velocity via the Doppler effect. In this work, an FMCW lidar sensor simulation model is introduced, divided into the components of signal propagation and signal processing. The signal propagation is modeled by a ray tracing approach simulating the interaction of light waves with the environment.
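The Doppler capability mentioned above rests on the standard triangular-chirp FMCW relations: the up-chirp and down-chirp beat frequencies each contain a range-proportional term and a Doppler term, so averaging and differencing them separates range from radial velocity. The following is a hedged sketch, assuming the convention that an approaching target lowers the up-chirp beat and raises the down-chirp beat (sign conventions vary by implementation); the chirp parameters in the usage example are illustrative, not taken from the paper's sensor:

```python
C = 299_792_458.0  # speed of light, m/s

def range_and_velocity(f_beat_up, f_beat_down, chirp_slope, wavelength):
    """Recover target range and radial velocity from the two beat
    frequencies of a triangular FMCW chirp.
    chirp_slope: frequency sweep rate in Hz/s; wavelength in meters."""
    f_range = 0.5 * (f_beat_up + f_beat_down)    # range-induced component
    f_doppler = 0.5 * (f_beat_down - f_beat_up)  # Doppler component = 2*v/lambda
    rng = f_range * C / (2.0 * chirp_slope)      # two-way path delay -> range
    vel = f_doppler * wavelength / 2.0           # positive = approaching
    return rng, vel
```

For instance, with an assumed 1550 nm carrier and a 1e14 Hz/s chirp slope, a target at 50 m approaching at 10 m/s produces beat frequencies on the order of tens of MHz, from which the function recovers both quantities; a pulsed ToF lidar would need at least two measurements to infer the velocity.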