
Raw Data Injection and Failures of Camera, Radar, and Lidar for Highly Automated Systems (2019-01-1378)

Introduction

This paper explores how to enhance autonomous vehicle (AV) testing capabilities and quality assurance using a fully automated hardware-in-the-loop (HIL) test environment that interfaces with or simulates autonomous sensor technology, such as cameras, RADAR, and LIDAR, along with other key technologies such as GNSS/maps and V2X communication. The key to performing such real-time testing is the ability to stimulate the various ECUs/sensors through closed-loop simulation of the vehicle, its environment, traffic, and surroundings, together with playback of captured sensor data synchronized with key vehicle bus and application data. The latest technologies are introduced that allow direct sensor data injection into ECUs/FRUs for test interaction and stimulus, as well as dynamic, on-the-fly modification of sensor data streams. It is shown how these techniques are integrated with current HIL systems. The paper also addresses new technologies and effective techniques for optimizing test processes and resources, along with application examples of specific use cases for the development of aero-applications using simulated scenarios.

Automated Testing and Validation of Highly Automated Systems

Validating these highly complex systems is a major challenge. Testing depends on environmental situations and scenarios, and the infrastructure and resources required can be expensive, time-consuming, and rarely sufficient to cover the majority of test scenarios in a regressive manner. Hardware-in-the-loop (HIL) systems are the backbone of the current embedded-system validation process, but even with these efforts, limited resource availability and project time constraints leave many engineers looking for more efficient alternatives. With new testing technologies, a single scenario (captured or generated) can be run through the HIL system with differential distortions or faults that improve test coverage (e.g., varying levels of pixel errors, lens shading, and color gain for a video stream). Key test scenarios can also be analyzed by AI-based algorithms that allow multiple permutations of a single test scenario to be executed. These approaches provide an exponential increase in the set of key scenarios that can be used during validation of the system. New tools are also shown that allow recorded data to be leveraged in these advanced HIL test systems, enabling real-time replay of camera, RADAR, and LIDAR data alongside traditional embedded control interfaces.

To achieve validation of autonomous control systems, it is critical to utilize state-of-the-art model-based development (MBD) techniques, such as virtual validation, to obtain the full bandwidth required for verification and validation (V&V). This approach works well in the framework of model-in-the-loop (MIL), software-in-the-loop (SIL), and HIL simulation to establish a testing process geared to provide greater coverage for V&V. The technologies discussed here also bring the benefit of full test automation, giving further possibilities to extend V&V capabilities.
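The paper does not give an implementation of the distortion injection it describes, but the idea of running one captured scenario through varying levels of pixel errors, lens shading, and color gain can be sketched as follows. This is an illustrative NumPy sketch only; the function name, parameters, and fault models are hypothetical and not taken from the paper or any specific HIL toolchain.

```python
import numpy as np

def inject_video_faults(frame, pixel_error_rate=0.0, lens_shading=0.0,
                        color_gain=(1.0, 1.0, 1.0), seed=None):
    """Return a copy of an RGB frame (H x W x 3, uint8) with injected faults.

    pixel_error_rate -- fraction of pixels replaced by random values
    lens_shading     -- 0..1 strength of radial darkening toward the corners
    color_gain       -- per-channel multiplicative gain (R, G, B)
    """
    rng = np.random.default_rng(seed)
    out = frame.astype(np.float32)

    # Per-channel color gain (e.g. simulated white-balance drift).
    out *= np.asarray(color_gain, dtype=np.float32)

    # Radial lens shading: darken pixels by distance from the optical center.
    h, w = frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2) / np.hypot(h / 2, w / 2)
    out *= (1.0 - lens_shading * r ** 2)[..., None]

    # Random pixel errors: replace a fraction of pixels with random values.
    if pixel_error_rate > 0:
        mask = rng.random((h, w)) < pixel_error_rate
        out[mask] = rng.integers(0, 256, size=(int(mask.sum()), 3))

    return np.clip(out, 0, 255).astype(np.uint8)
```

Sweeping these parameters over a single recorded drive yields many distorted variants of one scenario, which is the coverage-multiplication effect the abstract describes.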
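The abstract also stresses synchronizing replayed sensor data with key vehicle bus and application data. One common building block for that is nearest-timestamp pairing between a sensor stream and a bus log. The following is a minimal sketch of that idea, assuming sorted timestamp lists; the function name and tolerance parameter are illustrative, not from the paper.

```python
import bisect

def align_to_bus(sensor_timestamps, bus_timestamps, tolerance):
    """Pair each sensor sample with the nearest bus-message timestamp.

    Both inputs are sorted lists of times in seconds. Returns a list of
    (sensor_ts, bus_ts) pairs, with bus_ts set to None when no bus
    message falls within `tolerance` seconds of the sensor sample.
    """
    pairs = []
    for ts in sensor_timestamps:
        i = bisect.bisect_left(bus_timestamps, ts)
        # The nearest bus timestamp is either just before or just after ts.
        candidates = [bus_timestamps[j] for j in (i - 1, i)
                      if 0 <= j < len(bus_timestamps)]
        best = min(candidates, key=lambda b: abs(b - ts), default=None)
        if best is not None and abs(best - ts) <= tolerance:
            pairs.append((ts, best))
        else:
            pairs.append((ts, None))
    return pairs
```

In a real replay system the same alignment would typically be done per message ID against hardware-timestamped logs; this sketch only shows the pairing logic.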



