The advancement of Advanced Driver Assistance System (ADAS) technologies offers tremendous benefits. ADAS features such as emergency braking, blind-spot monitoring, lane departure warning, and adaptive cruise control promise to lower the rate and severity of on-road accidents. As the automotive industry works toward higher levels of autonomy, maintaining ADAS sensor performance and reliability is central to ensuring adequate ADAS functionality. Current challenges for ADAS sensors include performance degradation in adverse weather conditions and a lack of controlled evaluation methods. Outdoor testing suffers from repeatability issues, while indoor testing with a stationary vehicle lacks realistic conditions. This study proposes a hybrid method that combines the advantages of both outdoor and indoor testing approaches in a Drive-thru Climate Tunnel (DCT).
Autonomous Driving (AD) and Advanced Driver Assistance Systems (ADAS) are being actively developed to prevent traffic accidents. As the complexity of AD/ADAS increases, the number of test scenarios increases as well. An efficient development process that meets AD/ADAS quality and performance specifications is thus required. The European New Car Assessment Programme (Euro NCAP®) and the Japan Automobile Manufacturers Association (JAMA®) have both defined test scenarios, but some of these scenarios are difficult to carry out with real-vehicle testing due to the risk of harm to human participants. Due to the challenge of covering various scenarios and situations with only real-vehicle testing, we utilize simulation-based testing in this work. Specifically, we construct a Model-in-the-Loop Simulation (MILS) environment for virtual testing of AD/ADAS control logic.
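As a minimal sketch of what a MILS loop looks like in practice, the fragment below steps a point-mass ego-vehicle plant and a toy AEB-style control logic together against a stationary obstacle and reports whether the scenario ends without a collision. The plant, braking rule, and parameter values are illustrative assumptions, not the MILS environment described above.

```python
# Minimal MILS sketch: a simple plant model and the control logic under test are
# stepped together in one loop. All names and parameters are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class EgoState:
    position: float = 0.0   # m
    speed: float = 20.0     # m/s

def plant_step(state: EgoState, accel_cmd: float, dt: float) -> EgoState:
    """Point-mass longitudinal plant model."""
    speed = max(0.0, state.speed + accel_cmd * dt)
    return EgoState(position=state.position + speed * dt, speed=speed)

def aeb_logic(gap: float, speed: float) -> float:
    """Toy control logic under test: brake hard if time-to-collision is short."""
    ttc = gap / speed if speed > 0.1 else float("inf")
    return -6.0 if ttc < 2.0 else 0.0

def run_scenario(obstacle_position: float = 80.0, dt: float = 0.01, t_end: float = 10.0):
    state, log = EgoState(), []
    for step in range(int(t_end / dt)):
        gap = obstacle_position - state.position
        if gap <= 0.0:                      # collision -> scenario fails
            return False, log
        accel = aeb_logic(gap, state.speed)
        state = plant_step(state, accel, dt)
        log.append((step * dt, gap, state.speed))
    return True, log                         # no collision over the horizon -> pass

if __name__ == "__main__":
    passed, trace = run_scenario()
    print("scenario passed:", passed, "| final gap: %.1f m" % trace[-1][1])
```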
Testing vision-based advanced driver assistance systems (ADAS) in a Camera-in-the-Loop (CiL) bench setup, where external visual inputs are used to stimulate the system, provides an opportunity to experiment with a wide variety of test scenarios, different types of vehicle actors, vulnerable road users, and weather conditions that may be difficult to replicate in the real world. In addition, once the CiL bench is set up and operating, experiments can be performed in less time when compared to track testing alternatives. In order to better quantify normal operating zones, track testing results were used to identify behavior corridors via a statistical methodology. After determining normal operational variability via track testing of baseline stationary surrogate vehicle and pedestrian scenarios, these operating zones were applied to screen-based testing in a CiL test setup to determine particularly challenging scenarios which might benefit from replication in a track testing environment.
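To make the corridor idea concrete, here is a minimal sketch of deriving a behavior corridor from repeated track-test runs and using it to screen a CiL run. It assumes the corridor is simply the per-sample mean plus or minus two standard deviations; the synthetic data and the 2-sigma band are illustrative assumptions, not the statistical methodology used in the study.

```python
# Corridor sketch: mean +/- 2*std over repeated runs, then flag CiL samples outside it.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 81)                       # s
# Synthetic stand-in for N repeated track runs of, e.g., ego speed during an AEB test.
runs = np.array([25.0 - 6.0 * np.clip(t - 1.5, 0, None) + rng.normal(0, 0.3, t.size)
                 for _ in range(12)])

mean = runs.mean(axis=0)
std = runs.std(axis=0, ddof=1)
lower, upper = mean - 2.0 * std, mean + 2.0 * std   # corridor bounds

# Screening a new CiL run: count samples that fall outside the corridor.
cil_run = 25.0 - 6.5 * np.clip(t - 1.4, 0, None)
outside = (cil_run < lower) | (cil_run > upper)
print("samples outside corridor: %d of %d" % (outside.sum(), t.size))
```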
The use of platforms to carry vulnerable road user (VRU) targets has become increasingly necessary with the rise of advanced driver assistance systems (ADAS) on vehicles. These ADAS features must be tested in a wide variety of collision-imminent scenarios, which necessitates the use of strikable targets carried by an overrunable platform. To enable the testing of ADAS sensors such as lidar, radar, and vision systems, S-E-A, a longtime supplier of vehicle testing equipment, has created the STRIDE robotic platform (Small Test Robot for Individuals in Dangerous Environments). This platform contains many of the key ingredients of other platforms on the market, such as a hot-swappable battery, E-stop, and mounting points for targets. However, the STRIDE platform additionally provides features that enable non-routine testing, such as turning in place, driving with an app on a mobile phone, user scripting, and steep-grade climbing capability.
This study aimed to construct driver models for overtaking behavior using long short-term memory (LSTM) networks. During an overtaking maneuver, the ego vehicle changes lanes into the overtaking lane while paying attention to both the preceding vehicle in the travel lane and the following vehicle in the overtaking lane, and returns to the travel lane after passing the preceding vehicle. In this study, the scenario was divided into four phases: Car-Following, Lane-Change-1, Overtaking, and Lane-Change-2. In the Car-Following phase, the ego vehicle follows the preceding vehicle in the travel lane. In the Lane-Change-1 phase, the ego vehicle moves from the travel lane to the overtaking lane. Overtaking is the phase in which the ego vehicle, now in the overtaking lane, passes the preceding vehicle in the travel lane.
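The following is a minimal sketch of an LSTM driver model of this kind, assuming the input at each time step is a small feature vector (for example, relative distance and speed to the preceding and following vehicles) and the output is a driving command such as steering angle and acceleration. The dimensions, signals, and framework choice (PyTorch) are assumptions for illustration, not the model built in the study.

```python
# Hypothetical LSTM driver model: sequences of surrounding-traffic features in,
# a driving command out. Feature and output definitions are assumed, not the paper's.
import torch
import torch.nn as nn

class OvertakingDriverLSTM(nn.Module):
    def __init__(self, n_features: int = 6, hidden: int = 64, n_outputs: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_outputs)   # e.g., steering angle, acceleration

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features) -> command predicted from the last hidden state
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])

model = OvertakingDriverLSTM()
dummy = torch.randn(8, 50, 6)          # 8 sequences, 50 time steps, 6 features
print(model(dummy).shape)              # torch.Size([8, 2])
```

In practice one such model could be trained per phase (Car-Following, Lane-Change-1, Overtaking, Lane-Change-2), or a single model could additionally predict the active phase.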
Advanced driver assistance systems rely on external sensors distributed around the vehicle. The reliability of such systems can be compromised by adverse weather, with performance hindered both by direct impingement on the sensors and by spray suspended between the vehicle and potential obstacles. The transport of road spray is known to be an unsteady phenomenon, driven by the turbulent structures that characterise automotive flow fields. Further understanding of this unsteadiness is a key aspect in the development of robust sensor implementations. This paper outlines an experimental method used to analyse the spray ejected by an automotive body, presented through a study of a simplified vehicle model with interchangeable rear-end geometries. Particles are illuminated by laser light sheets as they pass through measurement planes downstream of the vehicle, facilitating imaging of the instantaneous structure of the spray.
An accurate tire pressure monitoring system (TPMS) is of great practical importance, and the reliability and safety of its power supply module are a major concern. The piezoelectric surface acoustic wave (SAW) sensor is considered to have great potential in this field because it is passive, wireless, and small. This paper presents the application of passive, wireless SAW sensors for real-time tire condition monitoring. The pressure-sensitive structure is optimized, and a three-resonator structure is designed to sense temperature and pressure simultaneously. Furthermore, a fast detection system is developed to realize high-speed signal acquisition. Finally, experiments are carried out to characterize the temperature and pressure response of the SAW sensor.
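As an illustration of how a multi-resonator readout can be decoupled into temperature and pressure, the sketch below solves a linear calibration system relating resonator frequency shifts to both quantities. The linear model, the coefficient values, and the role assigned to a reference resonator are assumptions for illustration only, not the sensor characterization reported in the paper.

```python
# Assumed linear model: df_i = k_iT*dT + k_iP*dP for two measurement resonators
# (a third resonator could serve as an unloaded reference to cancel common-mode drift).
import numpy as np

K = np.array([[-2.1e3,  1.5e2],     # Hz/K, Hz/kPa for resonator 1 (made-up values)
              [-2.0e3, -9.0e1]])    # Hz/K, Hz/kPa for resonator 2 (made-up values)

def temperature_pressure(df1_hz: float, df2_hz: float):
    """Invert the calibration matrix to recover temperature and pressure changes."""
    dT, dP = np.linalg.solve(K, np.array([df1_hz, df2_hz]))
    return dT, dP   # K above reference, kPa above reference

print(temperature_pressure(-40e3, -45e3))
```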
Image segmentation has historically been a technique for analyzing terrain for military autonomous vehicles. One weakness of image segmentation from camera data is that it lacks depth information and can be affected by environmental lighting. Light detection and ranging (LiDAR) is an emerging technology in image segmentation that can estimate distances to the objects it detects. One advantage of LiDAR is the ability to gather accurate distances regardless of day, night, shadows, or glare. This study examines the fusion of LiDAR and camera image segmentation to improve an advanced driver-assistance system (ADAS) algorithm for off-road autonomous military vehicles. The volume of points generated by LiDAR provides the vehicle with distance and spatial data about its surroundings.
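A common way to realize this kind of fusion is to project LiDAR points into the camera image and attach to each point the segmentation class of the pixel it lands on, so every labeled pixel gains a measured range. The sketch below shows that projection step; the intrinsics, extrinsics, and synthetic data are illustrative assumptions, not the algorithm developed in the study.

```python
# LiDAR/camera fusion sketch: pinhole projection of LiDAR points onto a segmentation mask.
import numpy as np

K = np.array([[700.0, 0.0, 320.0],       # assumed camera intrinsics
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
T_cam_lidar = np.eye(4)                  # assumed extrinsic calibration (identity here)

def fuse(points_lidar: np.ndarray, seg_mask: np.ndarray):
    """points_lidar: (N,3) in LiDAR frame; seg_mask: (H,W) integer class labels."""
    pts_h = np.c_[points_lidar, np.ones(len(points_lidar))]
    pts_cam = (T_cam_lidar @ pts_h.T)[:3]           # transform into camera frame
    in_front = pts_cam[2] > 0.1
    uvw = K @ pts_cam[:, in_front]
    u, v = (uvw[:2] / uvw[2]).round().astype(int)
    h, w = seg_mask.shape
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    ranges = np.linalg.norm(points_lidar[in_front][valid], axis=1)
    labels = seg_mask[v[valid], u[valid]]
    return labels, ranges                           # class and distance per point

labels, ranges = fuse(np.random.uniform(-5, 25, (1000, 3)), np.zeros((480, 640), int))
print(len(labels), "points fused")
```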
Platooning vehicles present novel pathways to saving fuel during transportation. With the rise of autonomous solutions, platooning is an increasingly apparent application for this new technology. Platooning vehicles travel together with the intent of reducing aerodynamic resistance during operation. Drafting allows following vehicles to increase fuel economy and save money on refueling, whether at the pump or at a charging station. However, autonomous solutions are still in their infancy, and controller evaluation is an exciting challenge for researchers. This work brings forth a new application of an emissions quantification metric called vehicle-specific power (VSP). Rather than using VSP to investigate emissions, the present work applies it to heterogeneous Class 8 heavy-duty truck platoons as a means of evaluating the efficacy of Cooperative Adaptive Cruise Control (CACC).
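For reference, VSP in its generic physics-based form is the per-mass power demand VSP = v(a(1 + eps) + g*grade + g*Cr) + 0.5*rho*Cd*A*v^3/m, typically reported in kW per metric ton. The sketch below evaluates this expression; the coefficient values are illustrative placeholders for a Class 8 tractor-trailer, not the parameters used in the paper.

```python
# Generic VSP calculation (W/kg, numerically equal to kW per metric ton).
def vehicle_specific_power(v, a, grade=0.0, mass=36000.0,
                           eps=0.1, Cr=0.0065, Cd=0.6, A=10.0, rho=1.207, g=9.81):
    """v in m/s, a in m/s^2, grade as rise/run; coefficient values are assumptions."""
    inertia = a * (1.0 + eps)                     # eps: rotating-mass factor (assumed)
    rolling_and_grade = g * (grade + Cr)
    aero = 0.5 * rho * Cd * A * v**3 / mass
    return v * (inertia + rolling_and_grade) + aero

# Example: a following truck cruising at about 65 mph (29 m/s) on level ground.
print("VSP = %.2f kW/ton" % vehicle_specific_power(v=29.0, a=0.0))
```

Comparing VSP traces of the lead and following trucks over a drive cycle then gives a direct measure of how much power demand the CACC-controlled followers avoid.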
Improving vehicle path-tracking performance not only affects driving safety and comfort but is also essential for autonomous driving technology. The current research focuses on path-tracking control and its application to a dual-motor steer-by-wire (SBW) system. A preview driver model is developed that considers both lateral and yaw tracking. Model predictive control (MPC) and linear quadratic regulator (LQR) path-following controllers are developed to compare tracking control performance. A dual-motor SBW system is designed with a permanent magnet synchronous motor (PMSM) control scheme. Finally, the proposed control methods are verified in different driving cases. The simulations show that the system can effectively achieve small tracking errors and can be applied in future autonomous driving or advanced driver assistance systems to keep lateral and yaw errors within a safe range during path tracking.
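As a minimal sketch of the LQR side of such a comparison, the fragment below computes an LQR steering gain for a kinematic lateral/heading error model at a fixed speed. The error model, discretization, weights, and parameters are illustrative assumptions, not the dual-motor SBW controller design of the paper.

```python
# LQR path-following sketch: states [lateral error, heading error], input = steering angle.
import numpy as np
from scipy.linalg import solve_discrete_are

v, L, dt = 15.0, 2.8, 0.01         # speed (m/s), wheelbase (m), sample time (s) (assumed)

# Continuous error dynamics (small angles): e_dot = v*psi_e, psi_e_dot = (v/L)*delta
A = np.array([[0.0, v],
              [0.0, 0.0]])
B = np.array([[0.0],
              [v / L]])
Ad = np.eye(2) + A * dt             # forward-Euler discretization
Bd = B * dt

Q = np.diag([1.0, 0.5])             # penalize lateral and heading error (assumed weights)
R = np.array([[10.0]])              # penalize steering effort

P = solve_discrete_are(Ad, Bd, Q, R)
K = np.linalg.solve(R + Bd.T @ P @ Bd, Bd.T @ P @ Ad)   # u = -K @ [e, psi_e]
print("LQR gain:", K)
```

An MPC counterpart would minimize the same weighted errors over a finite preview horizon subject to steering-rate and actuator constraints, which is what makes the two approaches worth comparing.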
ADAS (Advanced Driver Assistance System) functions can help the driver avoid accidents or mitigate their effects when they occur, and are precursors to full autonomous driving (SAE Level 4 and above). The main goal of this work is to develop a model-based system to actuate the Evasive Maneuver Assist (EMA) function. A typical scenario is one in which longitudinal Autonomous Emergency Braking (AEB) comes too late and the driver has to perform an evasive maneuver to avoid an object suddenly appearing on the road ahead. In this situation, EMA helps improve the driver's steering and braking operation in a coordinated way, enhancing vehicle maneuverability and response when the driver is facing a collision. The function additionally steers the vehicle along a predetermined optimized trajectory based on a yaw-rate set point and stabilizes the vehicle. The EMA function is introduced together with an analysis of benchmarking data.
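To illustrate what a yaw-rate set point of this kind can look like, the sketch below computes the steady-state bicycle-model yaw rate for the driver's steering input and caps it at the friction-limited value mu*g/v. The bicycle-model form, the capping rule, and all parameters are illustrative assumptions, not the EMA design described above.

```python
# Hypothetical yaw-rate set point: bicycle-model steady state, limited by tire friction.
import math

def yaw_rate_setpoint(speed: float, steer_angle: float,
                      wheelbase: float = 2.8, understeer_K: float = 0.0025,
                      mu: float = 0.9, g: float = 9.81) -> float:
    """speed in m/s, steer_angle (road-wheel) in rad; returns yaw-rate target in rad/s."""
    r_ss = speed * steer_angle / (wheelbase + understeer_K * speed**2)  # bicycle model
    r_max = mu * g / max(speed, 0.1)                                    # friction limit
    return math.copysign(min(abs(r_ss), r_max), r_ss)

print(yaw_rate_setpoint(speed=25.0, steer_angle=0.08))
```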
For a cooperative adaptive cruise control (CACC) system, this paper adopts a robust following control algorithm based on the fuzzy PID principle. Firstly, a nonlinear vehicle dynamics model considering driving-force lag and acceleration constraints was established. Then, within a hierarchical control structure, the upper controller takes the relative speed between vehicles and the spacing error as inputs and outputs the following vehicle's target acceleration, while the lower controller takes the target acceleration as input and outputs the throttle opening and brake master cylinder pressure. For the setting of the target spacing, this paper additionally considers the relative speed between vehicles and the acceleration of the front vehicle. In testing, compared with the traditional variable safety distance model, the average following distance is reduced by 5.43% when the leading vehicle is accelerating and increased by 2.74% during deceleration.
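The structure of the upper controller can be sketched as follows: a target spacing that depends on ego speed, relative speed, and lead-vehicle acceleration, and a PID loop on the spacing error that outputs the target acceleration. The spacing policy and the fixed gains below are illustrative assumptions; the paper's fuzzy adaptation of the PID gains is not reproduced here.

```python
# Upper-level CACC controller sketch: variable target spacing + PID on the spacing error.
class UpperController:
    def __init__(self, kp=0.45, ki=0.02, kd=0.25, dt=0.05):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.int_err, self.prev_err = 0.0, None

    def target_spacing(self, ego_speed, rel_speed, lead_accel,
                       d0=5.0, headway=1.5, k_rel=0.8, k_acc=0.5):
        # Variable safety distance: shrinks when the gap is opening or the lead accelerates.
        return d0 + headway * ego_speed - k_rel * rel_speed - k_acc * lead_accel

    def target_acceleration(self, gap, ego_speed, rel_speed, lead_accel):
        err = gap - self.target_spacing(ego_speed, rel_speed, lead_accel)
        self.int_err += err * self.dt
        derr = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        a_cmd = self.kp * err + self.ki * self.int_err + self.kd * derr
        return max(-4.0, min(2.0, a_cmd))   # acceleration constraints (assumed)

ctrl = UpperController()
# gap (m), ego speed (m/s), relative speed = lead - ego (m/s), lead acceleration (m/s^2)
print(ctrl.target_acceleration(gap=30.0, ego_speed=20.0, rel_speed=1.0, lead_accel=0.3))
```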
The energy-consumption efficiency of an electric vehicle (EV) has significant value to both vehicle manufacturers and vehicle owners. Such efficiency directly impacts the cost of energy and the vehicle range while relaxing the stringent requirements on the DC motor and battery specifications. Nowadays, with the development of advanced driver assistance systems (ADAS) such as adaptive cruise control (ACC) and cooperative adaptive cruise control (CACC), drivers enjoy a much safer driving experience. The sensing, computing, and communication capabilities of ADAS can be leveraged in EVs to optimize energy consumption. This paper introduces an energy-optimized ACC platform that utilizes a forecast of the host vehicle's speed profile over a short horizon of a few seconds; such speed information can be made available through ADAS or similar systems. This paper focuses on optimization of longitudinal tracking.
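To illustrate the idea, the sketch below uses a short-horizon forecast of the preceding vehicle's speed to pick a constant host acceleration that trades a simple traction-energy proxy against gap tracking, subject to a minimum-gap constraint. The energy model, the time-gap policy, the constraint, and the candidate set are all illustrative assumptions, not the optimization formulated in the paper.

```python
# Energy-aware longitudinal planning sketch over a few-second speed forecast.
import numpy as np

def plan_accel(lead_speed_forecast, gap0, ego_v0, dt=0.5, mass=1800.0,
               Cd_A=0.75, rho=1.2, Cr=0.01, g=9.81, min_gap=8.0, w_track=50.0):
    """Pick a constant acceleration minimizing traction energy plus a gap-tracking penalty."""
    best_a, best_cost = 0.0, np.inf
    steps = np.arange(1, len(lead_speed_forecast) + 1)
    for a in np.arange(-2.0, 1.51, 0.1):                  # candidate accelerations
        v = np.maximum(0.0, ego_v0 + a * dt * steps)      # host speed over the horizon
        gap = gap0 + np.cumsum((lead_speed_forecast - v) * dt)
        if gap.min() < min_gap:
            continue                                      # infeasible: too close
        force = mass * a + 0.5 * rho * Cd_A * v**2 + Cr * mass * g
        energy = np.sum(np.maximum(force, 0.0) * v * dt)  # only traction power costs energy
        tracking = w_track * np.sum((gap - (5.0 + 1.5 * v))**2)  # time-gap policy (assumed)
        if energy + tracking < best_cost:
            best_a, best_cost = a, energy + tracking
    return best_a

lead = np.full(10, 18.0)   # 5 s forecast: preceding vehicle cruising at 18 m/s (assumed)
print("planned acceleration: %.1f m/s^2" % plan_accel(lead, gap0=40.0, ego_v0=18.0))
```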
The presented study is dedicated to technology supporting vehicle state estimation and motion control with a concept drone that helps the vehicle sense its surroundings and driving conditions. This concept also allows extending the functionality of the sensors mounted on the vehicle by replacing them or adding parameter observation channels. The paper discusses the feasibility of such drone-vehicle interaction and demonstrates several design configurations. In this regard, the paper presents a general description of the proposed drone system that assists the vehicle and describes an experiment in measuring the road profile with a range sensor. The experimental results are described in terms of the accuracy achievable using the drone and are compared with other studies that estimate the road profile from vehicle-mounted sensors.
Testing was conducted to evaluate the performance of the 2014 Subaru Forester’s North American Generation 1 EyeSight system at speeds between 6 and 57 miles per hour (mph). The testing utilized a custom-built foam stationary vehicle target designed to withstand 60+ mph impact speeds. Testing measured the Time to Collision (TTC) values of the visual/audible component of the forward collision warning that was presented to the driver. In addition, the testing quantified the TTC and Time to Collision 2 (TTC2) response of the Automatic Emergency Braking (AEB) system including the timing and magnitude of the stage one braking response and the timing and magnitude of the stage two braking response. The results of the testing add higher speed Forward Collision Warning (FCW) and AEB testing scenarios to the database of publicly available tests from sources like the Insurance Institute for Highway Safety (IIHS), which currently evaluates vehicles’ AEB systems at speeds of 12 and 25 mph.
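For context, the TTC values reported in such testing are derived from the measured range to the stationary target and the closing speed at the instant the warning or braking event is logged. The sketch below shows that calculation; the numbers in the example are made-up illustrations, not measurements from the testing described above.

```python
# TTC for a stationary target: range divided by closing speed at the logged event.
def time_to_collision(range_ft: float, closing_speed_mph: float) -> float:
    """Return TTC in seconds for a stationary target."""
    closing_speed_fps = closing_speed_mph * 5280.0 / 3600.0   # mph -> ft/s
    return range_ft / closing_speed_fps

# Example: an FCW issued at 180 ft while approaching at 45 mph gives about 2.7 s.
print("TTC = %.2f s" % time_to_collision(range_ft=180.0, closing_speed_mph=45.0))
```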
The connectivity between vehicles, infrastructure, and other traffic participants brings a new dimension to automotive safety applications. Soon, all newly produced cars will have Vehicle-to-Everything (V2X) communication modems alongside their existing Advanced Driver Assistance Systems (ADAS). To use connectivity reliably as a safety feature alongside standard ADAS functionality, it is essential to associate the measurements from different sensors with the same targets (data association). Since the camera is the most common sensor available for ADAS, this paper presents an experimental implementation of a Mahalanobis distance-based data association algorithm between the camera and Vehicle-to-Vehicle (V2V) communication sensors. The implemented algorithm has low computational complexity and is capable of running in real time. The presented algorithm can be used in sensor fusion algorithms or higher-level decision-making applications in ADAS modules.
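The core of Mahalanobis-distance association can be sketched as follows: the squared distance of the position difference between a camera track and a V2V report, weighted by their combined covariance, is compared against a chi-square gate. The covariances, positions, and gate value below are illustrative assumptions, not the implemented algorithm.

```python
# Mahalanobis gating sketch between a camera track and a V2V-reported position.
import numpy as np

def mahalanobis_sq(cam_pos, v2v_pos, cam_cov, v2v_cov):
    innovation = np.asarray(cam_pos) - np.asarray(v2v_pos)
    S = np.asarray(cam_cov) + np.asarray(v2v_cov)       # combined position uncertainty
    return float(innovation @ np.linalg.solve(S, innovation))

cam_pos = [24.3, -1.1]                 # camera track position in the ego frame (m)
v2v_pos = [25.0, -0.6]                 # V2V-reported position mapped to the ego frame (m)
cam_cov = np.diag([1.0, 0.3])          # assumed camera position covariance
v2v_cov = np.diag([2.0, 2.0])          # assumed GNSS/V2V position covariance

d2 = mahalanobis_sq(cam_pos, v2v_pos, cam_cov, v2v_cov)
GATE = 5.99                            # chi-square 95% gate, 2 degrees of freedom
print("d^2 = %.2f ->" % d2, "associate" if d2 < GATE else "reject")
```

In a full implementation this squared distance would feed an assignment step (for example, nearest neighbor within the gate) across all camera tracks and V2V reports.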
Automotive system functionalities span a wide range of sub-domains, from non-driving-related components to complex autonomous-driving-related components. The requirements for designing and developing these components span software, hardware, and firmware elements. Successfully developing these components to meet stakeholder needs requires accurate understanding and traceability of the requirements of these component systems. Transforming high-level customer requirements into low-level, granular requirements demands a skilled requirements engineer. Manual interpretation of customer requirements from requirement documents is influenced by context and by the requirements engineer's knowledge gaps in understanding and transforming the requirements.
Accurate identification of a driver's braking intention is essential in advanced driver assistance systems and can make the driving process more comfortable and trustworthy. In this paper, a novel method for driver braking intention identification in cut-in scenarios was proposed, using the driver's gaze information and the motion information of cut-in vehicles. Firstly, a "looking in and looking out" experimental platform comprising three eye-tracking cameras and one front-view camera was built to collect the driver's gaze information and the vehicle motion information. Secondly, the driver's gaze features and the motion features of cut-in vehicles were selected, and the braking intention identification performance of several decision-tree-based ensemble learning algorithms was compared. Thirdly, feature importance was analyzed using SHAP (SHapley Additive exPlanations) values. This method of braking intention identification makes full use of in-vehicle camera sensors.
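The shape of such a pipeline can be sketched as follows: gaze and cut-in motion features feed a tree-based ensemble classifier, and SHAP values attribute each prediction to the features. The synthetic data, the feature names, and the choice of gradient boosting as the ensemble are illustrative assumptions, not the experimental dataset or the specific algorithms compared in the paper.

```python
# Braking-intention classification sketch with a tree ensemble and SHAP attribution.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
feature_names = ["gaze_on_mirror_ratio", "fixation_duration_s",
                 "cutin_lateral_speed_mps", "cutin_range_m", "cutin_rel_speed_mps"]
X = rng.normal(size=(500, len(feature_names)))
# Synthetic label: braking more likely when the cut-in vehicle is close and closing fast.
y = ((X[:, 3] < 0) & (X[:, 4] < 0)).astype(int)

model = GradientBoostingClassifier(n_estimators=100, max_depth=3).fit(X, y)

# SHAP attribution of each feature to the predicted braking intention.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:50])
mean_abs = np.abs(shap_values).mean(axis=0)
for name, val in sorted(zip(feature_names, mean_abs), key=lambda p: -p[1]):
    print(f"{name:28s} {val:.3f}")
```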