Search Results

Viewing 1 to 10 of 10
Technical Paper

Autonomous Driving Development Rapid Prototyping Using ROS and Simulink

2019-04-02
2019-01-0695
Recent years have witnessed increasing interest in Advanced Driver Assistance Systems (ADAS) and Autonomous Driving (AD) development, motivating the growth of new sensor technologies and control platforms. However, to keep pace with this acceleration and to evaluate system performance, a cost- and time-effective software development and testing framework is required. This paper presents an overview of a framework utilizing Robot Operating System (ROS) middleware and the MATLAB/Simulink® Robotics System Toolbox to achieve these goals. As an example of employing this framework for autonomous driving development and testing, this article utilizes the FEV Smart Vehicle Demonstrator. The demonstrator is a reconfigurable and modular platform highlighting the power and flexibility of using ROS and MATLAB/Simulink® for AD rapid prototyping. High-level autonomous path following and braking are presented as two case studies.
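
The paper's prototyping workflow runs in MATLAB/Simulink® through the Robotics System Toolbox; as a rough, hedged illustration of the kind of ROS interface such a rapid-prototyping setup exposes, the Python (rospy) sketch below subscribes to a planned path and publishes a high-level brake command. The topic names, message types, and rates are assumptions for illustration, not taken from the paper.

    # Illustrative sketch only: topic names, message types, and rates are
    # assumptions, not taken from the paper (which works in MATLAB/Simulink).
    import rospy
    from nav_msgs.msg import Path
    from std_msgs.msg import Float32

    def on_path(msg):
        # A real prototype would run path following here; this just logs the goal.
        if msg.poses:
            goal = msg.poses[-1].pose.position
            rospy.loginfo("Tracking path toward (%.1f, %.1f)", goal.x, goal.y)

    def main():
        rospy.init_node("braking_prototype")
        rospy.Subscriber("/planned_path", Path, on_path)
        brake_pub = rospy.Publisher("/brake_cmd", Float32, queue_size=1)
        rate = rospy.Rate(10)  # 10 Hz command loop
        while not rospy.is_shutdown():
            brake_pub.publish(Float32(data=0.0))  # no braking by default
            rate.sleep()

    if __name__ == "__main__":
        main()
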
Technical Paper

Autonomous Vehicle Multi-Sensors Localization in Unstructured Environment

2020-04-14
2020-01-1029
Autonomous driving in unstructured environments is a significant challenge due to the absence or inconsistency of information critical for localization, such as lane markings. To reduce the uncertainty of vehicle localization in such environments, sensor fusion of LiDAR, Radar, Camera, GPS/IMU, and Odometry sensors is utilized. This paper discusses a hybrid localization technique developed using LiDAR-based Simultaneous Localization and Mapping (SLAM), GPS/IMU and Odometry data, and object lists from Radar, LiDAR, and Camera sensors. An Extended Kalman Filter (EKF) is utilized to fuse data from all sensors in two phases. In the preliminary stage, the SLAM-based vehicle coordinates are fused with the GPS-based positioning. The output of this stage is then fused with the object-based localization. This approach was successfully tested on FEV’s Smart Vehicle Demonstrator at FEV’s headquarters, a complex test environment with dynamic and static objects.
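
As a minimal sketch of the two-phase fusion idea (not the paper's full EKF, which also carries heading and velocity states), the following Python snippet fuses a SLAM pose with GPS positioning and then fuses that result with an object-based position estimate; all covariance values are placeholders.

    # Illustrative two-stage fusion sketch; covariance values are placeholders,
    # and the paper's EKF also tracks heading and velocity states.
    import numpy as np

    def fuse(x_a, P_a, x_b, P_b):
        """Kalman-style weighted fusion of two position estimates with covariances."""
        K = P_a @ np.linalg.inv(P_a + P_b)      # gain toward the second estimate
        x = x_a + K @ (x_b - x_a)
        P = (np.eye(len(x_a)) - K) @ P_a
        return x, P

    # Phase 1: SLAM-based coordinates fused with GPS-based positioning.
    slam_xy, slam_cov = np.array([12.3, 4.1]), np.diag([0.2, 0.2])
    gps_xy,  gps_cov  = np.array([12.6, 4.0]), np.diag([1.5, 1.5])
    x1, P1 = fuse(slam_xy, slam_cov, gps_xy, gps_cov)

    # Phase 2: the Phase-1 estimate fused with object-based localization
    # (position inferred from Radar/LiDAR/Camera object lists against a map).
    obj_xy, obj_cov = np.array([12.4, 4.2]), np.diag([0.8, 0.8])
    x2, P2 = fuse(x1, P1, obj_xy, obj_cov)
    print("fused position:", x2)
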
Technical Paper

Drivable Area Estimation for Autonomous Agriculture Applications

2023-04-11
2023-01-0054
Autonomous farming has gained considerable interest due to the need for increased farming efficiency and productivity as well as reduced operating costs. Technological advancement has enabled the development of Autonomous Driving (AD) features in unstructured environments such as farms. This paper discusses an approach that utilizes satellite images to estimate the drivable areas of agricultural fields, with the aid of LiDAR sensor data, to provide the information necessary for the vehicle to navigate autonomously. The images are used to detect the field boundaries, while the LiDAR sensor detects the obstacles that the vehicle encounters during autonomous driving as well as their types. These detections are fused with the information from the satellite images to help the path planning and control algorithms make safe maneuvers. The image and point cloud processing algorithms were developed in MATLAB®/C++ and implemented within the Robot Operating System (ROS) middleware.
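
A hedged sketch of how a boundary polygon extracted from satellite imagery can be combined with LiDAR obstacle detections to test whether a candidate waypoint is drivable; it uses the shapely library for the geometry, whereas the paper's implementation is MATLAB®/C++ under ROS, and all coordinates below are made-up examples.

    # Illustrative sketch; the paper's implementation is MATLAB(R)/C++ under ROS.
    # The boundary polygon and obstacle list below are made-up examples.
    from shapely.geometry import Point, Polygon

    # Field boundary extracted from a satellite image (vertices in local metres).
    field_boundary = Polygon([(0, 0), (100, 0), (100, 60), (0, 60)])

    # Obstacles reported by the LiDAR pipeline: (x, y, radius) in metres.
    lidar_obstacles = [(40.0, 20.0, 1.5), (75.0, 45.0, 2.0)]

    def is_drivable(x, y, margin=1.0):
        """Waypoint is drivable if inside the field boundary and clear of obstacles."""
        p = Point(x, y)
        if not field_boundary.buffer(-margin).contains(p):
            return False
        return all(p.distance(Point(ox, oy)) > r + margin
                   for ox, oy, r in lidar_obstacles)

    print(is_drivable(50.0, 30.0))   # True: inside the field, away from obstacles
    print(is_drivable(40.5, 20.5))   # False: too close to an obstacle
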
Technical Paper

HD-Map Based Ground Truth to Test Automated Vehicles

2022-03-29
2022-01-0097
Over the past decade there has been significant development in Automated Driving (AD), with continuous evolution towards higher levels of automation. Higher levels of autonomy increase the vehicle's Dynamic Driving Task (DDT) responsibility, expanding from certain predefined Operational Design Domains (ODDs, in SAE Levels 3 and 4) to an unlimited ODD (in SAE Level 5). The AD system should not only be sophisticated enough to operate under any given condition but also be reliable and safe. Hence, there is a need for Automated Vehicles (AVs) to undergo extensive open road testing to traverse a wide variety of roadway features and challenging real-world scenarios. There is a serious need for accurate Ground Truth (GT) to locate the various roadway features, which helps in evaluating the perception performance of the AV under any given condition. The results from open road testing provide a feedback loop toward achieving a mature AD system.
Technical Paper

LiDAR-Based Fail-Safe Emergency Maneuver for Autonomous Vehicles

2023-04-11
2023-01-0578
Although SAE Level 5 autonomous vehicles are not yet commercially available, they will need to be the most intelligent, secure, and safe autonomous vehicles, with the highest level of automation. The vehicle will be able to drive itself in all lighting and weather conditions, at all times of the day, on all types of roads, and in any traffic scenario. Human intervention in Level 5 vehicles will be limited to passenger voice commands, which means Level 5 autonomous vehicles need to be safe and capable of recovering to a fail-operational state with no intervention from the driver, to guarantee maximum safety for the passengers. In this paper, a LiDAR-based fail-safe emergency maneuver system is proposed for implementation in Level 5 autonomous vehicles.
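
The abstract does not detail the maneuver logic, so the following is only a generic illustration (not the paper's method) of a LiDAR-triggered emergency stop based on the nearest forward return and a time-to-collision threshold; all thresholds are assumptions.

    # Generic illustration only; the paper's maneuver logic is not described in
    # the abstract, so the inputs and thresholds here are assumptions.
    def emergency_brake_required(forward_ranges_m, ego_speed_mps,
                                 ttc_threshold_s=1.5, min_gap_m=2.0):
        """Trigger an emergency stop if the nearest forward LiDAR return is too
        close or the time-to-collision falls below a threshold."""
        if not forward_ranges_m:
            return False
        nearest = min(forward_ranges_m)
        if nearest < min_gap_m:
            return True
        if ego_speed_mps > 0.1 and nearest / ego_speed_mps < ttc_threshold_s:
            return True
        return False

    # Example: an obstacle 10 m ahead at 10 m/s gives a 1.0 s TTC -> brake.
    print(emergency_brake_required([10.0, 25.0], 10.0))  # True
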
Technical Paper

LiDAR-Based Urban Autonomous Platooning Simulation

2020-04-14
2020-01-0717
The technological advancements of Advanced Driver Assistance Systems (ADAS) sensors make it possible to achieve autonomous vehicle platooning, increase the capacity of road lanes, and reduce traffic. This article focuses on developing urban autonomous platooning using LiDAR and GPS/IMU sensors in a simulation environment. Gazebo simulation is utilized to simulate the sensors, vehicles, and testing environment. Two vehicles are used in this study: a Lead vehicle that follows a preplanned trajectory, and a Follower vehicle that uses LiDAR object detection and tracking information to mimic the Lead vehicle. The LiDAR object detection is handled in multiple stages: point cloud frame transformation, filtering and down-sampling, ground segmentation, and clustering. The tracking algorithm uses the clustering information to provide the position and velocity of the Lead vehicle, which enables vehicle platooning.
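
The listed detection stages map naturally onto a point-cloud pipeline; the sketch below illustrates them with numpy and scikit-learn, assuming the cloud has already been transformed into the Follower vehicle's frame and using placeholder thresholds rather than values from the paper.

    # Sketch of the listed stages (filter/down-sample, ground removal, clustering);
    # the cloud is assumed already transformed into the Follower vehicle frame,
    # and all thresholds are placeholder values.
    import numpy as np
    from sklearn.cluster import DBSCAN

    def detect_lead_candidates(points_xyz, roi=40.0, ground_z=-1.4, voxel=0.2):
        # Range filter: keep points inside the region of interest.
        pts = points_xyz[np.linalg.norm(points_xyz[:, :2], axis=1) < roi]
        # Crude down-sampling: snap to a voxel grid and keep one point per cell.
        pts = np.unique(np.round(pts / voxel) * voxel, axis=0)
        # Ground segmentation: drop points below a height threshold
        # (a planar RANSAC fit would be more robust on sloped roads).
        pts = pts[pts[:, 2] > ground_z]
        if len(pts) == 0:
            return []
        # Euclidean clustering of the remaining obstacle points.
        labels = DBSCAN(eps=0.7, min_samples=10).fit_predict(pts)
        clusters = [pts[labels == k] for k in set(labels) if k != -1]
        # Return cluster centroids; the tracker would pick and follow the Lead.
        return [c.mean(axis=0) for c in clusters]
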
Technical Paper

Multi-Sensor Fusion in Slow Lanes for Lane Keep Assist System

2021-04-06
2021-01-0084
Implementing Advanced Driver Assistance Systems (ADAS) features that are available in all road scenarios and weather conditions is a major challenge for automotive companies and a key enabler for achieving Level 4 (L4) autonomous vehicles. One important feature is the Lane Keep Assist System (LKAS). Most LKAS implementations are based on lane-line detection cameras, and the camera's lane coefficient estimates, obtained by recognizing lane lines through edge detection, are the key input for LKAS. However, when lane markers are not visible due to high traffic and slow driving, another source of lane-line data is needed for the LKAS. In this paper, a multi-sensor fusion approach based on camera, Lidar, and GPS is used to allow the vehicle to maintain its lateral position within the lane.
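
One simple way to realize such a fallback is a confidence-weighted fusion of lateral-offset estimates from the camera, Lidar, and GPS/map; the sketch below is illustrative only, and the weights and inputs are assumptions rather than the paper's design.

    # Illustrative confidence-weighted fusion of lateral-offset estimates from
    # camera, Lidar (e.g. curb/barrier based), and GPS + map; all weights are
    # assumptions, not values from the paper.
    def fused_lateral_offset(camera_m, lidar_m, gps_map_m,
                             camera_conf, lidar_conf=0.6, gps_conf=0.4):
        """Each input is the estimated lateral offset (m) of the vehicle from
        the lane centre; confidences act as fusion weights."""
        estimates = [(camera_m, camera_conf), (lidar_m, lidar_conf), (gps_map_m, gps_conf)]
        valid = [(x, w) for x, w in estimates if x is not None and w > 0.0]
        total = sum(w for _, w in valid)
        return sum(x * w for x, w in valid) / total if total else None

    # Camera confidence collapses in stop-and-go traffic; Lidar and GPS carry the estimate.
    print(fused_lateral_offset(camera_m=None, lidar_m=0.15, gps_map_m=0.25, camera_conf=0.0))
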
Technical Paper

Operational Design Domain Feature Optimization Route Planning Tool for Automated Vehicle Open Road Testing

2023-04-11
2023-01-0686
Autonomous vehicles must be able to function safely in complex contexts involving unpredictable situations and interactions. To ensure this, the system must be tested at various stages, as described by the V-model. This process iteratively tests and validates distinct parts of the system, starting with small components and progressing to system-level assessment. However, this framework presents challenges when adapted to the testing problems that autonomous vehicles face. Open road testing is an effective way to expose the system to real-world scenarios in combination with the specific driving situations described by the Operational Design Domain (ODD). Finding a route between two points that maximizes ODD exposure is not a trivial task, especially since, in most cases, developers must design routes in unfamiliar regions. This demands significant effort and resources, which makes it important to optimize this task.
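
One way to frame "maximize ODD exposure" is as a shortest-path search on a road graph whose edge costs are discounted by the ODD features each segment carries; the networkx sketch below illustrates this framing with a made-up graph and scoring function, not the paper's tool.

    # Sketch of ODD-aware routing: edge cost = segment length discounted by an
    # ODD score, floored at a small positive value so Dijkstra remains valid.
    # The graph, features, and weighting are illustrative assumptions.
    import networkx as nx

    def odd_cost(length_m, odd_features, alpha=100.0):
        # Each desired ODD feature discounts the edge cost by alpha metres.
        return max(length_m - alpha * len(odd_features), 1.0)

    G = nx.DiGraph()
    G.add_edge("A", "B", cost=odd_cost(500, ["roundabout"]))
    G.add_edge("B", "C", cost=odd_cost(400, ["merge", "crosswalk"]))
    G.add_edge("A", "C", cost=odd_cost(700, []))   # direct but feature-poor

    route = nx.shortest_path(G, "A", "C", weight="cost")
    print(route)  # prefers the feature-rich detour A -> B -> C
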
Technical Paper

Powertrain Level Target Setting for Impulsive Noise Based on Interior Noise Levels

2015-06-15
2015-01-2295
The definition of vehicle- and powertrain-level NVH targets is one of the first tasks toward establishing where a vehicle's NVH behavior will reside with respect to the current or future state of the industry. Realization of vehicle-level NVH targets relies on a combination of competitive powertrain (source) and vehicle (path) NVH performance. Assessment of vehicle NVH sensitivity is well understood and can be accomplished through determination of the customer-interface NVH response to measured excitations at the source input locations. However, development of appropriate powertrain source targets can be more difficult, particularly with respect to sound quality. This paper discusses various approaches for defining powertrain targets for sound quality, with a specific focus on impulsive noise.
Technical Paper

V2X Connectivity with ROS for AD Applications

2021-04-06
2021-01-0060
Increased levels of autonomy require an increasing amount of sensory data for the vehicle to make appropriate decisions. Sensors such as camera, Lidar, and Radar help perceive the surroundings, exposing nearby objects and blind spots within the line of sight. With V2X connectivity, vehicles communicate with other vehicles and infrastructure using wireless communications, even in non-line-of-sight conditions. This paper presents an approach to utilizing a V2X system to develop Automated Driving (AD) features in a Robot Operating System (ROS) environment. This was achieved by developing ROS drivers and creating custom messages to enable communication between the V2X system and the vehicle sensors. The developed algorithms were tested on FEV’s Smart Vehicle Demonstrator. The test results show that the proposed V2X automated driving approach has increased reliability compared to camera-, Radar-, and Lidar-based autonomous driving.
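
The paper's custom ROS message definitions are not given in the abstract, so the sketch below only illustrates the general driver pattern (parse an incoming V2X payload and republish it as a ROS message) using a standard geometry message and a made-up payload format.

    # Illustrative ROS driver pattern only: the paper's custom message definitions
    # are not given in the abstract, so a generic geometry message is used here
    # and the V2X payload format below is a made-up example.
    import json
    import rospy
    from geometry_msgs.msg import PointStamped

    def republish_v2x(payload_json, pub):
        """Parse one incoming V2X payload and republish it as a ROS message."""
        data = json.loads(payload_json)       # e.g. a decoded BSM from the V2X stack
        msg = PointStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "map"
        msg.point.x = data["x"]               # remote vehicle position (local frame)
        msg.point.y = data["y"]
        pub.publish(msg)

    def main():
        rospy.init_node("v2x_driver")
        pub = rospy.Publisher("/v2x/remote_vehicle", PointStamped, queue_size=10)
        # In a real driver this would be fed by the V2X radio interface.
        republish_v2x('{"x": 12.0, "y": -3.5}', pub)
        rospy.spin()

    if __name__ == "__main__":
        main()
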