Search Results

Book

LiDAR Technologies and Systems

2019-07-10
The first part of LiDAR Technologies and Systems introduces LiDAR and its history, and then covers the LiDAR range equation and the link budget (how much signal a LiDAR must emit in order to get a certain number of reflected photons back), as well as the rich phenomenology of LiDAR, which results in a diverse array of LiDAR types. The middle chapters discuss the components of a LiDAR system, including laser sources and modulators, LiDAR receivers, beam-steering approaches, and LiDAR processing.
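The link-budget idea mentioned in this description can be illustrated with a back-of-the-envelope calculation. The sketch below uses the standard monostatic lidar range equation for a beam-filling Lambertian target; the equation form and all parameter values are textbook assumptions, not taken from the book itself.

```python
import math

def received_power(p_tx, target_reflectivity, aperture_area, range_m,
                   atmos_transmission=1.0, system_efficiency=1.0):
    """Monostatic lidar range equation for a Lambertian target that fills
    the beam: reflected power spreads over a hemisphere, so the received
    power falls off as 1/R^2."""
    return (p_tx * target_reflectivity * atmos_transmission**2
            * system_efficiency * aperture_area / (math.pi * range_m**2))

def photons_per_pulse(energy_j, wavelength_m=905e-9):
    """Convert received pulse energy to a photon count via E = hc/lambda."""
    h, c = 6.626e-34, 3.0e8
    return energy_j / (h * c / wavelength_m)

# Example: 10 W peak power, 10% reflectivity, 5 cm^2 aperture, 100 m range
p_rx = received_power(10.0, 0.1, 5e-4, 100.0)
# A 5 ns pulse at this power gives the received energy, hence photon count
n = photons_per_pulse(p_rx * 5e-9)
```

Doubling the range cuts the received power by a factor of four, which is exactly the budgeting question the book's link-budget treatment addresses: how much must be emitted to get a usable number of photons back.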
Technical Paper

Infrastructure-Based LiDAR Monitoring for Assessing Automated Driving Safety

2022-03-29
2022-01-0081
Building on earlier work from the IAM in creating an infrastructure-based sensor system to evaluate OSA metrics in real-world scenarios, this paper presents an approach for real-time localization and velocity estimation for AVs using a network of LiDAR sensors. The LiDAR data are captured by a network of three Luminar LiDAR sensors at an intersection in Anthem, AZ, while camera data are collected from the same intersection. ...The accuracy of both the localization and velocity estimation using LiDAR is assessed by comparing the LiDAR-estimated state vectors against the differential GPS position and velocity measurements from a test vehicle passing through the intersection, as well as against a camera-based algorithm applied to drone video footage. It is shown that the proposed method, taking advantage of simultaneous data capture from multiple LiDAR sensors, offers great potential for fast, accurate operational safety assessment of AVs, with an average localization error of only 10 cm observed between LiDAR and real-time differential GPS position data when tracking a vehicle over 170 meters of roadway. ...Additional sensor modalities such as Light Detection and Ranging (LiDAR) sensors allow for a wider range of scenarios to be accommodated and may also provide improved measurements of the Operational Safety Assessment (OSA) metrics previously introduced by the Institute of Automated Mobility (IAM).
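The snippet does not spell out the velocity-estimation step; the simplest version of the idea (a sketch, not the paper's actual method) differentiates successive LiDAR position fixes of the tracked vehicle:

```python
import numpy as np

def estimate_velocity(positions, timestamps):
    """Finite-difference velocity from a track of (x, y) position fixes.
    A real pipeline would smooth this, e.g. with a Kalman filter, rather
    than differentiating raw fixes."""
    dt = np.diff(timestamps)[:, None]          # seconds between fixes
    return np.diff(positions, axis=0) / dt     # m/s per segment

# Three position fixes 0.1 s apart along the x axis
track = np.array([[0.0, 0.0], [1.0, 0.0], [2.1, 0.0]])
t = np.array([0.0, 0.1, 0.2])
v = estimate_velocity(track, t)
```

With multiple LiDAR sensors observing the same vehicle, as in the paper, each sensor's track would be fused before or after this step.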
Technical Paper

LiDAR Based Classification Optimization of Localization Policies of Autonomous Vehicles

2020-04-14
2020-01-1028
Currently, this is achieved by utilizing expensive sensors such as a differential GPS which provides centimeter accuracy, or by using computationally taxing algorithms to attempt to match live input data from LiDARs or cameras to previously recorded data or maps. Within this paper an algorithm and accompanying hardware stack is proposed to reduce the computational load on the localization of the robot relative to a prior map. ...The principle of the software stack is to leverage deep learning and powerful filters to perform classification of landmark objects within a scan of the LiDAR. These landmarks can have highly accurate known world coordinates and will be used for position estimation and localization of the vehicle. ...The first will be developing the training set for these landmarks, which will be done by leveraging a dedicated computing platform designed to filter the raw LiDAR data, removing all unnecessary elements and leaving only high-density clusters which can be used as landmarks.
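The snippet does not show how known landmark coordinates yield a position fix. One common formulation, shown here purely as an illustration (not the paper's algorithm), is least-squares multilateration from the measured ranges to identified landmarks:

```python
import numpy as np

def multilaterate(landmarks, ranges):
    """Least-squares 2D position from known landmark coordinates and
    measured ranges, by linearizing each circle equation
    (x-xi)^2 + (y-yi)^2 = ri^2 against the first landmark."""
    x0, y0 = landmarks[0]
    r0 = ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(landmarks[1:], ranges[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol
```

With three or more non-collinear landmarks the system is overdetermined, and the least-squares solve absorbs small range errors from the LiDAR.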
Technical Paper

Recognition and Classification of Vehicle Target Using the Vehicle-Mounted Velodyne LIDAR

2014-04-01
2014-01-0322
This paper describes a novel recognition and classification method of vehicle targets in urban road based on a vehicle-mounted Velodyne HDL64E light detection and ranging (LIDAR) system. The autonomous vehicle will choose different driving strategy according to the surrounding traffic environments to guarantee that the driving is safe, stable and efficient. ...Range imaging can be achieved by projecting the 3D points to a 2.5D grid, taking the LIDAR (Light Detection and Ranging) origin point as the projection origin. In this method, the transform is applied within each cluster rather than to the whole 3D point set.
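The 2.5D range-image projection described here (spherical projection of 3D points with the LIDAR origin as projection origin) can be sketched as follows; the grid resolution and vertical field of view below are illustrative values loosely based on the HDL-64E class of sensor, not the paper's actual parameters.

```python
import numpy as np

def points_to_range_image(points, h_res_deg=0.5, v_res_deg=0.4,
                          v_fov_deg=(-24.8, 2.0)):
    """Project Nx3 xyz points (sensor frame, origin at the LIDAR) into a
    2D grid indexed by azimuth/elevation, storing one range per cell."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)
    azimuth = np.degrees(np.arctan2(y, x))                   # -180..180
    elevation = np.degrees(np.arcsin(z / np.maximum(r, 1e-9)))

    cols = ((azimuth + 180.0) / h_res_deg).astype(int)
    rows = ((elevation - v_fov_deg[0]) / v_res_deg).astype(int)

    n_cols = int(360.0 / h_res_deg)
    n_rows = int((v_fov_deg[1] - v_fov_deg[0]) / v_res_deg) + 1
    image = np.full((n_rows, n_cols), np.inf)

    # Keep the nearest return when several points fall into one cell
    valid = (rows >= 0) & (rows < n_rows) & (cols >= 0) & (cols < n_cols)
    for rr, cc, rng in zip(rows[valid], cols[valid], r[valid]):
        image[rr, cc] = min(image[rr, cc], rng)
    return image
```

Applying the projection per cluster, as the paper does, simply means calling this on each cluster's points instead of on the full scan.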
Journal Article

LiDAR Data Segmentation in Off-Road Environment Using Convolutional Neural Networks (CNN)

2020-04-14
2020-01-0696
Recent developments in the area of autonomous vehicle navigation have emphasized algorithm development for the characterization of LiDAR 3D point-cloud data. The LiDAR sensor data provides a detailed understanding of the environment surrounding the vehicle for safe navigation. ...However, LiDAR point cloud datasets need point-level labels which require a significant amount of annotation effort. ...The simulated LiDAR data was generated by a physics-based platform, the Mississippi State University Autonomous Vehicle Simulator (MAVS).
Technical Paper

Automated AI-based Annotation Framework for 3D Object Detection from LIDAR Data in Industrial Areas

2024-07-02
2024-01-2999
Additionally, LIDAR sensors are currently popular due to their superior spatial resolution and accuracy compared to RADAR, as well as their robustness to varying lighting conditions compared to cameras. ...However, there is a notable lack of open-source datasets specifically designed for industrial hall scenarios, particularly for 3D LIDAR data. Furthermore, for industrial areas where vehicle platforms with omnidirectional drive are often used, 360° FOV LIDAR sensors are necessary to monitor all critical objects. Although high-resolution sensors would be optimal, mechanical LIDAR sensors with 360° FOV exhibit a significant price increase with increasing resolution.
Technical Paper

Lidar Inertial Odometry and Mapping for Autonomous Vehicle in GPS-Denied Parking Lot

2020-04-14
2020-01-0103
Compared to other odometry approaches using IMU and lidar, we apply a tightly coupled lidar-IMU method to achieve lower drift, which effectively overcomes the degradation problem of pure-lidar methods and ensures precise pose estimation during fast motion. ...Recently, lidar odometry and visual odometry have been introduced into localization systems to overcome the problem of missing GPS signals. ...Compared with visual odometry, lidar odometry is not susceptible to lighting, and is therefore widely applied in weak-light environments.
Technical Paper

End-to-End Synthetic LiDAR Point Cloud Data Generation and Deep Learning Validation

2022-03-29
2022-01-0164
LiDAR sensors are common in automated driving due to their high accuracy. However, LiDAR processing algorithm development suffers from a lack of diverse training data, partly due to the sensors’ high cost and rapid development cycles. ...We address the unmet need for abundant, high-quality LiDAR data with the development of a synthetic LiDAR point cloud generation tool and validate this tool’s performance using the KITTI-trained PIXOR object detection model. ...This approach will support low-cost bulk generation of accurate data for training advanced self-driving algorithms, with configurability to simulate existing and upcoming LiDAR configurations possessing varied channels, range, vertical and horizontal fields of view, and angular resolution.
Technical Paper

Reconstruction of 3D Accident Sites Using USGS LiDAR, Aerial Images, and Photogrammetry

2019-04-02
2019-01-0423
In 2017 the United States Geological Survey (USGS) released historical 3D point clouds (LiDAR) allowing for access to digital 3D data without visiting the site. This offers many unique benefits to the reconstruction community including: safety, budget, time, and historical preservation. ...To determine accuracies achievable using this method, evidence locations solved for using only USGS LiDAR, aerial images and scene photographs (representative of emergency personnel photographs) were compared with known locations documented using total station survey equipment and ground-based 3D laser scanning. ...To further evaluate the quality of the USGS LiDAR, a comparative point cloud analysis of the roadway surfaces was performed. On average, 85% of the USGS LiDAR points were found to be within 0.5 inches of the ground-based 3D scanning points.
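The comparative point-cloud analysis described here (the fraction of USGS LiDAR points lying within 0.5 inches of the ground-based scan) amounts to a nearest-neighbor distance test. A minimal sketch, with a brute-force neighbor search standing in for whatever tool the authors used, and with synthetic rather than real survey data:

```python
import numpy as np

def fraction_within(cloud_a, cloud_b, tol):
    """Fraction of points in cloud_a whose nearest neighbor in cloud_b
    lies within tol (same units as the coordinates). Brute force is fine
    for small clouds; a KD-tree would be used on real survey data."""
    d2 = ((cloud_a[:, None, :] - cloud_b[None, :, :]) ** 2).sum(-1)
    return float(np.mean(np.sqrt(d2.min(axis=1)) <= tol))

# Illustrative check with synthetic data: one cloud is the other
# perturbed by millimeter-level noise, so most points stay within
# the 0.5 in (0.0127 m) tolerance.
rng = np.random.default_rng(0)
ground = rng.uniform(0, 10, size=(1000, 3))
usgs = ground + rng.normal(0, 0.005, size=ground.shape)
frac = fraction_within(usgs, ground, tol=0.0127)
```

The paper's 85%-within-0.5-inch figure is exactly this kind of statistic computed between the USGS cloud and the ground-based scan of the roadway.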
Technical Paper

Unmanned Terminal Vehicle Positioning System Based on Roadside Single-Line Lidar

2021-03-02
2021-01-5029
The main research content of this paper is to design a positioning algorithm for unmanned terminal Automated Guided Vehicle (AGV) based on single-line lidar, including point cloud data acquisition, background filtering, point cloud clustering, vehicle position extraction, and result optimization.
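The pipeline listed in this abstract can be sketched end to end for a single-line lidar. The grid-free background filter and the distance-threshold clustering below are generic stand-ins for the paper's actual algorithms, and all thresholds are illustrative assumptions:

```python
import numpy as np

def filter_background(scan_xy, background_xy, min_dist=0.3):
    """Background filtering: drop scan points that lie close to any
    point of a pre-recorded empty-scene (background) scan."""
    d2 = ((scan_xy[:, None, :] - background_xy[None, :, :]) ** 2).sum(-1)
    return scan_xy[np.sqrt(d2.min(axis=1)) > min_dist]

def cluster(points_xy, gap=0.5):
    """Point cloud clustering: split angle-ordered scan points wherever
    the gap between consecutive points exceeds a threshold."""
    clusters, current = [], [points_xy[0]]
    for p, q in zip(points_xy[:-1], points_xy[1:]):
        if np.linalg.norm(q - p) > gap:
            clusters.append(np.array(current))
            current = []
        current.append(q)
    clusters.append(np.array(current))
    return clusters

def vehicle_position(clusters):
    """Vehicle position extraction: centroid of the largest
    foreground cluster."""
    return max(clusters, key=len).mean(axis=0)
```

Result optimization (the final stage the abstract names) would then smooth these per-scan positions over time, e.g. with a tracking filter.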
Technical Paper

Raw Data Injection and Failure Testing of Camera, Radar, and Lidar for Highly Automated Systems

2019-03-19
2019-01-1378
This paper explores how to enhance your autonomous system (AS) testing capabilities and quality assurance using a completely automated hardware-in-the-loop (HIL) test environment that interfaces to or simulates autonomous sensor technology, such as cameras, radar, LIDAR, and other key technologies, such as GNSS/maps and V2X communication. The key to performing such real-time testing is the ability to stimulate the various electronic control units (ECUs)/sensors through closed-loop simulation of the vehicle, its environment, traffic, surroundings, etc., along with playback of captured sensor data and its synchronization with key vehicle bus and application data.
Article

Luminar expands into automotive LiDAR with $250 million in fundraising

2019-07-15
Iris is slated to launch commercially on production vehicles beginning in 2022 and is the first sensing platform to exceed the essential performance, safety, cost, and auto-grade requirements needed to deliver SAE International-defined Level 3 and 4 autonomy to consumers.
Technical Paper

Utilizing Neural Networks for Semantic Segmentation on RGB/LiDAR Fused Data for Off-road Autonomous Military Vehicle Perception

2023-04-11
2023-01-0740
Light detection and ranging (LiDAR) is an emerging technology in image segmentation that is able to estimate distances to the objects it detects. ...One advantage of LiDAR is the ability to gather accurate distances regardless of day, night, shadows, or glare. This study examines LiDAR and camera image segmentation fusion to improve an advanced driver-assistance systems (ADAS) algorithm for off-road autonomous military vehicles.
Technical Paper

Autonomous Vehicle Multi-Sensors Localization in Unstructured Environment

2020-04-14
2020-01-1029
To reduce the uncertainty of vehicle localization in such environments, sensor fusion of LiDAR, Radar, Camera, GPS/IMU, and Odometry sensors is utilized. This paper discusses a hybrid localization technique developed using: LiDAR-based Simultaneous Localization and Mapping (SLAM), GPS/IMU, Odometry data, and object lists from Radar, LiDAR, and Camera sensors. ...This paper discusses a hybrid localization technique developed using: LiDAR-based Simultaneous Localization and Mapping (SLAM), GPS/IMU, Odometry data, and object lists from Radar, LiDAR, and Camera sensors. An Extended Kalman Filter (EKF) is utilized to fuse data from all sensors in two phases. ...The test results show that multi-sensor fusion improves the vehicle’s localization compared to GPS/IMU or LiDAR alone.
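The two-phase EKF fusion this abstract describes is not spelled out in the snippet. As an illustration of the underlying idea only, here is the standard Kalman measurement update that fuses a GPS position fix into a predicted state; the state layout and covariance values are assumptions for the example, and the paper's full EKF (with SLAM, odometry, and object lists) is far richer:

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: fuse measurement z
    (covariance R, measurement model H) into state x (covariance P)."""
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Assumed state: [px, py, vx, vy]; GPS measures position only.
x = np.array([10.0, 5.0, 1.0, 0.0])
P = np.diag([4.0, 4.0, 1.0, 1.0])       # uncertain prediction
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)
R = np.diag([0.25, 0.25])               # relatively accurate GPS
z = np.array([10.8, 5.2])
x, P = kalman_update(x, P, z, H, R)
```

Because the GPS covariance R is much smaller than the prior P here, the updated position lands close to the measurement, and the position covariance shrinks — which is the mechanism by which each sensor in the paper's fusion pulls the estimate toward its own reading in proportion to its confidence.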
Technical Paper

Training of Neural Networks with Automated Labeling of Simulated Sensor Data

2019-04-02
2019-01-0120
This method utilizes physics-based simulation of sensors, along with automated truth labeling, to improve the speed and accuracy of training data acquisition for both camera and LIDAR sensors. This framework is enabled by the MSU Autonomous Vehicle Simulator (MAVS), a physics-based sensor simulator for ground vehicle robotics that includes high-fidelity simulations of LIDAR, cameras, and other sensors.