Search Results

Book

LiDAR Technologies and Systems

2019-07-10
Why are vision systems fundamental and critical to autonomous flight? What vision system tasks does autonomous flight require, and how can those tasks be approached? The book addresses the role of vision systems in autonomous operations and discusses the critical tasks required of a vision system, including taxi, takeoff, en-route navigation, detect and avoid, and landing, as well as formation flight and approach and docking at a terminal or with other vehicles. These tasks are analyzed to derive field-of-view, resolution, latency, and other sensing requirements and to identify when one sensor can serve multiple applications. Airspace classifications, landing visibility categories, decision height criteria, and typical runway dimensions are introduced. The book provides an overview of sensors and phenomenology from the visible through the infrared, extending into the radar bands, and covering both passive and active systems.
Technical Paper

Bi-Directional Adjustable Holder for LiDAR Sensor

2024-01-16
2024-26-0024
LiDAR stands for Light Detection and Ranging. It works on the principle of the reflection of light. LiDAR is one of several sensors, alongside RADAR and cameras, that help achieve higher levels (Level 3 and above) of autonomous driving capability. As a sensor, LiDAR is used to perceive the environment in 3D by calculating the time of flight of the laser beam transmitted from the LiDAR and the rays reflected from the object, along with the intensity of that reflection.
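The time-of-flight ranging described in this abstract reduces to simple arithmetic: the pulse travels to the object and back, so the one-way range is half the round-trip distance. A minimal sketch (the constant and function name are ours, not from the paper):

```python
# Speed of light in vacuum, m/s.
C = 299_792_458.0

def range_from_tof(tof_seconds: float) -> float:
    """One-way distance to a reflecting object from the round-trip
    time of flight: the beam travels out and back, so range = c * t / 2."""
    return C * tof_seconds / 2.0

# A pulse returning after ~667 ns corresponds to a range of about 100 m.
print(range_from_tof(667e-9))
```

Real LiDAR units refine this with per-pulse intensity and beam-steering angles to build a 3D point cloud, but the range equation itself is just this scaling.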
Technical Paper

Joint Calibration of Dual LiDARs and Camera Using a Circular Chessboard

2020-04-14
2020-01-0098
Cameras and LiDAR are widely equipped on autonomous self-driving cars, and many algorithms have been developed for them in recent years. The fusion of camera and LiDAR provides state-of-the-art methods for environmental perception, compensating for the limitations of any single vehicular sensor. In this paper, we assemble a test platform made up of dual LiDARs and one monocular camera, using the same sensing hardware architecture as the intelligent sweeper designed by our laboratory.
Technical Paper

Fail-Operational Safety Architecture for ADAS Systems Considering Domain ECUs

2018-04-03
2018-01-1069
The solutions show how a redundant system architecture and a safety architecture with diverse redundancy can be created efficiently for ADAS systems, considering the processing chain from sensors such as camera, radar, and lidar through perception and decision algorithms. The goal is to fulfill ASIL D safety requirements and to increase system availability through fail-operational behavior for self-driving vehicles at SAE Level 3 and fully self-driving vehicles at SAE Levels 4 and 5.
Magazine

Automotive Engineering: March 2022

2022-03-01
Expanding ADAS roles for radar and cameras: Evolving into lidar alternatives, the bread-and-butter sensors of ADAS are seeing potential far beyond commodity status.
Magazine

Autonomous Vehicle Engineering: April 2023

2023-04-13
Editorial: Threats to automation's reputation
The Navigator: Increasing EV range through increased compute efficiency
The Road to Zero Prototypes: ADAS and HMI development are important applications for new simulation solutions.
Lidar vs. Everybody in the Onboard Sensor Race: Future vehicle systems will feature a reduced sensor array, but will still need a technology combination for safe performance.
Magazine

Automotive Engineering: August 2018

2018-08-02
LiDAR: new "eyes" for vehicle autonomy: The steadily evolving sensor tech offers big leaps forward in capability, once cost is reduced.
e-Axles speed electrification: Creating a practical 'bridge' between today's legacy ICE architectures and the electric future.
Technical Paper

On the Safety of Autonomous Driving: A Dynamic Deep Object Detection Approach

2019-04-02
2019-01-1044
In other words, treating the system inputs, emitted from an eclectic mix of sensors such as cameras, radar, and LIDAR, as time-varying signals makes it possible to incorporate time as a fundamental feature when modeling and forecasting objects while the car is in motion.
Technical Paper

Enhancing Safety Features of Advanced Driver Assistance System Warnings by Using Head-Up Displays

2024-04-09
2024-01-2058
ADAS (Advanced Driver Assistance Systems) is a growing technology in the automotive industry, intended to provide safety and comfort to passengers with the help of a variety of sensors such as radar, cameras, and LIDAR. Although ADAS has improved passenger safety compared with conventional non-ADAS vehicles, there remain grey areas for safety enhancement and easier assistance to drivers.
Technical Paper

Multi-Sensor Data Fusion Techniques for RPAS Detect, Track and Avoid

2015-09-15
2015-01-2475
In order to perform an effective detection of objects, a number of high performance, reliable and accurate avionics sensors and systems are adopted including non-cooperative sensors (visual and thermal cameras, Laser radar (LIDAR) and acoustic sensors) and cooperative systems (Automatic Dependent Surveillance-Broadcast (ADS-B) and Traffic Collision Avoidance System (TCAS)).