
Search Results

Viewing 1 to 5 of 5
Technical Paper

Deep 4D Automotive Radar-Camera Fusion Odometry with Cross-Modal Transformer Fusion

2023-12-20
2023-01-7040
Many learning-based methods estimate ego-motion using visual sensors. However, visual sensors are sensitive to intense lighting variations and struggle in textureless scenes. 4D radar, an emerging automotive sensor, complements visual sensors effectively due to its robustness in adverse weather and lighting conditions. This paper presents an end-to-end 4D radar-visual odometry (4DRVO) approach that combines sparse point cloud data from 4D radar with image information from cameras. Using the Feature Pyramid, Pose Warping, and Cost Volume (PWC) network architecture, we extract 4D radar point features and image features at multiple scales. We then employ a hierarchical iterative refinement approach to supervise the estimated pose. We propose a novel Cross-Modal Transformer (CMT) module to effectively fuse the 4D radar point modality, the image modality, and the 4D radar point-image connection modality at multiple scales, achieving cross-modal feature interaction and multi-modal feature fusion.
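For illustration only, the PyTorch-style sketch below shows one way a cross-modal attention block could let radar point features attend to image features at a single scale. The module and tensor names are assumptions for this sketch and do not reproduce the paper's CMT design.

import torch
import torch.nn as nn

class CrossModalAttention(nn.Module):
    # Radar point features (queries) attend to image features (keys/values).
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, point_feat, image_feat):
        # point_feat: (B, N_points, dim); image_feat: (B, H*W, dim)
        fused, _ = self.attn(query=point_feat, key=image_feat, value=image_feat)
        return self.norm(point_feat + fused)  # residual keeps the point branch

In 4DRVO this kind of interaction is applied at multiple scales inside the PWC-style hierarchy; it is reduced to a single block here only to show the query/key/value roles of the two modalities.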
Technical Paper

One Robust Loosely Coupled 4D Millimeter-Wave Image Radar SLAM Method

2023-12-20
2023-01-7051
In this paper, we introduce an IMU-radar loosely coupled SLAM method based on our 4D millimeter-wave imaging radar, which outputs point clouds containing XYZ position and power information on our autonomous vehicles. Common point-cloud-based SLAM systems, such as LiDAR SLAM, usually adopt a tightly coupled IMU-LiDAR structure in which the odometry output of the SLAM front end in turn affects IMU preintegration; the SLAM system degrades when the front-end odometry drift grows or a frame of the point cloud fails to match. In our method, we therefore decouple this cross-dependence between the IMU and radar odometry: we fuse IMU and wheel odometry to generate a rough pose trajectory as the initial guess for front-end registration, rather than taking it directly from the radar-estimated odometry pose. In other words, front-end registration is independent of IMU preintegration. In addition, we empirically propose a criterion for judging the front-end registration result to identify match-poor environments, and we adopt the relative wheel-odometry pose instead of the registration pose when the match belief value (MBV) is false; this handles some degraded environments, such as roads with similar greenbelts on both sides. Finally, to increase loop-detection robustness, we propose a two-stage loop-detection verification method: the first stage is a radius search (RS); if it passes loop verification, the second stage is skipped, otherwise the scan context (SC) second stage is entered. With this two-stage loop detection, most real loops can be detected by our SLAM system. Based on the above ideas, we run our SLAM system successfully on datasets from multiple scenes, including an office park, a residential area, open roads, and an underground parking garage. On our office-park dataset, we compare the trajectory precision against a tightly coupled SLAM structure and the number of detected loops against a single-stage loop method; the experimental results show that the proposed method is valid.
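As a minimal sketch of the MBV fallback described above (the threshold value, the fitness score, and the 4x4 matrix representation are assumptions for illustration, not the authors' implementation):

import numpy as np

def select_front_end_pose(prev_pose, reg_pose, reg_fitness, wheel_delta, mbv_threshold=0.3):
    # prev_pose, reg_pose, wheel_delta: 4x4 homogeneous transforms (numpy arrays)
    # reg_fitness: scalar registration score used as the match belief value (MBV)
    match_belief = reg_fitness >= mbv_threshold
    if match_belief:
        return reg_pose                     # registration succeeded
    return prev_pose @ wheel_delta          # degraded scene: fall back to wheel odometry

The design point is that both the initial guess for registration and the fallback pose come from IMU/wheel odometry, so a failed scan match never feeds back into IMU preintegration.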
Technical Paper

Multi-Target Tracking Method Based on Improved Radar and Camera Data Association

2023-12-20
2023-01-7048
The fusion of 4D millimeter-wave imaging radar and cameras is an important development trend in advanced driver assistance systems and autonomous driving. In multi-target tracking, tracks are easily lost due to mutual occlusion of targets in the camera view. Therefore, combining the advantages of visual sensors and 4D millimeter-wave radar, a multi-sensor information fusion association algorithm is proposed. First, the 4D millimeter-wave radar point cloud is preprocessed, outliers are removed, and target-related information in the image is detected; the point cloud is then projected onto the image, and the targets in the segmented region are filtered. The filtered point cloud is clustered, and the correlation between the region projected onto the image and the detection box is calculated. An unscented Kalman filter is then used for prediction, rules are designed to associate targets, and the innovation is updated by multi-point weighting.
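A simple way to picture the projection-and-association step is sketched below; the camera matrices, box format, and point-count criterion are generic placeholders, not the paper's exact pipeline.

import numpy as np

def project_radar_to_image(points_xyz, T_cam_radar, K):
    # points_xyz: (N, 3) radar points; T_cam_radar: 4x4 extrinsics; K: 3x3 intrinsics
    pts_h = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])
    pts_cam = (T_cam_radar @ pts_h.T).T[:, :3]
    pts_cam = pts_cam[pts_cam[:, 2] > 0]          # keep points in front of the camera
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]                 # pixel coordinates

def points_in_box(uv, box):
    # box: (x1, y1, x2, y2); the in-box point count is a crude stand-in for the
    # projected-region / detection-box correlation used for association
    x1, y1, x2, y2 = box
    inside = (uv[:, 0] >= x1) & (uv[:, 0] <= x2) & (uv[:, 1] >= y1) & (uv[:, 1] <= y2)
    return int(inside.sum())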
Technical Paper

A Method for Generating Occupancy Grid Maps Based on 4D Millimeter-Wave Radar Point Cloud Characteristics

2023-12-20
2023-01-7047
4D millimeter-wave radar is a high-resolution sensor with strong perception of the surrounding environment. This paper uses millimeter-wave radar point clouds to establish a static probabilistic occupancy grid map for static environment modeling. To obtain a clean occupancy grid map, we classify the point cloud according to the result of dynamic point clustering and project the classified point cloud into the grid map. Based on the distribution and category of the millimeter-wave radar point cloud, we propose a calculation model for the grid occupancy probability. After obtaining the occupancy probability from this model, we compute the posterior occupancy probability using the ego-vehicle motion and Bayesian filtering, and construct a stable probabilistic occupancy grid map.
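The Bayesian update itself can be written compactly in log-odds form; the sensor model that supplies p_occ (the paper's distribution- and category-dependent calculation model) is treated as an input in this sketch.

import numpy as np

def logodds_update(grid_logodds, cells, p_occ):
    # grid_logodds: 2D array of per-cell log-odds; cells: iterable of (i, j) indices
    l_meas = np.log(p_occ / (1.0 - p_occ))        # measurement in log-odds form
    for i, j in cells:
        grid_logodds[i, j] += l_meas              # fuse with the prior (Bayesian update)
    return grid_logodds

def occupancy_probability(grid_logodds):
    # convert posterior log-odds back to an occupancy probability
    return 1.0 - 1.0 / (1.0 + np.exp(grid_logodds))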
Technical Paper

A Sequential Method for Automotive Millimeter-Wave Radar Self-Calibration Based on Optimization

2023-12-20
2023-01-7044
The calibration of automotive radar installations plays a fundamental and crucial role in guaranteeing sensor performance. Commonly used methods rely on the environment, such as a dedicated test station for static calibration or a straight metal guardrail for dynamic calibration. In this paper, a sequential method for estimating the radar angle misalignment, derived from the Lagrange multiplier method for solving an optimization problem, is proposed. The sequential method, which requires radar measurements and vehicle speed measurements as input, is less dependent on the environment and can yield a consistent estimate. A simulation study is conducted to validate the consistency and to analyze the influence of noise. The results show that radar azimuth measurement noise has little influence, since its bias can be compensated and the effect of non-Gaussianity is negligible.
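For intuition, an azimuth misalignment can also be estimated with a plain least-squares fit of the stationary-target Doppler model v_radial = -v_ego * cos(azimuth + delta); this sketch assumes that model and is not the sequential Lagrange-multiplier formulation of the paper.

import numpy as np
from scipy.optimize import least_squares

def estimate_misalignment(azimuth, v_radial, v_ego):
    # azimuth, v_radial: arrays from stationary detections; v_ego: vehicle speed (m/s)
    def residual(delta):
        return v_radial + v_ego * np.cos(azimuth + delta)
    return least_squares(residual, x0=0.0).x[0]   # misalignment angle in radians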