[IEEE T-PAMI] Awesome BEV perception research and cookbook for all-level audiences in autonomous driving
TensorFlow Implementation for Computing a Semantically Segmented Bird's Eye View (BEV) Image Given the Images of Multiple Vehicle-Mounted Cameras.
[ICCV'21] NEAT: Neural Attention Fields for End-to-End Autonomous Driving
Talk2BEV: Language-Enhanced Bird's Eye View Maps (Accepted to ICRA'24)
[ITSC'23] Official codebase for the paper "Radar Enlighten the Dark: Enhancing Low-Visibility Perception for Automated Vehicles with Camera-Radar Fusion"
This repository is dedicated to 3D detection, sensor fusion, and tracking using camera and LiDAR data.
A dashboard for monitoring 1-meter social distancing in outdoor areas using computer vision.
The code used by the team of the Hanze UAS during the Self Driving Challenge 2024
Transforming multiple vehicle-mounted camera images into a bird’s-eye view (BEV), this project employs deep learning, including a U-Net backbone for semantic segmentation and a spatial transformer module.
Perspective transformation (bird's-eye) tool for easy experimentation and finding the right transformation values.
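The perspective (bird's-eye) transformation such a tool experiments with can be sketched in plain NumPy: solve for the 3x3 homography that maps a trapezoidal road region in the camera image onto a rectangular top-down canvas. The point coordinates below are illustrative placeholders, not values from any listed project.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for the 3x3 perspective (homography) matrix mapping four
    source points to four destination points via the standard direct
    linear transform (DLT) system A h = 0."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(A, dtype=float)
    # The null-space vector of A (last right-singular vector) holds
    # the nine homography coefficients.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

def warp_point(H, x, y):
    """Apply the homography to one pixel coordinate (homogeneous divide)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Placeholder example: a road trapezoid seen by a forward-facing camera
# mapped to a rectangle in the bird's-eye output image.
src = [(200, 720), (1080, 720), (760, 480), (520, 480)]
dst = [(300, 720), (980, 720), (980, 0), (300, 0)]
H = homography_from_points(src, dst)
```

With `H` in hand, a full image can be resampled with any warping routine (e.g. OpenCV's `cv2.warpPerspective`); tweaking the four source points is exactly the "finding the right transformation values" step the tool above is for.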
Geo-trax is an end-to-end pipeline to extract and analyze georeferenced vehicle trajectories from quasi-stationary drone imagery.