Bird-eye view map construction for a mobile robot based on RGB and point cloud input

RuslanAgishev/bev-net

Bird Eye View Networks (BEV-nets)

This repository contains forks of state-of-the-art (SOTA) works on local map construction for a mobile robot from sensory input. For relevant papers with code, please refer to the BEV-map construction Notion section.

An end-to-end architecture that directly extracts a bird's-eye-view semantic representation of a scene, given image data from an arbitrary number of cameras.

Given a single color image captured from a driving platform, the model predicts the bird's-eye view semantic layout of the road and other traffic participants.

Qualitative results on the KITTI and Argoverse datasets.
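As a minimal illustration of the camera-to-BEV idea (not the learned architectures collected in this repository), a pixel's viewing ray can be intersected with the ground plane to obtain its top-down location. The intrinsics and camera height below are hypothetical placeholders:

```python
import numpy as np

# Hypothetical pinhole intrinsics and mounting height; the models in this
# repository learn the image-to-BEV mapping end-to-end instead of relying
# on a fixed flat-ground homography like this one.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
CAM_HEIGHT = 1.5  # metres above the ground plane (y axis points down)

def pixel_to_bev(u, v):
    """Intersect the viewing ray of pixel (u, v) with the ground plane.

    Returns (x, z) in metres: x is lateral offset, z is distance ahead.
    Only valid for pixels below the horizon (ray pointing downward).
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    if ray[1] <= 0:          # at or above the horizon: no ground intersection
        return None
    t = CAM_HEIGHT / ray[1]  # scale the ray until it reaches y = CAM_HEIGHT
    point = t * ray
    return float(point[0]), float(point[2])

print(pixel_to_bev(320, 360))  # pixel below the image centre → (0.0, 6.25)
```

This flat-ground assumption breaks for objects with height (cars, pedestrians), which is precisely why the forked networks predict the BEV layout with learned features rather than pure geometry.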

Joint Perception and Motion Prediction for Autonomous Driving Based on Bird's Eye View Maps. In addition to semantic information, the model also predicts the motion direction of cells on the local map from a sequence of lidar sweeps.
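A common first step for such lidar-based models is to rasterize each sweep into a BEV grid. The sketch below shows one way to do this for a single binary occupancy channel; the grid extent and resolution are illustrative assumptions, and real pipelines usually stack several sweeps and height slices as input channels:

```python
import numpy as np

# Hypothetical grid parameters: a 50 m x 50 m area ahead of the robot
# at 0.25 m resolution (200 x 200 cells).
GRID_RANGE = 50.0   # metres covered along x and y
RESOLUTION = 0.25   # metres per cell
N = int(GRID_RANGE / RESOLUTION)

def sweep_to_bev(points):
    """Rasterize an (M, 3) lidar sweep into a binary BEV occupancy grid.

    Cells cover x in [0, 50) metres ahead and y in [-25, 25) metres
    to the sides; points outside that window are dropped.
    """
    grid = np.zeros((N, N), dtype=np.uint8)
    xs = (points[:, 0] / RESOLUTION).astype(int)                     # row
    ys = ((points[:, 1] + GRID_RANGE / 2) / RESOLUTION).astype(int)  # column
    valid = (xs >= 0) & (xs < N) & (ys >= 0) & (ys < N)
    grid[xs[valid], ys[valid]] = 1
    return grid

cloud = np.array([[10.0, 0.0, 0.2],    # a point 10 m ahead
                  [10.1, 0.1, 1.5],    # same cell, different height
                  [80.0, 0.0, 0.0]])   # out of range, dropped
bev = sweep_to_bev(cloud)
print(bev.sum())  # → 1  (both in-range points land in the same cell)
```

Feeding a short history of such grids to the network is what lets it estimate per-cell motion in addition to semantics.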
