Realtime data viewer and processor (in Python)
Updated Nov 3, 2024 - Python
The Laserscan and Pointcloud Combiner is a ROS 1 node that combines PointCloud2 and LaserScan input sources by selecting the nearest detection. The nearest detection is converted to the LaserScan format and published to a LaserScan topic. The package supports at most one PointCloud2 source and two LaserScan sources.
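The nearest-detection merge such a combiner performs can be sketched as an elementwise minimum over angular bins. This is an illustrative sketch, not the package's actual API: the function name and fixed `range_max` are assumptions, and it presumes both scans already share the same angular grid.

```python
import numpy as np

def combine_scans(ranges_a, ranges_b, range_max=10.0):
    """Combine two equally-sized laser scan range arrays by keeping,
    for each angular bin, the nearest (smallest) valid detection."""
    a = np.asarray(ranges_a, dtype=float)
    b = np.asarray(ranges_b, dtype=float)
    # Treat NaN, inf, or out-of-range readings as "no detection".
    a = np.where(np.isfinite(a) & (a <= range_max), a, np.inf)
    b = np.where(np.isfinite(b) & (b <= range_max), b, np.inf)
    return np.minimum(a, b)
```

In the real node a PointCloud2 source would first be projected into the same angular bins before this per-bin minimum is taken.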
Software for guidance, navigation and control for the Vortex AUVs. Purpose built for competing in AUV/ROV competitions.
Roll control system for the IREC 2020-21 rocket (Endurance) and Drag control system for the IREC 2021-22 rocket (Intrepid)
Bumperbot is an open-source 3D printed self-driving robot powered by ROS 2. Its simple design and low cost make it an excellent learning platform, featured in the "Self Driving and ROS 2 - Learn by Doing! Odometry & Control" and "Self Driving and ROS 2 - Learn by Doing! Map & Localization" courses.
Autonomous Mobile Robot developed and programmed in the online course named "Self-Driving and ROS 2 - Learn By Doing! Odometry & Control"
PyTorch Implementation of Unsupervised Depth Completion with Calibrated Backprojection Layers (ORAL, ICCV 2021)
Sensor fusion research to localize a UAV in a mapped environment
Dual Perspective Fusion Transformer for Camera-Radar-based Object Detection
Object detection on radar and RGB camera images. https://ieeexplore.ieee.org/document/9191046 Full thesis: RADAR+RGB Fusion for Robust Object Detection in Autonomous Vehicles. Zenodo. https://doi.org/10.5281/zenodo.13738235
Implement visual inertial odometry from scratch
Integration of four spectral cameras with low-level sensor fusion techniques to monitor vegetation.
Implementation of an EKF for Visual Inertial Odometry
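The predict/update cycle at the core of an EKF-based VIO filter can be sketched as below. The state layout, Jacobians `F` and `H`, and noise matrices `Q` and `R` are placeholders, not this repository's implementation.

```python
import numpy as np

def ekf_predict(x, P, F, Q):
    """Propagate state x and covariance P through a (linearized)
    motion model F with process noise Q."""
    return F @ x, F @ P @ F.T + Q

def ekf_update(x, P, z, H, R):
    """Correct the prediction with measurement z, observation
    Jacobian H, and measurement noise R."""
    y = z - H @ x                     # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

In a VIO filter, `x` would typically hold pose, velocity, and IMU biases; the prediction step integrates IMU measurements, and the update applies visual constraints such as feature reprojection errors.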
Camera-LiDAR fusion framework for detection and tracking.
A differential drive robot is controlled using ROS 2 Humble on a Raspberry Pi 4 running Ubuntu Server 22.04. The vehicle is equipped with a Raspberry Pi camera for visual feedback and an RPLIDAR A1 sensor used for Simultaneous Localization and Mapping (SLAM), autonomous navigation, and obstacle avoidance.
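The wheel odometry underlying such a differential drive robot follows standard kinematics. The sketch below is illustrative (function and parameter names are assumptions); in practice ROS 2 controllers such as `diff_drive_controller` compute this for you.

```python
import math

def update_pose(x, y, theta, v_left, v_right, wheel_base, dt):
    """Dead-reckon a differential-drive pose from wheel linear velocities
    (m/s) over a small timestep dt, using simple Euler integration."""
    v = (v_right + v_left) / 2.0              # forward velocity
    omega = (v_right - v_left) / wheel_base   # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

Equal wheel speeds drive the robot straight along its heading; opposite speeds spin it in place.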
alfred-py: A deep learning utility library for **human**, more detail about the usage of lib to: https://zhuanlan.zhihu.com/p/341446046
Measure the orientation of your robot or device using an IMU and visualize its data with this code
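A common way to estimate orientation from raw IMU data is a complementary filter, which fuses the gyro's smooth but drifting integrated rate with the accelerometer's noisy but drift-free gravity angle. This is a minimal sketch, not this repository's code; the weight `alpha` of 0.98 is an illustrative choice.

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyro rate (rad/s) with an accelerometer-derived angle (rad):
    alpha weights the integrated gyro path against the accel reference."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def accel_roll(ay, az):
    """Roll angle implied by the gravity components on the Y/Z axes."""
    return math.atan2(ay, az)
```

Called once per IMU sample, the estimate tracks fast rotations via the gyro term while the accelerometer term slowly pulls it back toward the true gravity-referenced angle.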
LiDAR-Camera Fusion for 3D Object Detection in Autonomous Driving Systems
Companion servers for Sensor Stream App. Stream sensor data, audio and images from your phone to an open source server running on your PC/Raspberry Pi in real-time over your local network.
Autonomous mobile robot system for item retrieval within a simulated environment | demonstrates an integrated approach to autonomous exploration, obstacle avoidance, item detection, retrieval, and efficient navigation back to a home zone