A BEV visualization tool for Synapse Mobility products
```shell
git clone https://github.com/synapsemobility/synapseBEV.git
cd synapsebev
pip install synapsebev
python launch_synapse_eyes.py --config_file configs/synapse_eyes/config_file.yml
```
The visualization below shows how the perception score increases when multiple robots share their perception data through Synapse Eyes.
- Blue: Ego robot
- Red: Other robots in the scene
- Grayscale: white = fully visible, black = non-visible
- Trailing shadow: visibility in regions that fall out of view decays slowly from white to black over time, since the environment is dynamic
Assumptions:
- Detection range of an individual robot: a 5×5 grid centered on the robot.
- Required perception range: a 10×10 grid centered on the robot.
- Perception score: the sum of cell values over the 10×10 grid centered on the robot; a non-visible cell contributes 0, a fully visible cell contributes 1, and partially visible cells take values in (0, 1).
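The scoring scheme above can be sketched as follows. The window sizes come from the assumptions; the function names, the decay factor, and the NumPy representation are illustrative and not the repository's implementation.

```python
import numpy as np

DECAY = 0.9      # illustrative decay rate for the trailing shadow (assumption)
DETECT_HALF = 2  # 5x5 detection window -> half-width 2
SCORE_HALF = 5   # 10x10 scoring window -> half-width 5

def step_visibility(vis, robot_positions):
    """Decay old visibility, then mark each robot's 5x5 window fully visible.

    Sharing positions from multiple robots fuses their detections into
    one visibility grid, which is what raises the perception score.
    """
    vis = vis * DECAY  # white fades toward black as time passes
    for r, c in robot_positions:
        r0, r1 = max(0, r - DETECT_HALF), min(vis.shape[0], r + DETECT_HALF + 1)
        c0, c1 = max(0, c - DETECT_HALF), min(vis.shape[1], c + DETECT_HALF + 1)
        vis[r0:r1, c0:c1] = 1.0
    return vis

def perception_score(vis, robot):
    """Sum cell values over the 10x10 window centered at the robot."""
    r, c = robot
    r0, r1 = max(0, r - SCORE_HALF), min(vis.shape[0], r + SCORE_HALF)
    c0, c1 = max(0, c - SCORE_HALF), min(vis.shape[1], c + SCORE_HALF)
    return float(vis[r0:r1, c0:c1].sum())
```

With a single robot, only its own 5×5 detection window contributes (score 25); adding a nearby teammate's shared detections raises the ego robot's score without the ego moving.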
```shell
python launch_synapse_plan.py --config_file configs/synapse_plan/config_file.yml --output_file configs/synapse_plan/output_file.yml
```
Below, two robots collaboratively plan their paths from their starting positions to their goal positions.
| Current Approach (Failure) | With Synapse (Success) |
|---|---|
| ![]() | ![]() |
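The planning behavior can be imitated with a toy grid planner: a robot can only route through cells that are visible in the fused map, so shared perception can turn a planning failure into a success. This BFS sketch is purely illustrative and is not the planner shipped with synapse_plan.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest 4-connected path on a grid; True cells are traversable.

    Returns the list of cells from start to goal, or None when the
    currently visible region contains no route.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}       # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:       # walk parent links back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] \
                    and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

In this framing, the "Failure" column corresponds to planning on one robot's own visibility (no path found), and the "Success" column to planning on the fused visibility of both robots.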


