Robopilot

Robotic systems for an unstructured world. Live Dense Multi-Modal 3D Mapping is a system for real-time 3D reconstruction that fuses data from multiple depth and camera sensors simultaneously.

Robopilot is a minimalist, modular autonomous computer vision library for Python. It is developed with a focus on fast experimentation with distributed deep neural networks. It builds on existing open-source work in machine vision, communications, and motor control, together with CUDA and the TensorFlow deep-learning framework.

Use Robopilot if you want to:

  • Experiment with point clouds, mapping, computer vision, and neural networks.
  • Log sensor data (images, user inputs, sensor readings).
  • Leverage distributed vision data.
  • Capture an object’s 3D structure from multiple viewpoints simultaneously.
  • Capture a “panoramic” 3D structure of a scene, extending the field of view of one sensor by using many.
  • Stream the reconstructed point cloud to a remote location.
  • Increase the density of a point cloud captured by a single sensor by having multiple sensors capture the same scene.
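The multi-sensor use cases above (panoramic capture, densification) reduce to transforming each sensor's point cloud into a common world frame and merging the results. A minimal NumPy sketch of that idea, assuming each sensor's 4x4 extrinsic calibration matrix is known (the matrices and cloud sizes below are illustrative, not from the library):

```python
import numpy as np

def transform_points(points, extrinsic):
    """Apply a 4x4 rigid transform to an (N, 3) point cloud."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homogeneous @ extrinsic.T)[:, :3]

def fuse_point_clouds(clouds, extrinsics):
    """Merge per-sensor clouds into one denser cloud in the world frame."""
    return np.vstack([transform_points(c, T) for c, T in zip(clouds, extrinsics)])

# Two sensors viewing the same scene from different poses.
cloud_a = np.random.rand(1000, 3)
cloud_b = np.random.rand(1000, 3)
T_a = np.eye(4)                    # sensor A sits at the world origin
T_b = np.eye(4)
T_b[0, 3] = 0.5                    # sensor B is offset 0.5 m along x
fused = fuse_point_clouds([cloud_a, cloud_b], [T_a, T_b])
print(fused.shape)                 # (2000, 3): twice the density of one sensor
```

A production pipeline would additionally refine the extrinsics (e.g. with ICP) and voxel-downsample the merged cloud, but the frame-transform-then-concatenate core is the same.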

Test Platform

  • Nvidia TX1 (x2)
  • RedCat Crawler 1/5 (x1)
  • Xbox Kinect for PC (x2)
  • Intel RTF Drone (x1)

Get Piloting

After building a Robopilot, turn on your device and go to http://localhost:8887 to pilot.

Modify your device behavior

The robopilot device is controlled by running a sequence of events:

# Define a vehicle to take and record pictures 10 times per second.
import time
from robopilot import Vehicle
from robopilot.parts.cv import CvCam
from robopilot.parts.tub_v2 import TubWriter

V = Vehicle()

IMAGE_W = 160
IMAGE_H = 120
IMAGE_DEPTH = 3

# Add a camera part.
cam = CvCam(image_w=IMAGE_W, image_h=IMAGE_H, image_d=IMAGE_DEPTH)
V.add(cam, outputs=['image'], threaded=True)

# Warm up the camera.
while cam.run() is None:
    time.sleep(1)

# Add a tub part to record images.
tub = TubWriter(path='./dat', inputs=['image'], types=['image_array'])
V.add(tub, inputs=['image'], outputs=['num_records'])

# Start the drive loop at 10 Hz.
V.start(rate_hz=10)
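Each part in the loop above follows the same convention: an object whose run() (or run_threaded()) method consumes the values named in inputs and returns the values named in outputs. A sketch of a custom part written against that convention (the Timestamper name and its wiring are illustrative assumptions, not part of the library):

```python
import time

class Timestamper:
    """Illustrative custom part: tags each image with a capture timestamp."""

    def run(self, image):
        # Called once per loop iteration with the current 'image' value;
        # the returned tuple maps onto the outputs declared in V.add().
        return image, time.time()

    def shutdown(self):
        # Called when the vehicle loop stops; release resources here.
        pass

# Wired into the loop like any other part (assuming the vehicle above):
# V.add(Timestamper(), inputs=['image'], outputs=['image', 'timestamp'])
```

Because parts only communicate through these named channels, they can be developed and tested in isolation by calling run() directly.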
