
3D Object Pose Estimation - ROS - RealSense D435

Development Environment

  • Ubuntu 16.04.2 - ROS Kinetic

  • Ubuntu 18.04.1 - ROS Melodic


How to get the dataset objects in the real world

The Datasets_obj folder contains the object textures. Print a texture and attach it to a box or can of the exact size to reproduce the object in the real world.

How to make custom synthetic data using UE4

[Bilibili_Demo]

NVIDIA Deep Learning Dataset Synthesizer (Synthetic-Data-UE4)

DOPE Installing

Step 1: Download the DOPE code

cd ~/catkin_ws/src
git clone https://github.com/yehengchen/DOPE-ROS-D435.git

Step 2: Install python dependencies

cd ~/catkin_ws/src/dope
pip install -r requirements.txt

Step 3: Install ROS dependencies

cd ~/catkin_ws
rosdep install --from-paths src -i --rosdistro kinetic
sudo apt-get install ros-kinetic-rosbash ros-kinetic-ros-comm
Build

cd ~/catkin_ws
catkin_make

Step 4: Download the weights and save them to the weights folder, i.e., ~/catkin_ws/src/dope/weights/.
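A minimal shell sketch for this step (the weight filename in the comment is only a placeholder; use the files you actually downloaded):

```shell
# Create the weights folder expected by DOPE, then move the downloaded
# weight files into it. "cracker_60.pth" below is only an example name.
mkdir -p ~/catkin_ws/src/dope/weights
# mv ~/Downloads/cracker_60.pth ~/catkin_ws/src/dope/weights/
```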


ROS Wrapper for Intel® RealSense D435 - Ubuntu 16.04_ROS Kinetic

Step 1: Install the latest Intel® RealSense™ SDK 2.0

Install from a Debian package (in that case, treat yourself as a developer and make sure you follow the instructions to also install the librealsense2-dev and librealsense-dkms packages), OR build from source by downloading the latest Intel® RealSense™ SDK 2.0 and following the instructions under Linux Installation.

Step 2: Install the ROS distribution (ROS Kinetic, on Ubuntu 16.04)

Step 3: Install Intel® RealSense™ ROS from Sources

cd ~/catkin_ws/src/

Clone the latest Intel® RealSense™ ROS from here into 'catkin_ws/src/'

git clone https://github.com/IntelRealSense/realsense-ros.git
cd realsense-ros/
git checkout `git tag | sort -V | grep -P "^\d+\.\d+\.\d+" | tail -1`
cd ..
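The `git checkout` one-liner above selects the highest release tag. `sort -V` performs a version-aware sort, which matters when comparing versions such as 2.0.10 against 2.0.4; a quick illustration with made-up version strings:

```shell
# Version-aware sort: 2.0.10 sorts after 2.0.4, even though a plain
# lexicographic sort would order it first.
printf '2.0.4\n2.1.0\n2.0.10\n' | sort -V | tail -1   # prints 2.1.0
```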

Make sure all dependent packages are installed. You can check the .travis.yml file for reference. In particular, make sure that the ROS package ddynamic_reconfigure is installed. If ddynamic_reconfigure cannot be installed using APT, you may clone it into your workspace 'catkin_ws/src/' from here (version 0.2.0).

catkin_init_workspace
cd ..
catkin_make clean
catkin_make -DCATKIN_ENABLE_TESTING=False -DCMAKE_BUILD_TYPE=Release
catkin_make install
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc

ROS Wrapper for Intel® RealSense D435 - Ubuntu 18.04_ROS Melodic

  • The steps are described in the documentation below:

    [IntelRealSense -Linux Distribution]

    
    sudo apt-key adv --keyserver keys.gnupg.net --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE || sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE  
    
    sudo add-apt-repository "deb http://realsense-hw-public.s3.amazonaws.com/Debian/apt-repo bionic main" -u
    
    sudo apt-get install librealsense2-dkms
    
    sudo apt-get install librealsense2-utils
    
    sudo apt-get install librealsense2-dev
    
    sudo apt-get install librealsense2-dbg  # (verify that the RealSense packages installed correctly)
    
    realsense-viewer
    
    
  • Installing Realsense-ros

    1. Create a catkin workspace
    mkdir -p ~/catkin_ws/src
    cd ~/catkin_ws/src/
    
    2. Download the realsense-ros pkg
    git clone https://github.com/IntelRealSense/realsense-ros.git
    cd realsense-ros/
    git checkout `git tag | sort -V | grep -P "^\d+\.\d+\.\d+" | tail -1`
    
    3. Download ddynamic_reconfigure
    cd ~/catkin_ws/src
    git clone -b kinetic-devel https://github.com/pal-robotics/ddynamic_reconfigure.git
    
    4. Pkg installation
    catkin_init_workspace
    cd ~/catkin_ws
    catkin_make clean
    catkin_make -DCATKIN_ENABLE_TESTING=False -DCMAKE_BUILD_TYPE=Release
    catkin_make install
    echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
    source ~/.bashrc
    
    5. Run the D435 node
    roslaunch realsense2_camera rs_camera.launch
    
    6. Run rviz for testing
    rosrun rviz rviz
    Add > Image to view the raw RGB image
    

Running

1. Start ROS master

cd ~/catkin_ws
source devel/setup.bash
roscore

2. Start camera node (or start your own camera node)

RealSense D435 & usb_cam node topics (set in ./dope/config/config_pose.yaml):

topic_camera: "/camera/color/image_raw"            #"/usb_cam/image_raw"
topic_camera_info: "/camera/color/camera_info"     #"/usb_cam/camera_info"

Start camera node:

roslaunch realsense2_camera rs_rgbd.launch  # Publishes RGB images to `/camera/color/image_raw`

3. Start DOPE node

roslaunch dope dope.launch [config:=/path/to/my_config.yaml]  # Config file is optional; default is `config_pose.yaml`

4. Start rviz node

rosrun rviz rviz

Debugging

  • The following ROS topics are published (assuming topic_publishing == 'dope'):
    /dope/webcam_rgb_raw       # RGB images from camera
    /dope/dimension_[obj_name] # dimensions of object
    /dope/pose_[obj_name]      # timestamped pose of object
    /dope/rgb_points           # RGB images with detected cuboids overlaid
    /dope/detected_objects     # vision_msgs/Detection3DArray of all detected objects
    /dope/markers              # RViz visualization markers for all objects
*Note:* `[obj_name]` is in {cracker, gelatin, meat, mustard, soup, sugar}
  • To debug in RViz, run rviz, then add one or more of the following displays:

    • Add > Image to view the raw RGB image or the image with cuboids overlaid
    • Add > Pose to view the object coordinate frame in 3D.
    • Add > MarkerArray to view the cuboids, meshes etc. in 3D.
    • Add > Camera to view the RGB Image with the poses and markers from above.
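The per-object topic names above follow a fixed pattern; as an illustration (the loop below is ours, not part of DOPE), the full set of pose and dimension topics can be generated from the object list:

```shell
# Expand the [obj_name] placeholder into the concrete DOPE topic names.
for obj in cracker gelatin meat mustard soup sugar; do
  echo "/dope/pose_${obj}  /dope/dimension_${obj}"
done
```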

Demo

[Bilibili_Demo]

Citing

If you use this tool in a research project, please cite as follows:

@inproceedings{tremblay2018corl:dope,
 author = {Jonathan Tremblay and Thang To and Balakumar Sundaralingam and Yu Xiang and Dieter Fox and Stan Birchfield},
 title = {Deep Object Pose Estimation for Semantic Robotic Grasping of Household Objects},
 booktitle = {Conference on Robot Learning (CoRL)},
 url = "https://arxiv.org/abs/1809.10790",
 year = 2018
}

License

Copyright (C) 2018 NVIDIA Corporation. All rights reserved. Licensed under the CC BY-NC-SA 4.0 license.

Reference

Deep Object Pose Estimation (DOPE) - ROS inference (CoRL 2018)

About

Object 6DoF Pose Estimation for Assembly Robots Trained on Synthetic Data - ROS Kinetic/Melodic Using Intel® RealSense D435
