
Robot-Arm-for-Sorting-Mechanism-using-ROS-and-YOLOv4

This project builds a robot arm that sorts objects using Darknet (YOLOv4), with the addition of a digital twin.

Idea: sorting socks by color using a robot arm from a conveyor system.

Software tools:

  1. YOLOv4 neural network model - predicts the sock color
  2. ROS (Robot Operating System) - controls the robot arm and other apparatus
  3. MoveIt - path planning, kinematic definitions and visualisation

Contents:

  1. Introduction
  2. Requirements for Raspberry Pi (robot control)
  3. Requirements for the ROS master PC (path planning)
  4. Requirements for the ML workstation (sock prediction with the YOLOv4 model) on an RTX 3060
  5. Our own custom Dataset
  6. Procedure for building the YOLO architecture
  7. Procedure for building the MoveIt architecture
  8. Procedure for the robot Raspberry Pi
  9. Running in a Docker container (for the Niryo Ned robot)
  10. Demo

Process plan

1. Introduction

Sorting is a huge task in both homes and industries. This project is an open-source DIY sock-sorting robot: it predicts whether a sock is black or white, grabs it using OpenCV, performs the required path planning with MoveIt (a ROS Noetic plugin), and places it in the correct box. The entire architecture is explained in the image below:

2. Requirements for Raspberry Pi (robot control)

  • Bare (flashed) Ubuntu OS on a Raspberry Pi 3/4 (terminal version)
  • ROS Noetic for the Raspberry Pi: link
  • Required dependencies for driving the PCA9685: link (a minimal servo test sketch follows this list)
  • Sanity check using the built server-receiver nodes
  • Correct wiring of the hardware
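If the PCA9685 dependencies are installed, a quick servo test can be run before wiring up the whole arm. This is a minimal sketch, assuming the Adafruit CircuitPython ServoKit library (adafruit-circuitpython-servokit) is used to drive the PCA9685; the channel number and angles are placeholders, not values from this repository.

# pca9685_sanity_check.py - sweep one servo channel to confirm wiring and power
import time
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)          # the PCA9685 exposes 16 PWM channels

for angle in (0, 90, 180, 90):       # sweep channel 0 between a few angles
    kit.servo[0].angle = angle
    time.sleep(1)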

After this, assemble the DIY robot from Joy-It: https://joy-it.net/en/products/Robot02

3. Requirements for the ROS master PC (path planning)

4. Requirements for the ML workstation (sock prediction with the YOLOv4 model) on an RTX 3060

  • CMake >= 3.18
  • OpenCV
  • NVIDIA GPU driver 470.42.01
  • CUDA 11.4
  • nvcc
  • YOLOv4 (Darknet) architecture

5. Our own custom Dataset

https://www.kaggle.com/datasets/harigovindasamy/socks-color-dataset-white-and-black

6. Procedure for building the YOLO architecture:

  1. Disable Secure Boot with: sudo mokutil --disable-validation
    • Check your Secure Boot status (enabled/disabled) with: mokutil --sb-state (this needs: sudo apt-get install mokutil)
  2. GUI version: download and install CMake >= 3.18 from the Ubuntu store
  3. Make sure the correct OpenCV, drivers, CUDA and nvcc versions are installed
  4. Pull the updated version of YOLOv4 from: https://github.com/AlexeyAB
  5. Copy our custom pretrained models from the folder "scripts/socks_model/Robot" to the YOLO directory
  6. Build using the Makefile; for further information about building, visit: https://github.com/AlexeyAB
  7. Copy the required scripts from "scripts/for_ml_inference" to the workstation (a minimal inference sketch follows this list)
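The actual inference code lives in "scripts/for_ml_inference"; as a quick check that the trained weights load and detect socks, a minimal sketch using OpenCV's dnn module is shown below. The cfg/weights/image file names are placeholders, and the class-id-to-color mapping depends on the names file used during training.

# quick_inference.py - single-image YOLOv4 inference via OpenCV's dnn module
import cv2

net = cv2.dnn.readNetFromDarknet("yolov4-socks.cfg", "yolov4-socks.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

image = cv2.imread("sock.jpg")
class_ids, scores, boxes = model.detect(image, confThreshold=0.25, nmsThreshold=0.4)
for class_id, score, box in zip(class_ids, scores, boxes):
    print(class_id, score, box)      # class id maps to black/white per the names file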

Our sock prediction model is now ready, so we can move on to ROS master path planning.

7. Procedure for building the MoveIt architecture:

  1. First enable the required repositories:
# For enabling each repository type:
sudo add-apt-repository universe
sudo add-apt-repository multiverse
sudo add-apt-repository restricted
  2. Copy our "urdf_and_mesh_models_for_moveit" to the ROS workspace on the ROS master PC, e.g. "home/username/catkin_workspace/src/"
  3. Run roslaunch moveit_setup_assistant setup_assistant.launch
  4. Select the URDF file from "urdf_and_mesh_models_for_moveit/Aura_robot/urdf/Aura_robot.urdf"; the meshes will be loaded and the robot will be visible on the right side
  5. Build the MoveIt architecture (the naming convention should be neat and consistent for the whole process)
  6. Copy the required code from "Robot-Arm-for-Sorting-Mechanism-using-ROS-and-YOLOv4\scripts\for_master" to the catkin workspace and build it (a minimal motion test is sketched after this list)
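Once the MoveIt package has been generated and the workspace built, the arm can be moved from Python through the moveit_commander API. This is a minimal sketch, not the repository's "for_master" scripts; the planning group name "arm" is an assumption and must match whatever you named the group in the Setup Assistant.

# move_smoke_test.py - plan and execute a small joint-space motion with moveit_commander
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("sorting_arm_smoke_test")

group = moveit_commander.MoveGroupCommander("arm")   # planning group from the Setup Assistant

joints = group.get_current_joint_values()
joints[0] += 0.1                                     # nudge the first joint by 0.1 rad
group.go(joints, wait=True)
group.stop()                                         # clear any residual movement

moveit_commander.roscpp_shutdown()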

The workflow of how MoveIt works from the URDF is explained here:

8. Procedure for the robot Raspberry Pi:

  1. Sanity check that all dependencies for the PCA9685 hardware are already installed and the electrical connections are correct
  2. Copy "Robot-Arm-for-Sorting-Mechanism-using-ROS-and-YOLOv4\scripts\robot_pi" to the Raspberry Pi (a sketch of such a node follows this list)

9. Running in a Docker container (for the Niryo Ned robot)

Here we have created a Docker container with the models, the scripts and all the dependencies needed to run this architecture:

$ xhost local:docker
$ docker build . -t <name_for_docker_container>
$ docker run -it --rm --privileged --env="DISPLAY" --volume="/tmp/.X11-unix:/tmp/.X11-unix:rw" --device="/dev/video0:/dev/video0" cdl:socks_storing2 python3 inference/only_camera_inference.py

10. Demo

Connecting two ROS systems:

#ip of master: 192.168.0.136
#ip of pi: 192.168.0.162

On the master:
export ROS_MASTER_URI=http://192.168.0.136:11311
export ROS_IP=192.168.0.136

On the slave:
export ROS_MASTER_URI=http://192.168.0.136:11311
export ROS_IP=192.168.0.162

# Checking if the connection is successful:
On the master:
	rosrun rospy_tutorials talker.py
	output: <some msg>
On the slave:
	rosrun rospy_tutorials listener.py
	output: <received msg>
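The talker/listener pair above comes from the rospy_tutorials package. If that package is not installed, an equivalent connectivity check is sketched below (the "chatter" topic name matches the tutorial; run it with "talk" on the master and with no argument on the Pi).

# chatter_check.py - minimal cross-machine publish/subscribe test
import sys
import rospy
from std_msgs.msg import String

def talk():
    pub = rospy.Publisher("chatter", String, queue_size=10)
    rospy.init_node("talker")
    rate = rospy.Rate(1)                  # publish once per second
    while not rospy.is_shutdown():
        pub.publish("hello from %s" % rospy.get_name())
        rate.sleep()

def listen():
    rospy.init_node("listener")
    rospy.Subscriber("chatter", String, lambda msg: rospy.loginfo(msg.data))
    rospy.spin()

if __name__ == "__main__":
    talk() if len(sys.argv) > 1 and sys.argv[1] == "talk" else listen()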

Add these lines to .bashrc on all PCs and Pis:

## Custom added
source /opt/ros/noetic/setup.bash
source ~/catkin_tutorials/devel/setup.bash
export ROS_WORKSPACE=~/catkin_tutorials

# connecting the device to ROS
export ROS_MASTER_URI=http://192.168.0.136:11311
export ROS_IP=192.168.0.162

Starting the process:

Sample Run:

  1. Register all ROS devices on the new Wi-Fi (use 'sudo nano ~/.bashrc') [add the .bashrc lines shown above]

on master:

  1. run roscore
  2. run

on slave: 1.

Working Pictures

The output of the image/camera feed would be like this:

Working video

Y2Mate.is.-.Socks_sorting_robot_CDL-MINT-zcNAyf5Y-F0-720p-1656061640688.mp4
