
IA-RM

Anthropomorphic arm for assistance purposes controlled via a mobile app and object recognition.


Robotics, language and planning project from the Universitat Autònoma de Barcelona.


Hardware scheme

Components used:

  • I2C controller: connected to the servos.
  • Raspberry Pi camera: connected to the Raspberry Pi.
  • Raspberry Pi: connected to the I2C controller, the camera, and a USB battery.
  • Battery: powers the I2C controller.
  • USB battery: powers the Raspberry Pi.
  • Servos: connected to the I2C controller; they drive the joints of the arm.
  • Ultrasonic sensor: connected to the I2C controller; it estimates the height of the object.
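As an illustration of how an ultrasonic reading can be turned into an object height, here is a minimal sketch. The function names and the 20 cm mounting height are assumptions for the example, not values taken from the project:

```python
SPEED_OF_SOUND_CM_S = 34300  # speed of sound in air, cm/s

def echo_to_distance_cm(echo_time_s: float) -> float:
    """Convert the round-trip echo time of an ultrasonic ping to a distance."""
    return echo_time_s * SPEED_OF_SOUND_CM_S / 2  # divide by 2: out and back

def object_height_cm(echo_time_s: float, sensor_height_cm: float = 20.0) -> float:
    """Estimate object height with a downward-facing sensor mounted above the table."""
    return sensor_height_cm - echo_to_distance_cm(echo_time_s)

# A 1 ms round trip ≈ 17.15 cm down to the object's top → object ≈ 2.85 cm tall
print(round(object_height_cm(0.001), 2))  # → 2.85
```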

[Image: hardware scheme]

Software architecture

All the software is written in Python and runs on the Raspberry Pi.

Object recognition

The model is able to detect the following objects:

  • Banana
  • Orange
  • Apple
  • Glass
  • TV remote

Additionally, it can detect glasses containing different types of pills.

The dataset was created manually with the labelme annotation tool. The Roboflow API was then used for data augmentation, applying transformations such as vertical and horizontal flips, rotations, and zooms.

Two models were tested for object detection: YOLOv5 and the Mask R-CNN implementation provided by Detectron2. YOLOv5 was chosen because it performs better on the pill-recognition problem.
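Once the model runs, its detections still have to be filtered and matched to the object the user requested. The sketch below assumes YOLOv5's usual per-detection format `(x1, y1, x2, y2, confidence, class_id)`; the class-ID mapping and function names are illustrative assumptions, not the project's actual code:

```python
# Assumed class-ID mapping for the objects listed above (illustrative only)
CLASS_NAMES = {0: "banana", 1: "orange", 2: "apple", 3: "glass", 4: "tv_remote"}

def filter_detections(detections, min_conf=0.5):
    """Keep only detections above the confidence threshold."""
    return [d for d in detections if d[4] >= min_conf]

def find_target(detections, target_name, min_conf=0.5):
    """Return the highest-confidence detection of the requested object, or None."""
    candidates = [d for d in filter_detections(detections, min_conf)
                  if CLASS_NAMES.get(int(d[5])) == target_name]
    return max(candidates, key=lambda d: d[4]) if candidates else None

dets = [(10, 10, 50, 50, 0.91, 0),   # confident banana
        (60, 20, 90, 80, 0.42, 2),   # low-confidence apple, filtered out
        (5, 5, 30, 30, 0.77, 0)]     # second, weaker banana
print(find_target(dets, "banana"))   # → (10, 10, 50, 50, 0.91, 0)
```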

[Image: YOLOv5 detection results]

Example of the ground truth vs our predictions:

The training process can be inspected in the SimulatedObjectsYOLOv5.ipynb notebook.

Mobile application

The mobile application lets the user select the object to be moved and the area where it should be placed. It also provides feedback about the objects in the work area and the robot's status.

The app is developed with Flutter, a framework based on the Dart programming language. Communication between the app and the Python code that controls the robot is handled with Flask.
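A minimal sketch of what the Flask side of that communication could look like. The endpoint names, JSON shapes, and the in-memory state are assumptions for illustration, not the project's actual API:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In the real system this state would come from the vision module and the arm;
# the structure here is an illustrative assumption.
robot_state = {"status": "idle", "objects": ["banana", "apple"]}

@app.route("/status")
def status():
    """Feedback consumed by the Flutter app: detected objects and robot status."""
    return jsonify(robot_state)

@app.route("/pick", methods=["POST"])
def pick():
    """The app posts the object the robot should approach."""
    target = request.get_json().get("object")
    if target not in robot_state["objects"]:
        return jsonify({"error": "unknown object"}), 404
    robot_state["status"] = f"picking {target}"
    return jsonify({"ok": True, "status": robot_state["status"]})

# On the Raspberry Pi the server would be started with something like:
# app.run(host="0.0.0.0", port=5000)
```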

[Images: mobile application screens]

Kinematics

[Image: robot axes]

Since the robot has 5 axes and therefore a huge number of possible solutions, a geometric kinematics approach was developed instead of an algebraic one; it is faster and provides a better user experience.
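To illustrate the geometric idea, here is a sketch of inverse kinematics for a simplified 2-link planar arm using the law of cosines (the real arm has 5 axes; the link lengths and elbow-down choice here are assumptions for the example):

```python
import math

def ik_2link(x, y, l1=10.0, l2=10.0):
    """Geometric inverse kinematics for a 2-link planar arm.

    Returns (shoulder, elbow) angles in radians for an elbow-down solution.
    Link lengths are illustrative, not the real arm's dimensions.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle directly from the target distance.
    cos_elbow = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1 <= cos_elbow <= 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    # Shoulder = angle to the target minus the offset caused by the bent elbow.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Arm fully extended along x: both joint angles are zero
print(ik_2link(20, 0))  # → (0.0, 0.0)
```

A geometric solution like this is closed-form, so it evaluates in constant time, which is why it tends to feel more responsive than iterative algebraic solvers.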

Additionally, a linear movement mode was developed using the line equation, so the gripper approaches the object along a straight path for better and safer grasping.
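The line-equation idea can be sketched as sampling intermediate waypoints between the current and target positions (the function name and the vertical-approach example are assumptions):

```python
def linear_path(start, end, steps=5):
    """Sample points on the straight line between two 3-D positions, so the
    gripper moves along a predictable path instead of an arbitrary joint arc."""
    return [tuple(s + (e - s) * t / steps for s, e in zip(start, end))
            for t in range(steps + 1)]

# Descend vertically from above the object down to the grasp position
print(linear_path((0.0, 0.0, 10.0), (0.0, 0.0, 0.0), steps=2))
# → [(0.0, 0.0, 10.0), (0.0, 0.0, 5.0), (0.0, 0.0, 0.0)]
```

Each waypoint would then be passed through the inverse kinematics, so the end effector tracks the line even though the joints move nonlinearly.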

Simulation strategy

The simulator used is CoppeliaSim. Two main simulations have been developed:

  • Manual: the user selects through the application which object to move and where to place it.
  • Object recognition: the objects are detected automatically by the vision module, and the user decides which object the robot approaches.

Furthermore, an Android emulator is used to run the mobile application, which provides the interaction with the robot.

Images from the simulation scene:

[Image: simulation scene]

Video demonstration


Authors

  • Miquel Romero Blanch
  • Gerard Graugés Bellver
  • Guillem Martínez Sánchez
  • Oriol Graupera Serra
