This is the repo for team hb_1134's submission to the eYRC competition, and serves as documentation for our project. The project's purpose was to simulate a holonomic drive by building a 3-wheel omni-directional bot. The simulation was achieved with the help of Gazebo and ROS.
If you're interested in helping us improve our project, find out how to contribute below.
Report Bug
·
Request Feature
Table of Contents
Cities, by nature, are constantly under development. In the future, the process of achieving the ever-demanding upgrades to a city's infrastructure will be automated. Such automation can lead to shorter construction times, little to no manual labour, and lower execution costs. With more time, energy and finance to spare, this automation will be empowered by technology and the imagination of the mind, and leave room for artistic expression. Humans have an inherent need to make sense of the world around them, understand it and then create something new out of it. For a more inclusive city, this is even more true. Demarcations on the road, signage on billboards and bus stops, and well-landscaped gardens provide order to the chaos and create an environment where the inhabitants can connect to their primal need to belong.
Keeping the above scenario in mind, in eYRC 2022-23 we were presented with the theme HolA Bot (HB)! HolA Bot is short for Holonomic Art Bot. As the full name suggests, this theme contained two major components to it: Holonomic Drive Bot and Art!
In this theme, our team had to deploy a holonomic robot in a simulated environment. Unlike the more common differential drive robots, holonomic drive robots can control all three degrees of freedom available in a plane (translation along the x and y axes, and rotation about the z-axis). This gives the robot the ability to make art that would otherwise not be possible with the usual two-wheeled differential drive robot. This ability is demonstrated by the following video/gif of the KUKA omniMove:
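The three planar degrees of freedom come from the wheel layout: each omni wheel contributes one constraint, and with three wheels spaced around the chassis any body-frame twist (vx, vy, wz) maps to a unique set of wheel speeds. Below is a minimal sketch of that inverse kinematics, assuming a symmetric layout with wheels 120 degrees apart; the angles and chassis radius are illustrative, not the bot's actual dimensions.

```python
import math

# Assumed wheel placement for a symmetric 3-wheel omni base:
# wheels 120 degrees apart, first wheel at 90 degrees (+y side).
WHEEL_ANGLES = [math.radians(a) for a in (90, 210, 330)]
L = 0.1  # distance from chassis centre to each wheel (m), illustrative

def inverse_kinematics(vx, vy, wz):
    """Map a body-frame twist (vx, vy, wz) to three wheel surface speeds."""
    return [
        -math.sin(a) * vx + math.cos(a) * vy + L * wz
        for a in WHEEL_ANGLES
    ]

# Sanity check: pure rotation spins every wheel at the same speed L * wz.
print(inverse_kinematics(0.0, 0.0, 1.0))  # all three speeds equal 0.1
```

Because the 3x3 mapping is invertible, any combination of sideways, forward and rotational motion is reachable at once, which is exactly what a differential drive cannot do.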
This is an example of how you may set up the project locally. To get a local copy up and running, follow these simple steps.
You will need the following installed before using the software:
- ROS
- OpenCV
- Clone the repo in your ROS-workspace
cd ~/catkin_ws/src
git clone git@github.com:atom-robotics-lab/eYRC_HB_1134.git
- Compile the package
cd ~/catkin_ws
catkin_make
- Source the setup script
source ~/catkin_ws/devel/setup.bash
Below are examples of how the project can be used. Additionally, you may check the resources added below.
- For running Task01
  roslaunch Task02 gazebo.launch
  rosrun Task02 controller.py
- For running Task02
  roslaunch Task02 gazebo.launch
  rosrun Task02 feedback.py
  rosrun Task02 controller.py
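The controller node above drives the bot toward goal poses using feedback on its current pose. As a rough, ROS-free sketch of what such a loop computes (the gains, goal handling, and frame conventions here are assumptions, not the team's actual controller.py):

```python
import math

# Illustrative proportional gains; the real node's tuning may differ.
KP_LIN, KP_ANG = 1.5, 2.0

def compute_twist(pose, goal):
    """P-controller: body-frame twist (vx, vy, wz) that moves `pose` toward `goal`.

    pose, goal: (x, y, theta) in the world frame.
    """
    ex, ey = goal[0] - pose[0], goal[1] - pose[1]
    # Wrap the heading error into [-pi, pi].
    etheta = math.atan2(math.sin(goal[2] - pose[2]),
                        math.cos(goal[2] - pose[2]))
    # Rotate the world-frame position error into the robot's body frame,
    # so the holonomic base can translate directly toward the goal while
    # simultaneously correcting its heading.
    c, s = math.cos(pose[2]), math.sin(pose[2])
    vx = KP_LIN * (c * ex + s * ey)
    vy = KP_LIN * (-s * ex + c * ey)
    wz = KP_ANG * etheta
    return vx, vy, wz

print(compute_twist((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))  # moves straight ahead
```

In the actual node, a twist like this would be published each cycle (e.g. on a `cmd_vel`-style topic) using the pose reported back by the feedback node.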
- task01
- task02
- hardware implementation
- testing individual parts and components
- making bot chassis
- printing holonomic wheels
- fabricating power delivery, daughterboard pcb
- final testing
See the open issues for a full list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
For more info, refer to contributing.md
Our Socials - Linktree - atom@inventati.org
eYRC_HB_1134 Link: https://github.com/atom-robotics-lab/eYRC_HB_1134