Segmentation and Pose estimation using YOLOv8 for SHARPER Night 2023

SHARPER Night 2023 Logo

This repo contains the YOLOv8 demonstration given during the SHARPER Night 2023 event.


What is SHARPER

SHARPER (SHAring Researchers’ Passion for Enhanced Roadmaps) aims to involve all citizens in discovering the profession of researcher and the role that researchers play in building the future of society: an investigation of the world based on facts and observations, and the ability to adapt to and interpret increasingly complex and constantly evolving social and cultural contexts. More specifically, this repo contains the YOLOv8 demonstration given during the SHARPER Night event on 29 September 2023 in Catania.

What we presented

The PeRCeiVe Lab introduced some of its research projects to the public audience:

  • XrayGPT: GPT applied to X-ray images, with the capability to hold a conversation with an AI agent acting as a doctor;
  • Stable Diffusion: diffusion models capable of generating images from text prompts;
  • MIDGARD: an open-source simulator for autonomous robot navigation in outdoor unstructured environments;
  • YOLOv8: an architecture for object detection and tracking, instance segmentation, image classification and pose estimation.

The YOLOv8 demo

Thanks to the interest shown by the public in the segmentation and pose estimation tasks, we have created a ready-to-use version of YOLOv8 that anyone can download and play with at home. DISCLAIMER: This obviously does not mean that we are the creators of the architecture.
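
If you are curious about what the demo does under the hood, here is a minimal sketch (not the exact code of our scripts) that runs the pretrained YOLOv8 segmentation and pose models on a single image with the Ultralytics API; the image path is just a placeholder.

# Minimal sketch of the two tasks shown in the demo, not the exact code of our scripts.
from ultralytics import YOLO

seg_model = YOLO("yolov8n-seg.pt")    # pretrained instance segmentation weights
pose_model = YOLO("yolov8n-pose.pt")  # pretrained pose estimation weights

image = "example.jpg"  # placeholder path: any local image works

seg_results = seg_model(image)    # masks and boxes
pose_results = pose_model(image)  # keypoints and boxes

# Render the predictions as annotated images (BGR NumPy arrays).
seg_annotated = seg_results[0].plot()
pose_annotated = pose_results[0].plot()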

How to run the YOLOv8 demo

To run the YOLOv8 demonstration you have to:

  • Download this repo and extract all the files in your current working directory (e.g., C:\Users\Username). The repo contains the Conda environment file (YAML) needed to run YOLOv8 and the YOLOv8 scripts (in both WebUI and CLI versions).
  • Download and install the Anaconda Distribution, a cross-platform environment and package management system. It allows you to create, delete, update, clone, import and export environments, and to install, uninstall, search and update packages while resolving dependencies for you (if you need a more detailed guide, use THIS).

After launching Conda, you have to:

  • Import the sharpernight2023 environment. In our case, sharpernight2023.yml, located in C:\Users\Username, contains all the packages required to create and reproduce the environment. Run the following command to create it:
conda env create -f sharpernight2023.yml

At the end of the process you will be able to run the YOLOv8 scripts.
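
As an optional sanity check (our suggestion, not a required step), you can activate the environment and verify that the main dependencies used by the demo import correctly:

# Optional sanity check: run this inside the activated sharpernight2023 environment.
import cv2
import streamlit
import ultralytics

print("OpenCV:", cv2.__version__)
print("Streamlit:", streamlit.__version__)
print("Ultralytics:", ultralytics.__version__)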

Run the demo

The demonstration comes in two flavours:

  • WebUI using Streamlit
  • Command Line Interface (CLI)

To run the Streamlit WebUI version

  • Activate the sharpernight2023 environment by running the command:
conda activate sharpernight2023
  • Move to the yolo_streamlit directory by running the command:
cd yolo_streamlit
  • Run the script main.py by running the command:
streamlit run main.py --theme.base dark

At the end of the process, the default browser will open with the Streamlit application. Select the camera device to use and start capturing!
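
Below is a simplified sketch of the idea behind the WebUI, not the actual yolo_streamlit/main.py; the camera indices offered in the select box and the model weight files are assumptions.

# Simplified sketch of the WebUI idea, not the actual yolo_streamlit/main.py.
import cv2
import streamlit as st
from ultralytics import YOLO

st.title("YOLOv8 segmentation and pose demo")

# Let the user pick a camera index (0, 1 and 2 are just assumed candidates).
camera_index = st.selectbox("Camera device", options=[0, 1, 2])

seg_model = YOLO("yolov8n-seg.pt")
pose_model = YOLO("yolov8n-pose.pt")

seg_placeholder = st.empty()
pose_placeholder = st.empty()

if st.button("Start capturing"):
    cap = cv2.VideoCapture(camera_index)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Run both models on the same frame and show the annotated results.
        seg_placeholder.image(seg_model(frame)[0].plot(), channels="BGR", caption="Segmentation")
        pose_placeholder.image(pose_model(frame)[0].plot(), channels="BGR", caption="Pose estimation")
    cap.release()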

To run the CLI version

  • Activate the sharpernight2023 environment by running the command:
conda activate sharpernight2023
  • Move to the yolo_cli directory by running the command:
cd yolo_cli
  • Run the script main.py by running the command:
python main.py
  • The script will show a list of the available camera devices; select the one to use with YOLOv8 by typing the corresponding device number. The script will then run, and two separate windows will show the segmentation and pose estimation tasks.
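
For reference, here is a simplified sketch of the CLI idea, not the actual yolo_cli/main.py; the model weight files and the quit key are assumptions.

# Simplified sketch of the CLI idea, not the actual yolo_cli/main.py.
import cv2
from ultralytics import YOLO

# The real script first lists the available devices; here we just ask for an index.
camera_index = int(input("Camera device number: "))

seg_model = YOLO("yolov8n-seg.pt")
pose_model = YOLO("yolov8n-pose.pt")

cap = cv2.VideoCapture(camera_index)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("Segmentation", seg_model(frame)[0].plot())
    cv2.imshow("Pose estimation", pose_model(frame)[0].plot())
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()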

ALTERNATIVELY (Windows only), running the run.bat file automates the procedure for both the WebUI and CLI versions.


Acknowledgement

We would like to thank all the OSS communities around the world, especially Ultralytics, Streamlit, OpenCV and Stack Overflow.
