Gestures2Operation

This is our Computer Graphics final project. It contains 5 gesture-operated applications.

Some of them are adapted from the cvzone tutorials. They are all based on MediaPipe and the computer's camera. Here is the final video.

All code targets x86 Python. Run

pip install -r requirements.txt

to install all dependencies.
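For orientation only, the dependency list typically looks like the sketch below. This is illustrative, not the repository's actual requirements.txt, and pinned versions are omitted:

```text
# Illustrative only -- consult the repository's requirements.txt for the authoritative, pinned list.
mediapipe
cvzone
opencv-python
numpy
```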

Mediapipe

Our detection is built on MediaPipe, a framework from Google for detecting human hands, poses, and faces.

Its output for a detected hand looks like this:

[
    {'lmList': [
            [478, 523, 0],      #[x, y, z]
            ...
            [525, 294, -15], 
        ], 
     'bbox': (478, 275, 197, 248), 
     'center': (576, 399), 
     'type': 'Left'
    }
]

Key-point locations from the Hand Landmark Model are stored as [x, y, z] triples in lmList. The ordering of the 21 key points follows MediaPipe's hand landmark numbering, shown in the figure.

Here's an example.
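As a complement, here is a minimal sketch (not the project's exact code) of how hand data in the format shown above can be obtained with cvzone's HandDetector, which wraps MediaPipe Hands; parameter values such as detectionCon=0.8 are assumptions for illustration:

```python
# Minimal sketch: read frames from the computer's camera and detect hands with cvzone.
import cv2
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)                        # default camera
detector = HandDetector(maxHands=2, detectionCon=0.8)  # illustrative settings

while True:
    success, img = cap.read()
    if not success:
        break

    # hands is a list of dicts with 'lmList', 'bbox', 'center', 'type',
    # matching the structure shown above.
    hands, img = detector.findHands(img)
    if hands:
        lm_list = hands[0]['lmList']             # 21 [x, y, z] landmarks
        hand_type = hands[0]['type']             # 'Left' or 'Right'

    cv2.imshow("Hands", img)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```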

See each demo's own README.md and ./report/report.pdf for more details.