Tcode-Motion/AR-Keyboard

╔══════════════════════════════════════════════════════════════════════╗
║                                                                      ║
║   Type at 200 WPM — using only your hands in mid-air.               ║
║   No physical keyboard. No desk. Just you and the camera.            ║
║                                              — Tcode-Motion ⚡        ║
╚══════════════════════════════════════════════════════════════════════╝

🤖 About

A high-performance augmented-reality virtual keyboard controlled by AI hand gestures or mouse input. Powered by OpenCV and MediaPipe, it overlays a Ghost HUD keyboard on your screen and tracks your finger movements in real time — letting you type into any Windows application without ever touching a physical keyboard.

🎯 Point your finger. Tap. It types. That's it. Welcome to spatial computing.


🚀 Key Features

| ⚡ Feature | 📖 Description |
| --- | --- |
| 👻 Ghost HUD Mode | Uses the Windows ctypes API (Always on Top + No-Focus) — types into ANY app without stealing window focus |
| 🎯 Precision Strike Typing | AI detects a "click" from a decisive downward fingertip motion — binary-state trigger |
| ⚡ 200 WPM Capable | Ultra-low latency: 120 ms debounce timer + high-speed tracking (alpha = 0.92) |
| ⌨️ Full HP Layout | Function row (F1-F12), navigation cluster, full numpad — a complete keyboard |
| 🖐️ Dual-Hand Support | Independent tracking for all 8 fingers (index to pinky) across both hands |
| 🎨 Professional HUD UI | Semi-transparent navy-glass aesthetic with electric-cyan borders + orange status indicators |
| 🔀 Hybrid Control | Switch between AI hand gestures and mouse clicks instantly |
| ⏱️ Dwell Trigger | Hold a finger over a key for 0.6 s → a visual progress bar fills → the key auto-types |

🧠 Core Logic & Typing Engine

1. 🎯 Precision Strike — Hand Gesture Detection

Uses Depth-Aware Dynamic Thresholding — scales sensitivity to your hand's distance from the camera:

hand_scale = dist(wrist, middle_finger_base)
tap_threshold = hand_scale * 0.08

A keystroke fires if a finger moves downward by more than 8% of the hand's size — works consistently whether your hand is near or far from the camera.
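The project's exact code isn't reproduced here, but the rule above can be sketched as follows, assuming MediaPipe-style normalized landmarks (the function and parameter names are illustrative). Note that in image coordinates y grows downward, so a downward strike appears as an increase in the fingertip's y value:

```python
import math

TAP_RATIO = 0.08  # keystroke fires when the fingertip drops > 8% of hand size

def dist(a, b):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tap_detected(prev_tip_y, tip_y, wrist, middle_mcp):
    """Depth-aware tap check: the threshold scales with hand size,
    so sensitivity stays consistent at any distance from the camera."""
    hand_scale = dist(wrist, middle_mcp)   # wrist-to-middle-finger-base span
    return (tip_y - prev_tip_y) > hand_scale * TAP_RATIO
```

Because `hand_scale` shrinks as the hand moves away from the camera, the absolute pixel threshold shrinks with it, which is what keeps the 8% rule distance-invariant.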

2. ⏱️ Dwell Fallback Trigger

Hold a finger over any key for 0.6 seconds → a visual progress bar fills inside the key → keystroke fires automatically. Perfect for accessibility.
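A minimal sketch of such a dwell trigger (class and method names are assumptions, not the project's actual API): it tracks how long the fingertip has hovered over the same key and reports both a 0–1 progress value for the fill bar and a fired flag when the 0.6 s threshold is crossed:

```python
class DwellTrigger:
    def __init__(self, dwell_s=0.6):
        self.dwell_s = dwell_s
        self.key = None      # key currently under the fingertip
        self.start = None    # timestamp when hovering over it began

    def update(self, key, now):
        """Return (progress 0..1, fired) for the key under the fingertip."""
        if key != self.key:
            # Finger moved to a new key: restart the dwell timer.
            self.key, self.start = key, now
            return 0.0, False
        progress = (now - self.start) / self.dwell_s
        if progress >= 1.0:
            self.start = now  # reset so the key does not auto-repeat instantly
            return 1.0, True
        return progress, False
```

Feeding `update()` once per camera frame drives both the visual progress bar (from `progress`) and the actual keystroke (when `fired` is true).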

3. 👻 Ghost Window — ctypes Magic

The HUD window is rendered invisible to the Windows Focus Manager:

| Flag | Effect |
| --- | --- |
| WS_EX_NOACTIVATE | Prevents the window from stealing foreground focus when clicked |
| WS_EX_TOPMOST | Keeps the HUD above all other applications |
| WS_EX_APPWINDOW | Keeps the HUD visible in the taskbar |
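Applying these flags might look like the sketch below. The constant values come from the Win32 headers; finding the window by title and the helper name are assumptions (in practice the handle would come from the OpenCV HUD window):

```python
import ctypes

# Extended window style flags (values from the Win32 headers)
WS_EX_TOPMOST    = 0x00000008
WS_EX_APPWINDOW  = 0x00040000
WS_EX_NOACTIVATE = 0x08000000
GWL_EXSTYLE      = -20  # index of the extended style in the window data

GHOST_FLAGS = WS_EX_TOPMOST | WS_EX_APPWINDOW | WS_EX_NOACTIVATE

def make_ghost(window_title):
    """Apply the no-focus/topmost 'ghost' style to a window (Windows only)."""
    user32 = ctypes.windll.user32
    hwnd = user32.FindWindowW(None, window_title)
    style = user32.GetWindowLongW(hwnd, GWL_EXSTYLE)
    user32.SetWindowLongW(hwnd, GWL_EXSTYLE, style | GHOST_FLAGS)
```

With WS_EX_NOACTIVATE set, clicking the HUD never moves keyboard focus away from the target application, which is why PyAutoGUI keystrokes land in whatever app was active.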

⌨️ Advanced Layout & Controls

🎛️ Function Keys (F1-F12)

| Mode | Behaviour |
| --- | --- |
| Normal Mode | Media controls — emoji panel, brightness, volume, mic mute, media prev/play/next |
| Fn-Locked Mode | Standard Windows F1-F12 keys |

🔢 Dual-Mode Numpad

| Mode | Behaviour |
| --- | --- |
| NumLock ON | Standard numeric entry (0-9, `.`, Enter) |
| NumLock OFF | Navigation cluster — Home, End, PgUp, PgDn, arrows, Insert, Delete |

🔘 System Buttons

  • Pwr — Top right. Instantly closes the Virtual Keyboard
  • Shift / Ctrl / Alt — Sticky toggles. Click once to lock, click again to release

🛠️ Tech Stack

Python OpenCV MediaPipe NumPy PyAutoGUI ctypes

| Library | Purpose |
| --- | --- |
| OpenCV | Camera feed capture, frame rendering, AR overlay |
| MediaPipe | Real-time AI hand-landmark detection (21 points per hand) |
| NumPy | Geometry maths — distance calculations, thresholding |
| PyAutoGUI | Injects real keystrokes into the active Windows application |
| ctypes | Windows API — Ghost HUD window flags |

📦 Installation

Prerequisites

  • Windows 10 or 11
  • Python 3.10+
  • Standard webcam

Setup

# 1. Clone the repository
git clone https://github.com/Tcode-Motion/virtual-keyboard.git
cd virtual-keyboard

# 2. Install dependencies
pip install opencv-python mediapipe pyautogui numpy

# 3. Run the application
python virtual_keyboard_v2_main.py

🎮 How to Use

Step 1 → Run the script → Ghost HUD keyboard appears on screen
Step 2 → Position hands in front of webcam
         → Glowing MediaPipe landmarks overlay your fingers
Step 3 → Use a quick downward "pecking" tap motion to type
         OR simply click any key with your mouse
Step 4 → Click into any app (Notepad, Chrome, Discord, Games)
         → Virtual Keyboard injects text directly — globally
Step 5 → Exit: click Pwr button (top right) or press Q

📁 Project Structure

virtual-keyboard/
├── virtual_keyboard_v2_main.py   # Main entry point — run this
└── README.md

📜 System Requirements

| Requirement | Spec |
| --- | --- |
| OS | Windows 10 / 11 (the Ghost HUD relies on the ctypes/Win32 API) |
| Python | 3.10 or higher |
| Hardware | Standard webcam |
| Dependencies | opencv-python, mediapipe, pyautogui, numpy |

🤝 Contributing

Contributions welcome — especially for:

  • 🎯 Gesture smoothing improvements
  • 🎨 HUD aesthetic upgrades
  • 🐧 Linux/macOS port (replacing ctypes with a cross-platform alternative)

  • ⭐ Star the repo if you like it!
  • 🐛 Open an issue for bugs
  • 🔧 Submit a PR for improvements


👨‍💻 Author

Tanmoy — Tcode-Motion

GitHub YouTube

"Predict the future by coding it." ⚡


📄 License

This project is licensed under the MIT License.

