Skeleton Detection iOS App

An iOS app that uses the device camera and Apple's Vision framework to detect people and overlay their skeletal structure (bones) in real time.

Features

  • 🎥 Real-time camera feed processing
  • 🦴 Human pose detection and skeleton overlay
  • 🎨 Visual bone connections with confidence-based opacity
  • 👤 Multi-person detection support
  • 📱 Native iOS app using SwiftUI

Requirements

  • iOS 14.0 or later
  • Xcode 13.0 or later
  • A physical iOS device with a camera (the Simulator does not support camera capture)
  • Swift 5.5+

Technologies Used

  • SwiftUI: Modern UI framework
  • AVFoundation: Camera capture and video processing
  • Vision Framework: Human body pose detection (VNDetectHumanBodyPoseRequest)
  • Core Image: Image processing

Project Structure

SkeletonDetection.xcodeproj/          # Xcode project file
SkeletonDetection/
  ├── SkeletonDetectionApp.swift      # Main app entry point
  ├── ContentView.swift               # Main view with camera and overlay
  ├── CameraManager.swift             # Camera capture and frame management
  ├── CameraView.swift                # UIKit camera view wrapper
  ├── PoseDetectionService.swift      # Vision framework pose detection
  ├── SkeletonOverlayView.swift       # SwiftUI skeleton rendering
  ├── Info.plist                      # App permissions
  ├── Assets.xcassets/                # Asset catalog
  └── Preview Content/                # Preview assets
README.md                             # This file

Quick Start

Option 1: Open Existing Project (Recommended)

The project is already fully configured and ready to use!

  1. Open the project in Xcode:

    open SkeletonDetection.xcodeproj
  2. Connect your iOS device via USB

  3. Select your device as the build target in Xcode (top toolbar)

  4. Configure code signing:

    • Click on the project in the navigator
    • Select "SkeletonDetection" target
    • Go to "Signing & Capabilities"
    • Select your Team (you may need to add your Apple ID)
  5. Click Run (▶️) or press Cmd + R

  6. Grant camera permission when prompted on your device

That's it! The app should now launch and start detecting your skeleton.

Option 2: Manual Setup (If Starting Fresh)

If you need to create the project from scratch:

  1. Open Xcode
  2. Select File → New → Project
  3. Choose iOS → App
  4. Fill in project details:
    • Product Name: SkeletonDetection
    • Interface: SwiftUI
    • Language: Swift
  5. Add all the .swift files to your project
  6. Replace the Info.plist with the provided one

How It Works

1. Camera Capture

The CameraManager class uses AVFoundation to do the following (see the sketch after this list):

  • Request camera permissions
  • Set up a capture session with the front-facing camera
  • Process video frames at high quality
  • Output frames as CVPixelBuffer objects
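
A minimal sketch of this capture setup is shown below, assuming a class that owns an AVCaptureSession and forwards each frame through a closure; the type and property names here are illustrative and may not match the project's CameraManager exactly.

import AVFoundation

// Illustrative sketch only; the project's CameraManager may differ in detail.
// Camera permission is requested separately via AVCaptureDevice.requestAccess(for: .video).
final class CaptureSketch: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    var onFrame: ((CVPixelBuffer) -> Void)?    // called with each camera frame

    func configure() {
        session.sessionPreset = .high

        // Front-facing wide-angle camera as the input
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Video data output that delivers frames to a background delegate queue
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Each frame arrives as a CMSampleBuffer; extract its CVPixelBuffer for Vision
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        onFrame?(pixelBuffer)
    }
}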

2. Pose Detection

The PoseDetectionService uses the Vision framework to do the following (see the sketch after this list):

  • Process each video frame with VNDetectHumanBodyPoseRequest
  • Detect multiple people in the same frame
  • Identify 19 body joints (nose, eyes, ears, shoulders, elbows, wrists, hips, knees, ankles, neck, root)
  • Return confidence scores for each detected joint
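
The sketch below shows the core Vision calls involved, assuming a pixel buffer coming from the camera; the function name and completion shape are illustrative rather than the project's exact API.

import Vision

// Illustrative sketch; the project's PoseDetectionService wraps the same Vision calls.
func detectBodyPoses(in pixelBuffer: CVPixelBuffer,
                     completion: @escaping ([VNHumanBodyPoseObservation]) -> Void) {
    // One observation is returned per detected person.
    let request = VNDetectHumanBodyPoseRequest { request, _ in
        completion(request.results as? [VNHumanBodyPoseObservation] ?? [])
    }

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    do {
        try handler.perform([request])
    } catch {
        print("Vision error: \(error)")   // check the Xcode console if the skeleton never appears
        completion([])
    }
}

// Reading joints from an observation:
//   let points = try observation.recognizedPoints(.all)    // [JointName: VNRecognizedPoint]
//   let wristConfidence = points[.leftWrist]?.confidence    // 0.0 ... 1.0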

3. Skeleton Rendering

The SkeletonOverlayView renders the following (see the sketch after this list):

  • Bones: Lines connecting related joints (e.g., shoulder to elbow, elbow to wrist)
  • Joints: Circles at each detected body point
  • Confidence visualization: Opacity based on detection confidence
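
A simplified sketch of that rendering approach is shown below. It assumes the joint positions have already been converted to view coordinates (see the next section) and uses illustrative type and property names rather than the project's exact ones.

import SwiftUI

// Illustrative sketch; the project's SkeletonOverlayView follows the same idea.
struct SkeletonOverlaySketch: View {
    // Joint positions in view coordinates, paired with their Vision confidence values
    let joints: [String: (point: CGPoint, confidence: Float)]
    // Pairs of joint names to connect with a bone line
    let bones: [(String, String)]

    var body: some View {
        ZStack {
            // Bones: one line per connected joint pair
            ForEach(0..<bones.count, id: \.self) { i in
                if let a = joints[bones[i].0], let b = joints[bones[i].1] {
                    Path { path in
                        path.move(to: a.point)
                        path.addLine(to: b.point)
                    }
                    .stroke(Color.cyan, lineWidth: 3)
                    .opacity(Double(min(a.confidence, b.confidence)))   // confidence-based opacity
                }
            }
            // Joints: a dot at each detected body point
            ForEach(Array(joints.keys), id: \.self) { name in
                if let joint = joints[name] {
                    Circle()
                        .fill(Color.green)
                        .frame(width: 8, height: 8)
                        .position(joint.point)
                        .opacity(Double(joint.confidence))
                }
            }
        }
    }
}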

4. Coordinate Conversion

The Vision framework returns normalized coordinates (0 to 1) with the origin at the bottom-left corner. The app converts these to SwiftUI view coordinates, where the origin is at the top-left.
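
A minimal conversion helper, assuming the overlay view's size is known (for example from a GeometryReader), could look like this sketch:

import CoreGraphics
import Vision

// Map a Vision normalized point (origin bottom-left, 0-1 range)
// into a view's coordinate space (origin top-left, in points).
func convertToViewSpace(_ recognized: VNRecognizedPoint, viewSize: CGSize) -> CGPoint {
    CGPoint(x: recognized.location.x * viewSize.width,
            y: (1 - recognized.location.y) * viewSize.height)   // flip the y axis
}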

Detected Body Points

The app detects these joints (shown as Vision joint names in the sketch after this list):

  • Head: nose, left/right eyes, left/right ears
  • Torso: neck, left/right shoulders, left/right hips, root
  • Arms: left/right elbows, left/right wrists
  • Legs: left/right knees, left/right ankles
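
These names correspond to Vision's VNHumanBodyPoseObservation.JointName constants. Grouped as above, they look like this (the grouping itself is just for illustration):

import Vision

// The 19 joints, grouped as in the list above.
let headJoints:  [VNHumanBodyPoseObservation.JointName] = [.nose, .leftEye, .rightEye, .leftEar, .rightEar]
let torsoJoints: [VNHumanBodyPoseObservation.JointName] = [.neck, .leftShoulder, .rightShoulder, .leftHip, .rightHip, .root]
let armJoints:   [VNHumanBodyPoseObservation.JointName] = [.leftElbow, .rightElbow, .leftWrist, .rightWrist]
let legJoints:   [VNHumanBodyPoseObservation.JointName] = [.leftKnee, .rightKnee, .leftAnkle, .rightAnkle]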

Bone Connections

The skeleton overlay draws connections between the following joints (see the sketch after this list):

  • Head bones: nose ↔ eyes ↔ ears, nose ↔ neck
  • Torso: neck ↔ shoulders ↔ hips, hip ↔ hip
  • Left arm: shoulder → elbow → wrist
  • Right arm: shoulder → elbow → wrist
  • Left leg: hip → knee → ankle
  • Right leg: hip → knee → ankle
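
Expressed as Vision joint-name pairs, the connection table above might look like the sketch below (the project's SkeletonOverlayView defines its own list, which may differ in detail):

import Vision

typealias Joint = VNHumanBodyPoseObservation.JointName

// Bone connections as joint pairs, mirroring the list above.
let boneConnections: [(Joint, Joint)] = [
    // Head
    (.nose, .leftEye), (.nose, .rightEye), (.leftEye, .leftEar), (.rightEye, .rightEar), (.nose, .neck),
    // Torso
    (.neck, .leftShoulder), (.neck, .rightShoulder),
    (.leftShoulder, .leftHip), (.rightShoulder, .rightHip), (.leftHip, .rightHip),
    // Arms
    (.leftShoulder, .leftElbow), (.leftElbow, .leftWrist),
    (.rightShoulder, .rightElbow), (.rightElbow, .rightWrist),
    // Legs
    (.leftHip, .leftKnee), (.leftKnee, .leftAnkle),
    (.rightHip, .rightKnee), (.rightKnee, .rightAnkle)
]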

Customization

Change Colors

Edit SkeletonOverlayView.swift:

// Bones color
.stroke(Color.cyan, lineWidth: 3)  // Change to .red, .blue, etc.

// Joints color
Circle().fill(Color.green)  // Change to preferred color

Adjust Joint Size

.frame(width: 8, height: 8)  // Increase for larger dots

Change Line Thickness

.stroke(Color.cyan, lineWidth: 3)  // Increase lineWidth

Adjust Confidence Threshold

Edit PoseDetectionService.swift:

recognizedPoint.confidence > 0.1  // Increase to 0.3 or 0.5 for stricter detection

Switch to Back Camera

Edit CameraManager.swift:

.default(.builtInWideAngleCamera, for: .video, position: .back)  // Change .front to .back

Troubleshooting

Camera Not Working

  • Ensure you're running on a physical device (not simulator)
  • Check that camera permissions are enabled in Settings → Privacy → Camera
  • Verify Info.plist contains NSCameraUsageDescription

Skeleton Not Appearing

  • Ensure good lighting conditions
  • Stand in front of the camera with full body visible
  • Try adjusting the confidence threshold (lower for more detections)
  • Check console for Vision framework errors

Build Errors

  • Ensure deployment target is iOS 14.0 or later
  • Clean build folder: Product → Clean Build Folder (Cmd + Shift + K)
  • Restart Xcode if issues persist

Code Signing Issues

  • Go to Signing & Capabilities in Xcode
  • Ensure "Automatically manage signing" is checked
  • Add your Apple ID account if needed (Xcode → Settings → Accounts)

Performance Issues

  • The app processes every frame, which can be CPU-intensive
  • Consider adding frame skipping logic in CameraManager.swift (a sketch follows this list)
  • Reduce video quality if needed (change .high to .medium)
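
A minimal frame-skipping sketch, assuming a sample-buffer delegate like the one shown earlier (the type and property names are illustrative):

import AVFoundation

// Process only every 3rd frame to reduce CPU load.
final class ThrottledFrameHandler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var frameCounter = 0
    var process: ((CVPixelBuffer) -> Void)?   // e.g. hand the buffer to pose detection

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        frameCounter += 1
        guard frameCounter % 3 == 0,                                    // drop 2 of every 3 frames
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        process?(pixelBuffer)
    }
}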

Performance Tips

For better performance:

  1. Use a newer device (A12 chip or later recommended)
  2. Ensure good lighting
  3. Keep the full body visible in frame
  4. Close background apps

Building and Running

From Xcode GUI:

  1. Open SkeletonDetection.xcodeproj
  2. Select your device
  3. Click Run (▶️)

From Command Line:

# List the project's schemes and targets
xcrun xcodebuild -project SkeletonDetection.xcodeproj -list

# List connected devices (to find <device-id> used below)
xcrun xctrace list devices

# Build for connected device
xcodebuild -project SkeletonDetection.xcodeproj \
           -scheme SkeletonDetection \
           -destination 'platform=iOS,id=<device-id>' \
           build

# Install on device (requires proper provisioning)
xcodebuild -project SkeletonDetection.xcodeproj \
           -scheme SkeletonDetection \
           -destination 'platform=iOS,id=<device-id>' \
           install

Future Enhancements

Potential improvements:

  • Recording video with skeleton overlay
  • Saving screenshots
  • Angle measurements between joints
  • Exercise form analysis
  • Motion tracking and gesture recognition
  • 3D skeleton visualization with ARKit
  • Export joint coordinates for analysis
  • Workout rep counter
  • Real-time form correction feedback

Privacy

This app:

  • Only processes camera data locally on device
  • Does not save or transmit any images or video
  • Uses Vision framework which runs entirely on-device
  • Requires camera permission to function

Technical Details

Frameworks Used:

  • AVFoundation: Camera capture
  • Vision: Body pose detection
  • SwiftUI: User interface
  • Core Image: Image processing
  • Combine: Reactive programming

Key Classes:

  • VNDetectHumanBodyPoseRequest: Vision request for body pose
  • VNHumanBodyPoseObservation: Contains detected joint positions
  • AVCaptureSession: Manages camera input/output
  • CVPixelBuffer: Raw video frame data

License

Free to use and modify for personal and commercial projects.

Credits

Built with:

  • Apple Vision Framework
  • AVFoundation
  • SwiftUI
  • Core Image

Support

For issues or questions:

  1. Check the Troubleshooting section
  2. Review Apple's Vision Framework documentation
  3. Check console logs in Xcode for detailed error messages

Getting Started Walkthrough

When you first run the app:

  1. The app will request camera permission - tap "Allow"
  2. Stand back so your full body is visible in the frame
  3. The skeleton overlay should appear automatically
  4. Move around and watch the skeleton track your movements in real time!

Tips for Best Results

  • Lighting: Ensure good, even lighting on your body
  • Background: A plain background helps detection accuracy
  • Clothing: Avoid baggy clothes that obscure body shape
  • Distance: Stand 6-10 feet from the camera for full body visibility
  • Stability: Hold your device steady or use a tripod

Enjoy tracking your skeleton! 🦴✨
