> [!CAUTION]
> ⭐ Roka AI is really impressive, so I'm sharing the main file for now! ^^ Once this project is entirely finished, it will all be right here, including the custom object detection model!
## What is VrChAI?
VrChAI (Virtual Reality Chat AI) was a summer side project I undertook while participating in the STARS program. It quickly became a passion project that let me explore AI-driven human interaction inside the virtual world of VRChat. Along the way it sharpened my Python programming skills and gave me valuable hands-on experience with networking, ports, calling APIs, and training my very own object detection model.
VrChAI is a voice- and vision-driven AI player bot for VRChat that simulates human-like conversation and behavior for immersive virtual interactions. It combines live computer vision (CV), speech recognition and synthesis, retrieval-based voice conversion (RVC), and the python-osc library (which speaks the OSC network protocol) to drive the bot in-game.
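That hear-think-respond cycle can be sketched as a small asyncio pipeline. Every function below is a hypothetical stub standing in for the real modules (audio processing, GPT chat, TTS/OSC output); this is a minimal sketch of the loop's shape, not the project's actual API:

```python
import asyncio

async def listen() -> str:
    # Stand-in for speech-to-text (cf. VrChAI.audioProcessing).
    return "hello bot"

async def think(heard: str) -> str:
    # Stand-in for the LLM reply step (cf. VrChAI.gptChat).
    return f"You said: {heard}"

async def speak(reply: str) -> str:
    # Stand-in for TTS + sending the reply into VRChat over OSC.
    return f"[chatbox] {reply}"

async def agent_loop(turns: int = 1) -> list[str]:
    """Run the hear -> think -> respond cycle a fixed number of times."""
    outputs = []
    for _ in range(turns):
        heard = await listen()
        reply = await think(heard)
        outputs.append(await speak(reply))
    return outputs

print(asyncio.run(agent_loop()))  # → ['[chatbox] You said: hello bot']
```

Each stage being a coroutine means slow work (microphone capture, API calls) can run without blocking the rest of the bot.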
## Project Media
VrChAi.mp4
The `RokaMaster.py` script utilizes the following libraries and tools:
- `asyncio`: Python standard library for asynchronous operations.
- `pythonosc`: Library for OSC (Open Sound Control) communication in Python.
- `VrChAI.audioProcessing`: Module for audio processing in VRChat.
- `VrChAI.gptChat`: Module for integrating GPT-based chat functionality.
- `VrChAI.headpatCounter`: Module for counting headpats in VRChat.
- `VrChAI.helpMenu`: Module providing a help menu for user assistance.
- `VrChAI.oscMovement`: Module for processing OSC movement commands.
- `VrChAI.oscWorldMovement`: Module for initiating random world movements via OSC.
- `VrChAI.stringProcessing`: Module for string manipulation and processing.
- `VrChAI.tfVisionLook`: Module for vision-related tasks in VRChat.
- `VrChAI.tiktockTts`: Module for text-to-speech synthesis using TikTok voices.
- `controlVariables`: Module containing the script's control variables.
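To illustrate the OSC side of the list above: VRChat listens for OSC input on UDP port 9000 by default, and endpoints such as `/chatbox/input` (a string plus a send flag) and `/input/Vertical` (a -1..1 movement axis) are part of VRChat's documented OSC interface. The snippet below hand-encodes OSC packets with only the standard library, as a minimal sketch of what `pythonosc` does under the hood (the project itself uses `pythonosc`, not this encoder):

```python
import socket
import struct

def _osc_str(s: str) -> bytes:
    """Encode an OSC string: UTF-8, NUL-terminated, padded to a 4-byte boundary."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Build a binary OSC message supporting str, int, float, and bool arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, bool):        # check bool before int: bool subclasses int
            tags += "T" if a else "F"  # booleans are type tags only, no payload bytes
        elif isinstance(a, str):
            tags += "s"
            payload += _osc_str(a)
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # 32-bit big-endian integer
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)  # 32-bit big-endian float
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _osc_str(address) + _osc_str(tags) + payload

# Push text into the in-game chatbox (True = send immediately, skip the keyboard)...
chat = osc_message("/chatbox/input", "Hello from VrChAI!", True)
# ...and nudge the avatar forward on the vertical movement axis.
move = osc_message("/input/Vertical", 1.0)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for packet in (chat, move):
    sock.sendto(packet, ("127.0.0.1", 9000))  # VRChat's default OSC input port
```

In practice `pythonosc.udp_client.SimpleUDPClient` wraps all of this in a one-line `send_message` call; the manual encoding is shown only to make the wire format concrete.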