
πŸ‘€ VRCFaceTracking

Provides eye tracking and lip tracking in VRChat by bridging your tracking hardware and VRChat's OSC server.


πŸŽ₯ Demo

πŸ›  Avatar Setup

For this app to work, you'll need to be using an avatar with the correct parameters, or an avatar config file with the correct mappings. The system is designed to control your avatar's eyes and lips via simple blend states, but what the parameters actually control is completely up to you.
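As a rough illustration of what such a config mapping looks like, here is a hand-written sketch of a VRChat avatar OSC config entry. The structure is an assumption based on the JSON files VRChat generates in its OSC folder, and the avatar id, avatar name, and the `JawOpen` parameter name are all placeholders:

```json
{
  "id": "avtr_00000000-0000-0000-0000-000000000000",
  "name": "MyAvatar",
  "parameters": [
    {
      "name": "JawOpen",
      "input":  { "address": "/avatar/parameters/JawOpen", "type": "Float" },
      "output": { "address": "/avatar/parameters/JawOpen", "type": "Float" }
    }
  ]
}
```

Each entry maps a named avatar parameter to the OSC address and type that drive it.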

It's not required to use all of these parameters. In fact, you don't need to use any of them if you intend to use VRChat's built-in eye tracking system. As with parameters in Unity Animator Controllers, these are all case-sensitive and must be copied EXACTLY as shown into your avatar's base parameters.
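To make the OSC bridge concrete, here is a minimal Python sketch that hand-encodes a single-float OSC message and sends it to VRChat's default OSC input port (UDP 9000 on localhost). The parameter name `JawOpen` is just a placeholder for whichever parameter your avatar exposes:

```python
import socket
import struct

def build_osc_message(address: str, value: float) -> bytes:
    """Encode a one-float OSC message: address, ",f" type tag, big-endian float32."""
    def pad(data: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        data += b"\x00"
        return data + b"\x00" * ((-len(data)) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def send_parameter(name: str, value: float,
                   host: str = "127.0.0.1", port: int = 9000) -> None:
    """Send one avatar parameter update to VRChat's OSC input port."""
    msg = build_osc_message(f"/avatar/parameters/{name}", value)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, (host, port))

# Example: drive a hypothetical JawOpen parameter to half strength
# send_parameter("JawOpen", 0.5)
```

In practice VRCFaceTracking handles this encoding for you; the sketch only shows the wire format the parameters ride on.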

We strongly encourage you to consult the docs for a setup guide and more info on what each parameter does.

There are a large number of parameters you can use for lip and face tracking.

Combined Lip Parameters - combined parameters that group mutually exclusive face shapes into a single value.
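Since combined parameters pack two mutually exclusive shapes into one signed float, an avatar's animation setup needs to split that value back out. One common convention (a sketch, not necessarily VRCFaceTracking's exact mapping) is that the positive half drives one shape and the negative half drives the other:

```python
def split_combined(value: float) -> tuple[float, float]:
    """Split a combined parameter in [-1, 1] into two shape weights in [0, 1].

    Positive values drive the first shape, negative values the second;
    because the shapes are mutually exclusive, at most one weight is nonzero.
    """
    return (max(value, 0.0), max(-value, 0.0))

# split_combined(0.5)  -> (0.5, 0.0)   first shape at half strength
# split_combined(-1.0) -> (0.0, 1.0)   second shape fully driven
```

In an avatar's Animator, the same split is typically done with two blend trees keyed on the positive and negative halves of the parameter.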

β›“ External Modules

Use the module registry to download addons and add support for your hardware!