jangxx/VRC-Tracked-Objects

VRC Tracked Objects (powered by OSC)

A project to bring real world objects into the virtual world of VRChat. The objects need to be added to an avatar and are tracked with Vive trackers.

Getting Started

Before getting started, make sure you have the ".NET Desktop Runtime 6.0" installed (the latest version at the time of writing is 6.0.29), which you can download from Microsoft's .NET download page. Make sure you specifically download and install the Desktop Runtime, since the app will not run otherwise.

Afterwards, download the latest version of the software from the Releases page. There are two downloads: the app itself and a Unity package containing the required avatar setup. You will also need the VRCAvatars3Tools, which you can download from Booth for free, in order to use the AnimatorControllerCombiner and the ExpressionParametersCombiner.

Setup

This section explains the initial setup of both the software and the avatar. If you prefer to watch a video tutorial of the whole process, you can find it on YouTube. This section assumes that you have already set up your avatar with a custom FX layer, custom parameters, and an expressions menu. If you don't know how to do these things, look for a tutorial on basic prop toggles first, since this section assumes at least basic knowledge of Unity.

  1. Unpack the downloaded files to a location of your liking. The app does not need to be installed and can be run by just clicking on the executable, but you might want to copy it to C:\Program Files\VRC Tracked Objects, for example.

  2. Before you can calibrate the avatar, you need to add the required setup to it. Drag and drop the downloaded Unity package into Unity to import it. Afterwards, move the TrackedObject Package into your scene and make sure it is located at (0,0,0).

  3. Unpack the TrackedObject Package prefab.

  4. Move the TrackedObject Container into the root of your avatar and the TrackedObject Anchor into either the right hand or left hand bone transform. Where you put the Anchor determines which hand the object will be placed relative to, so consider which hand you are going to hold the object with more often. Due to the relative positioning, tracking is by far the most stable when the object is held in the hand it is anchored to; if you're adding a bottle, for example, anchoring it to your dominant hand is the best option. Set the position and rotation of the Anchor object to all zeros (this normally causes the calibration cube to sit within the wrist).

  5. Find the VRCAvatars3Tools in the Unity menu and open the AnimatorControllerCombiner. Set the included FX layer as the source controller and the FX layer on your avatar as the destination. Afterwards, copy all layers and parameters by clicking on Combine.

  6. Next you need to open the ExpressionParametersCombiner from the same VRCAvatars3Tools. Set the included Expression Parameters as the source and the VRCExpressionParameters object on your avatar as the destination. Then click Combine.

  7. Finally, add a Four Axis puppet to your expressions menu which has the three OscTrackedPos parameters on it, as well as one of the OscTrackedRot ones (I personally chose RotX, but it doesn't really matter). Set the Parameter option to OSCTrackingEnabled so that this parameter is set to true while the menu is open and false when it is closed. This causes the object to only track and be visible while the menu is open and the parameters are IK-synced.
    Example: (screenshot of the menu setup)

  8. Upload the avatar as a new version.

After these steps your avatar is fully set up for the next step, i.e. the calibration. In a later step you will remove the debug cube, replace it with the object you actually want to track, and then upload the avatar again.

  1. Start SteamVR and connect at least the controller of the hand you chose as the anchor in step 4, as well as the tracker you want to use for the object. Open the app you downloaded and unpacked in step 1 to be greeted with the main window.
    (screenshot of the main window)
    Open the "Avatars" tab and copy-paste the Avatar ID from Unity into the respective input (you can find the Avatar ID for example in the Pipeline Manager on your avatar root or in the Content Manager section in the VRChat SDK window). Enter a name for the avatar and then click Add.

  2. Select your controller and tracker from the respective drop-down menus. Afterwards, start VRChat and switch into your freshly uploaded avatar. For the actual calibration I would recommend sitting down at your desk with your VR headset on, so that you can still reach the keyboard while wearing it. Click on the Start calibration button to have three sides of a cube appear at the location of your controller. The task is now to align this cube with the one you added to your avatar. To do this, use the arrow keys on your keyboard to cycle through the seven different inputs: Up and Down increment and decrement the current value, while Left and Right switch to the next and previous input. Make sure that the arrows pointing from X Neg to X Pos, Y Neg to Y Pos, and Z Neg to Z Pos point along the respective axes of the debug cube, while being perfectly in line with the sides of the cube. Also ensure that the scale matches. Once you are satisfied with the result, click on Stop calibration to finish the calibration process.

  3. You are now ready for the first test! Switch to the "Tracking" tab and click on Start Tracking. In VRChat, go to the OSC section of the Action Menu and reset the OSC config (so that it includes the new parameters you added). This action should also reload your avatar so that the tracking app can pick up the avatar change. If the "Current status" still says "Inactive (unknown avatar)", switch to another avatar and back so that the app gets notified of the change. Open the Action Menu again, go to your expressions, and open the Four Axis puppet you added in step 7. If you did everything correctly, the cube should now follow the tracker you chose! We are almost done at this point.

  4. Open the "File" menu within the tracking app and save the config to a file.

  5. Go back to Unity and replace the debug cube with an object of your liking (or place the object next to it within the Container object). It can be a good idea to do this after you have already put the tracker on the physical object, with the debug cube still visible, so that you can get a sense of the orientation.

  6. Finally toggle the Container off so that your object is hidden by default. It will automatically get toggled on when the menu is open and the object is tracking, but it will be hidden otherwise.

Normal usage

While the initial setup of the system is rather involved, actually using it is really easy. Simply launch the app, load the config under "File > Open config file" and click on Start tracking. You can then jump into VRChat and see the tracked object by simply opening the Four Axis puppet menu.

If you want to streamline the process even more, you can check "Start tracking when launched from config file", save the config again and then set up a shortcut that has the path of the config file as its first parameter. This way you can click on a single shortcut which will start the app and start tracking immediately (as long as the controller and tracker are connected).
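As a concrete illustration of such a shortcut, the Target field could look like the following. Both paths and the executable name are made up for this example; point them at wherever you unpacked the app and saved your config file.

```shell
# Hypothetical Windows shortcut "Target" field -- the config file path
# is passed as the first parameter, so tracking can start immediately.
"C:\Program Files\VRC Tracked Objects\VRCTrackedObjects.exe" "C:\Users\you\Documents\tracked-object-config.json"
```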

The user interface

This is an overview of the entire interface of the app, as well as an explanation of what each part does.

(annotated screenshot of the entire interface)

  1. If this is checked, tracking will begin immediately after the program is launched. As the label implies, this only works if the app is launched with the config file as its first launch parameter so that it can be loaded immediately on startup.

  2. Configure your OSC input and output addresses here. Both are needed because we need bidirectional communication with VRChat: on one end we listen for the enable parameter, and on the other we send in the tracking values.

  3. Here you can select the controller and tracker that the tracking is relative to. The refresh button queries SteamVR for a list of devices. If a name is followed by (Not found), it means that the serial number was specified in the config file, but the controller or tracker is not currently connected. After you have connected the device, hit Refresh to have the app see it properly.

  4. These are the parameters that the app publishes and listens to. The Activate parameter is optional; if the field is left blank, tracking data will be fed into the game as soon as you switch into a compatible avatar.

  5. This status field can show the statuses "Active" when a compatible avatar is worn and tracking data is being sent, "Inactive (unknown avatar)" if the current avatar is not compatible, "Inactive (disabled)" when the Activate parameter is set to false, and "Inactive" when tracking has not been started.

  6. This button attempts to start tracking. It will show an error if the controller or tracker is not connected or if required fields are empty.

  7. This is the global avatar selector. Choose the avatar you want to calibrate or setup parameters for here.

  8. Here you can see and edit the calibration values. Do note that these values are only read at the very beginning of the tracking and calibration process. It is therefore not possible to live-edit calibration values, not even during the calibration procedure.

  9. This button starts the calibration procedure. The currently active field will be highlighted in red. Pressing Left Arrow and Right Arrow on the keyboard switches between the different fields, while pressing Up Arrow and Down Arrow increments and decrements the values respectively.

  10. This is the list of currently configured avatars. It is currently not possible to edit avatar names or IDs in the app directly, so if you want to rename an avatar, you need to do it in the config file directly.

  11. Here you can add a new avatar to the system. As mentioned before, it's currently not possible to edit a configured avatar, so make sure that the ID and name are correct.
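To illustrate the bidirectional OSC communication described in point 2, here is a minimal sketch of an outgoing parameter update using only the Python standard library (the app itself is written in C#). The ports are VRChat's common defaults (9000 for sending to the game, 9001 for receiving from it), and the exact parameter name is an assumption based on the OscTrackedPos naming used in the avatar setup.

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying a single float argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# One of the position parameters; avatar parameters live under
# /avatar/parameters/ in VRChat's OSC addressing scheme
msg = osc_message("/avatar/parameters/OscTrackedPosX", 0.25)

# Sending it to the game would look like this:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(msg, ("127.0.0.1", 9000))
```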

Working principle

Bringing externally tracked objects into VRChat comes with one main challenge: how do we synchronize the tracking universe of SteamVR with the transform of the avatar? The solution I came up with is this: find a point on the avatar that has a (mostly) fixed offset to a tracked device in SteamVR. Ideally we could use the hip for this, but:

  1. not everyone has FBT
  2. the relative position of the hip tracker to the hip bone changes every time the avatar is recalibrated.

The next option would be the headset, but unfortunately the Head bone is not actually locked to the headset position with a static offset either. Instead it mostly follows it, but especially when looking up and down it is very common for the HMD position and the Head bone to diverge quite a bit.

The final option is the controllers, and this is what I went with. The offset between the position of the controller and the (Right|Left)Hand bone is not completely static either, but of all the available options it is by far the best one. Using a controller also comes with the advantage that the offset between the tracker and the controller is essentially static if the object is held in the same hand, so for objects that are supposed to be picked up, using the controller is also the option with the lowest jitter.

So then, with the controller's and the tracker's position in hand we can just start tracking, right? Unfortunately not yet. Each controller is different, but on most of them, the point that is actually tracked is the very tip of the controller, which obviously does not line up with the root of the Hand bone. In order to find this offset, a calibration step is needed. The result of the calibration is a 4x4 transformation matrix including scale which lets us calculate the position of the root of the Hand pretty accurately.

The avatar is then set up in such a way that six nested GameObjects are used to translate and rotate a virtual object in all axes of position and rotation relative to the Hand bone.

Afterwards the final calculation we need to do is:

  1. Calculate the inverse of the controller matrix and multiply it with the tracker matrix to get the transform from the controller to the tracker.
  2. Calculate the inverse of the controller -> Hand bone matrix (i.e. the calibration matrix) in order to get the transform from the Hand bone to the controller (this is of course not done once per update but only one time when tracking is started).
  3. Multiply the inverse of the calibration matrix with the controller -> tracker transform from step 1). This gives us a transform from the Hand bone -> controller -> tracker, i.e. a transform from the hand to the tracker.
  4. Extract the position and rotation from the resulting matrix. The position is simply the last column and the rotation can be extracted from the upper left 3x3 submatrix. To extract the rotation we actually need to invert the rotation component again however, since we are not interested in the rotation required to rotate an object from the Hand bone to the tracker, but instead the inverse rotation to counteract the relative rotation between the controller and the tracker. The virtual object already rotates with the controller after all, since it is parented to it. This needs to be undone to get the virtual object to line up with the tracker.
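The steps above can be sketched in a few lines of plain Python (the real app is written in C#). The poses below are made up, use identity rotations so the numbers are easy to follow, and the scale component of the calibration matrix is ignored for simplicity. Matrices are row-major 4x4 nested lists.

```python
def matmul(a, b):
    # 4x4 matrix product
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(m):
    # Inverse of a rigid transform [R|t] is [R^T | -R^T t]
    r_t = [[m[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(r_t[i][k] * m[k][3] for k in range(3)) for i in range(3)]
    return [r_t[0] + [t[0]], r_t[1] + [t[1]], r_t[2] + [t[2]], [0, 0, 0, 1]]

# Hypothetical SteamVR poses and calibration result
controller  = [[1, 0, 0, 1.0], [0, 1, 0, 1.5],   [0, 0, 1, 0.0], [0, 0, 0, 1]]
tracker     = [[1, 0, 0, 1.2], [0, 1, 0, 1.5],   [0, 0, 1, 0.3], [0, 0, 0, 1]]
calibration = [[1, 0, 0, 0.0], [0, 1, 0, -0.05], [0, 0, 1, 0.1], [0, 0, 0, 1]]  # controller -> Hand bone

# Step 1: transform from the controller to the tracker
ctrl_to_trk = matmul(rigid_inverse(controller), tracker)

# Steps 2+3: Hand bone -> controller -> tracker (the calibration inverse
# is computed once when tracking starts, not on every update)
hand_to_trk = matmul(rigid_inverse(calibration), ctrl_to_trk)

# Step 4: the position is the last column; the rotation sits in the
# upper-left 3x3 submatrix and still has to be inverted before use
position = [hand_to_trk[i][3] for i in range(3)]
rotation = [row[:3] for row in hand_to_trk[:3]]
```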

Troubleshooting

The object moves in weird ways

Check that the calibration is correct. Make sure that the calibration cube has exactly the same size on your avatar as it has when you first put it into the scene (that means its scale should be 1 if your avatar is not scaled and 1/scale if it is).

Also make sure that the Anchor is actually at (0,0,0) with (0,0,0) rotation relative to the hand bone. Technically this is not strictly necessary, but for your own sanity it is way easier to do all the transforming and rotating in the tracking app than to do some of it in Unity and some of it in the tracking app. Normally the tracking cube should sit in your wrist, and because you remove it after the calibration anyway, there is no need to move it around to make it look like you're holding it in your hand.

The tracking suddenly got worse

The most common reason for this problem that I have found is a change in scale between the real world and VRChat. This can happen in a variety of ways:

  • When changing the "real height" in the settings
  • When changing the size measurement between wingspan and height
  • probably others too

If you changed anything at all that might be related to your scale in VRChat, run the calibration again and see if the sizes of the cubes still match up. More often than not they don't anymore, and small tweaks to the scale are necessary. Luckily it's only the scale that gets messed up, so this calibration step should not take more than a few seconds.

About

A system to read tracking data and feed it into VRChat via OSC
